

Tutoring firm settles claim alleging its recruiting algorithm screened out applicants over 60



An e-learning company has agreed to pay $365,000 to settle a lawsuit alleging its recruiting software’s algorithm screened out older applicants, in violation of federal nondiscrimination law.

The U.S. Equal Employment Opportunity Commission sued iTutorGroup last year, claiming it programmed its hiring platform to automatically reject female applicants age 55 or older and male applicants age 60 or older. The software rejected more than 200 tutor applicants on these bases, according to the EEOC.

In its complaint, the agency said an applicant who applied for a job using her real birth date was immediately rejected. She reapplied about a day later with identical information but a more recent birth date, and was offered an interview.

In the settlement agreement, iTutorGroup denied the allegations and specifically disputed the claim that its tutors are employees covered by the Age Discrimination in Employment Act (ADEA), maintaining that they were independent contractors.

The agreement requires the company to post notice about the settlement, devise non-discrimination and complaint policies, train its managers, and submit reports to the EEOC, in addition to the monetary relief.

EEOC cautions against bias in AI, algorithmic tools

The EEOC has in recent months cautioned employers about such screening tools.

During a listening session last year, stakeholders told the agency that AI and algorithmic tools may perpetuate age and sex bias, as alleged in the iTutorGroup lawsuit, and that the same holds for race discrimination.

The commission has similarly warned employers that the tools could screen out workers with disabilities — not only because of algorithms but also due to inaccessibility. 

Employers should conduct an ongoing self-analysis to determine whether they are using technology in a way that could result in discrimination, the agency’s chair, Charlotte A. Burrows, said in a May statement announcing a technical assistance document addressing potential bias in software, algorithms and AI employment tools. The guide cautioned employers that they may be liable for discrimination caused by such tools, even if they weren’t involved in the design.

That means HR pros should at least ask vendors whether a tool has been evaluated and whether it selects individuals from protected classes at a lower rate than others, EEOC said. “Further, if the vendor is incorrect about its own assessment and the tool does result in either disparate impact discrimination or disparate treatment discrimination, the employer could still be liable,” it cautioned.
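The self-analysis the EEOC describes often starts with comparing selection rates across groups. One common rule of thumb from the agency's Uniform Guidelines is the "four-fifths rule": a group selected at less than 80% of the rate of the highest-selected group may indicate adverse impact worth investigating. A minimal sketch of such a check, using entirely hypothetical group names and counts (not data from the case):

```python
# Minimal adverse-impact ("four-fifths rule") self-check sketch.
# The group labels and counts below are illustrative assumptions,
# not figures from the iTutorGroup case.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> (selected, total applicants).
    Returns the selection rate for each group."""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` (80%)
    of the highest group's rate -- an indicator of potential disparate
    impact, not a legal determination by itself."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

outcomes = {
    "under_40": (60, 100),    # 60% selected
    "40_and_over": (30, 100), # 30% selected
}
flags = four_fifths_check(outcomes)
# 30% / 60% = 0.5, below the 0.8 threshold, so "40_and_over" is flagged
# for further review.
```

A flagged result is a prompt for closer analysis of the tool and its inputs, along the lines the EEOC's guidance suggests, rather than proof of discrimination on its own.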
