In the study, when researchers from the US-based University of Washington (UW) asked for clarification on the classifications, the system showed biased perceptions about disabled people.

For example, he stated that a resume with an autism leadership award had "less emphasis on leadership roles."

However, when the researchers personalized the tool with written instructions instructing it not to be ableist, the tool reduced this bias for all but one of the disabilities assessed.

"Five of the six involved disabilities, blindness, cerebral palsy, autism, and the general term 'disability,' but only three were ranked higher than resumes that did not mention disability," the researchers noted.

The researchers used the publicly available CV of one of the study's authors, which spanned about 10 pages. They then created six modified CVs, each suggesting a different disability by adding four disability-related credentials: a scholarship, an award, a position on a diversity, equity and inclusion (DEI) panel, and membership in a student organization. .

The researchers then used ChatG's GPT-4 model to compare these modified CVs to the original version for a real "student researcher" position at a major US-based software company.

They performed each comparison 10 times; Of the 60 trials, the system ranked the improved CVs, which were identical except for the implied disability, first only a quarter of the time.

"Some of the GPT descriptions would color a person's entire resume based on their disability and claim that participation in DEI or disability potentially detracts from other parts of the resume," said Kate Glazko, a doctoral student at the Paul School G. Allen of the University of Washington. of Engineering and Computer Science.

"People need to be aware of system biases when using AI for these real-world tasks. Otherwise, a recruiter using ChatG cannot make these corrections, or be aware that even with instructions, bias can persist," he added.