A new law went into effect in New York City on Wednesday that requires any business using A.I. in its hiring to submit that software to an audit to prove that it does not result in racist or sexist…
I was a recruiter previously - I've since thankfully gotten out of the profession. But ATS (applicant tracking system) software, and any training data it throws off for an AI to model, is going to be built on job-order and job-description keywords and how they match up with CV writing. That data will be biased, just as an ATS is biased, toward people who write more keyword-dense documents with less extraneous content.
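To make the keyword-density point concrete, here is a minimal illustrative sketch (not any real ATS's actual algorithm) of naive keyword-overlap scoring. A candidate who mirrors the posting's exact vocabulary scores higher than an equally qualified candidate who describes the same skills in different words:

```python
import re

def keyword_score(job_description: str, resume: str) -> float:
    """Fraction of the job description's distinct words found in the resume."""
    tokenize = lambda text: set(re.findall(r"[a-z]+", text.lower()))
    keywords = tokenize(job_description)
    if not keywords:
        return 0.0
    return len(keywords & tokenize(resume)) / len(keywords)

job = "python developer with sql and cloud experience"
dense = "Python developer: SQL, cloud, Python, SQL"          # keyword-dense CV
narrative = "I build data pipelines and web services daily"  # same skills, different words

# The keyword-dense CV wins even though both could describe the same person.
print(keyword_score(job, dense) > keyword_score(job, narrative))  # True
```

Any model trained on hiring outcomes produced by matching like this will inherit the same preference for keyword-dense writing.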
If one race or one sex is better at writing focused, keyword-dense content, that's not something the software should be blamed for. But if the AI is looking at something more advanced than keywords matched against the BFOQs (bona fide occupational qualifications) listed in the job order, then I take serious issue.