Communicating the sources of gender bias in AI


Published January 8, 2021 | By Agenda C


Artificial Intelligence algorithms for recruitment are learning bias from humans who don’t even realise they are discriminating.

Agenda C is proud to have worked with UniBank and The University of Melbourne to develop a research question that would deepen academic understanding of the sources of gender bias in AI, and to deliver a successful public relations campaign highlighting this ground-breaking research.

Human recruiters are unconsciously creating barriers for women seeking employment in male-dominated professions such as finance and data analysis. Their hiring decisions, in turn, train the artificial intelligence algorithms that many companies and recruitment platforms use to shortlist promising CVs.

Associate Professor Leah Ruppanner, one of the lead authors of the study, was featured on Australian Broadcasting Corporation (ABC) News: “There is something distinct about the men’s resumes that made our panel rank them higher, beyond experience, qualification and education … This forms the most alarming dimension of gender bias, as we are not capturing what gives men the edge in these positions.”

ABC News: https://www.abc.net.au/news/2020-12-02/job-recruitment-algorithms-can-have-bias-against-women/12938870