Lack of STEM diversity is causing AI to have a ‘white male’ bias

Ryan Daws is a senior editor at TechForge Media with over a decade of experience in crafting compelling narratives and making complex topics accessible. His articles and interviews with industry leaders have earned him recognition as a key influencer by organisations like Onalytica. Under his leadership, publications have been praised by analyst firms such as Forrester for their excellence and performance. Connect with him on X (@gadget_ry) or Mastodon.

A report from New York University’s AI Now Institute has found a predominantly white male coding workforce is causing bias in algorithms.

The report highlights that – while gradually narrowing – the lack of diverse representation at major technology companies such as Microsoft, Google, and Facebook is causing AIs to cater more towards white males.

For example, at Facebook just 15 percent of the company’s AI staff are women. The problem is even more substantial at Google where just 10 percent are female.

Report authors Sarah Myers West, Meredith Whittaker and Kate Crawford wrote:

“To date, the diversity problems of the AI industry and the issues of bias in the systems it builds have tended to be considered separately.

“We suggest that these are two versions of the same problem: issues of discrimination in the workforce and in system building are deeply intertwined.”

As artificial intelligence is used more widely across society, there is a danger of some groups being excluded from its advantages while it reinforces “a narrow idea of the ‘normal’ person”.

The researchers highlight examples of where this is already happening:

  • Amazon’s controversial Rekognition facial recognition AI struggled with darker-skinned women in particular, although separate analysis has found other AIs also face such difficulties with non-white males.
  • A résumé-scanning AI that relied on previous examples of successful applicants as a benchmark. The AI downgraded people who included the word “women’s” in their résumé or who attended women’s colleges.
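The résumé-scanning failure illustrates a general mechanism: a model trained on biased historical decisions learns to reproduce that bias. A minimal sketch below, using entirely invented toy data and a simple word-frequency scorer (not the actual system described in the report), shows how a word like “women’s” can pick up a negative weight purely because past rejections correlated with it:

```python
# Hypothetical sketch of how a screener trained on biased historical
# decisions can learn to penalise a word like "women's".
# All data below is invented for illustration.
from collections import defaultdict

# Toy historical data: (résumé tokens, hired?). The past decisions were
# biased: résumés mentioning "women's" were disproportionately rejected.
history = [
    (["python", "leadership", "women's", "chess"], 0),
    (["python", "women's", "captain"], 0),
    (["python", "leadership", "captain"], 1),
    (["java", "leadership", "chess"], 1),
    (["java", "captain", "chess"], 1),
    (["women's", "java", "leadership"], 0),
]

def word_scores(data):
    """Score each word by (hire rate with word) - (overall hire rate)."""
    overall = sum(label for _, label in data) / len(data)
    totals, hires = defaultdict(int), defaultdict(int)
    for tokens, label in data:
        for word in set(tokens):
            totals[word] += 1
            hires[word] += label
    return {w: hires[w] / totals[w] - overall for w in totals}

scores = word_scores(history)
# The model inherits the historical bias: "women's" scores negatively
# even though it says nothing about job competence.
print(sorted(scores.items(), key=lambda kv: kv[1]))
```

The word itself carries no information about competence; the negative score is purely an artefact of the biased labels, which is why the report treats workforce discrimination and system bias as intertwined.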

AI is currently deployed in only a few life-changing areas, but that is rapidly changing. Law enforcement is already looking to use the technology for identifying criminals, even preemptively in some cases, and for making sentencing decisions – including whether someone should be granted bail.

“The use of AI systems for the classification, detection, and prediction of race and gender is in urgent need of re-evaluation,” the researchers noted. “The commercial deployment of these tools is cause for deep concern.”

