I thought I’d heard it all. Yet this was something I hadn’t anticipated. The problem wasn’t a new bias; it was an age-old bias presented in a new and, frankly, insidious way.
Unconscious bias, the set of undetected biases we all carry that shape our decision-making and our understanding of other groups, is now infecting some of our newest technologies.
For example, research suggests that prototype driverless cars have a harder time “seeing” people with darker skin tones. In one study, the detection system’s accuracy dropped by five percent when it was shown images of darker-skinned pedestrians. Put a car like that on a real street, and it will perform worst with exactly those pedestrians.
The issue isn’t only about race. Almost two years ago, Amazon had to shut down an artificial-intelligence recruiting tool after it taught itself to be sexist as it sorted through job applications. The system had modeled itself on the previous ten years of hiring history, which turned out to be a biased hiring history, and a sexist artificial intelligence was the result.
The true issue here is not any single racist or sexist program. It is that these incidents keep happening because computer programmers, who are still overwhelmingly white and male, are basing some of their work on norms shaped by their own unconscious biases.
U.S. research shows that in 2016, 77.5 percent of programmers were white, and 70 percent were white men; the remaining 22.5 percent comprised all other racial and ethnic groups combined. Because of this deep lack of diversity and their unconscious biases, programmers unwittingly use themselves as the norm as they write code. Often they don’t recognize what they’ve done until they test their work and discover that they left out a huge swath of humanity when they designed their program.
This inadvertently programmed unconscious bias is deeply disconcerting. But it is also something that can be fixed. Programmers are becoming more aware of the impact of their unconscious biases, and they are working harder to keep those biases out of their code. If only all our unconscious biases could become that apparent to us, and the fixes that clear.