
Aug. 23, 2023

Law professor explores racial bias implications in facial recognition technology

Gideon Christian receives funding from Office of the Privacy Commissioner of Canada
Facial recognition technology illustration. iStock

We live in an era marked by rapid technological advancements that promise to make our lives easier and more efficient. Artificial intelligence (AI), algorithms and facial recognition have immense potential to positively transform our lives; in fact, they are used daily by millions around the world.

But these technologies are still in development, and many currently operate unchecked and with standardization gaps. It’s within those gaps that technology can inflict the most harm. That’s why assistant professor Gideon Christian’s area of expertise is how AI and the law intersect.

Christian’s work was recently awarded a $50,000 grant for a research project titled Mitigating Race, Gender and Privacy Impacts of AI Facial Recognition Technology, which aims to identify the complex issues surrounding private-sector development and deployment of AI-based facial recognition technology in Canada.

The effects of racially biased AI

“There is this false notion that technology, unlike humans, is not biased. That’s not accurate,” says Christian, PhD. “Technology has been shown (to) have the capacity to replicate human bias. In some facial recognition technology, there is over a 99 per cent accuracy rate in recognizing white male faces. But, unfortunately, when it comes to recognizing faces of colour, especially the faces of Black women, the technology seems to manifest its highest error rate, which is about 35 per cent.”

This is an unacceptable error rate, with damaging effects, Christian says. For people of colour, he says, “Facial recognition technology can wrongly match your face with that of some other person who might have committed a crime. All you see is the police knocking on the door, arresting you for a crime you never committed.”

Christian cites cases in the U.S. where this has happened. The headline of a May 2023 article in Scientific American minces no words.

But facial recognition technology misuse is not limited to the U.S., Christian notes, adding: “We know this technology is being used by various police departments in Canada. We can attribute the absence of similar cases to what you have in the U.S. to the fact that this technology is used without public disclosure. So, records may not exist, or if they do, they may not be publicized.

“What we have seen in Canada are cases of Black women, immigrants who have successfully made refugee claims, having their claims revoked on the basis that facial recognition technology matched their face to some other person. Mind you, these are Black women, the same demographic group where this technology has its worst error rate.”

Gideon Christian

Facial recognition can be trained to be biased

Facial recognition technology is as pervasive as it is invisible, and the effect it renders on people’s lives hides behind impenetrable lines of code, says Christian. “Racial bias is not new,” he notes. “What is new is how these biases are manifesting in artificial intelligence technology. And this technology, and this particular problem with this technology, if unchecked, has the capacity to overturn all the progress we achieved as a result of the civil rights movement.”

Which is why this is such an important area of research. Facial recognition technology is not only used by law enforcement; its applications are much broader, with the technology also being used in health care, education and finance. And, as the technology is still relatively new, the long-term consequences are as yet unknown.

It’s not that the technology is inherently biased, Christian says; it’s how it’s developed that can embed bias. If the data used to train the AI technology is itself biased, such as training facial recognition technologies using mostly white male faces and not faces of colour, he argues, the technology will inevitably make biased decisions.

The importance of addressing racial bias in technology

“The majority of us want to live in a society that is free of racial bias and discrimination,” says Christian. “That is the essence of my research, specifically in the area of AI. I don’t think we want a situation where racial bias, which we have struggled so hard to address, is now subtly being perpetuated by artificial intelligence technology.

“My research takes a microscopic view of these technologies with the purpose of identifying elements of racism so it can be stripped from this technology. If we are able to do that, AI will have the most transformative and positive impact on our lives, irrespective of our race or gender.”

And, with his new grant from the Office of the Privacy Commissioner of Canada, Christian will develop a framework to address the effects of AI facial recognition technology, to ensure that the technology is a force for good for all.

