
Algorithmic Injustice? Racial bias and facial recognition
Special | 7m 15s | Video has Closed Captions
Critics argue that facial recognition technology, a tool for law enforcement, is discriminatory
A useful tool for consumers and law enforcement alike, facial recognition technology can help police officers identify—and ultimately charge—criminals caught on camera. But its critics argue that it's discriminatory: Research shows that facial recognition software often misidentifies people of color at a much higher rate than white individuals.
WKAR Specials is a local public television program presented by WKAR

- They told me I had an outstanding felony warrant out for my arrest.
I kind of figured it was a joke, until I got took out the car and put in handcuffs.
- [Narrator] In 2019, Michael Oliver was arrested during a routine traffic stop and charged with larceny for stealing a cell phone.
- I was charged with something that I didn't do, a crime that I didn't commit.
- [Narrator] Oliver was misidentified when Detroit police ran a facial recognition search in the state of Michigan photo database, using technology created by an outside company.
- We can expect those misidentification cases using face recognition to happen all over, especially when it comes to black and brown communities, because darker skin tones have a more difficult time being recognized by this technology.
- We simply don't know how many people not only have been falsely accused, but have been falsely convicted because of facial recognition technology.
Many people accept plea bargains, even when they're innocent, because it's the safest way to make a case go away.
- [Narrator] Facial recognition technology uses biometric algorithms to recognize and match images of individuals in large photo databases.
- The word biometrics is made up of two Greek words, Bios, which means life, and Metron, which means measurement.
And when we talk about biometric recognition, then we are talking about recognizing a person based on his or her body measurements, fingerprints, iris of the eye, and face.
- [Narrator] There are two types of biometric face recognition systems.
Authentication, like using your face to log into your phone, or using your passport photo to check in at an airport kiosk.
The other is search identification, which can be done using images captured on the closed circuit surveillance cameras that we see everywhere.
Both methods rely on a type of machine learning called deep learning.
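To make the two modes concrete, here is a minimal sketch in Python. It assumes the deep network has already reduced each face to a fixed-length embedding vector; the cosine-similarity matcher, the 0.6 threshold, and the gallery structure are illustrative assumptions, not a description of any deployed system.

```python
# Sketch of the two modes described above: 1:1 authentication vs. 1:N search.
# Assumes a deep network has already mapped each face to an embedding vector.
# The threshold and gallery layout are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def authenticate(probe: np.ndarray, enrolled: np.ndarray,
                 threshold: float = 0.6) -> bool:
    """1:1 verification, e.g. unlocking a phone: does the probe face
    match the single enrolled template?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe: np.ndarray, gallery: dict[str, np.ndarray],
             threshold: float = 0.6) -> str | None:
    """1:N search identification, e.g. matching a surveillance still
    against a large photo database; returns the best candidate or None."""
    best_id, best_score = None, threshold
    for person_id, template in gallery.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id
```

The 1:N search mode is the riskier of the two: in a large gallery, some stranger's embedding may land above the threshold purely by chance, which is how a probe photo can point investigators at the wrong person.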
- When you want to design a deep network for face recognition, you need large amounts of data.
By presenting the faces of different individuals, the network is supposed to learn.
- [Narrator] But a growing body of research, including from the National Institute of Standards and Technology and MIT, shows that these deep networks are currently also learning bias.
- This is largely thought to be because the technology has been trained on predominantly white images.
- If we don't show them enough variety of faces, then the network is not learning enough to separate different demographic groups.
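One way to see what the researchers are describing is to audit a matcher the way NIST-style evaluations do: score it on pairs of images of different people, grouped by demographic, and compare how often it wrongly declares a match. The sketch below is illustrative only; the pair data, group labels, matcher, and threshold are all assumptions.

```python
# Sketch of a per-group false-match audit. `impostor_pairs` yields
# (embedding_a, embedding_b, group_label) where the two embeddings belong
# to DIFFERENT people; `matcher` is any similarity function (e.g. cosine).
from collections import defaultdict

def false_match_rate_by_group(impostor_pairs, matcher, threshold=0.6):
    """Returns {group_label: fraction of impostor pairs wrongly accepted}."""
    trials = defaultdict(int)
    false_matches = defaultdict(int)
    for emb_a, emb_b, group in impostor_pairs:
        trials[group] += 1
        if matcher(emb_a, emb_b) >= threshold:  # wrongly judged "same person"
            false_matches[group] += 1
    return {g: false_matches[g] / trials[g] for g in trials}
```

If the rates differ sharply between groups, a single global threshold quietly trades accuracy for some faces against accuracy for others, which is consistent with the imbalanced-training-data explanation given above.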
- [Narrator] Critics say the use of facial recognition technology negatively impacts communities of color.
- Quite simply surveillance in the United States is more prevalent in black and brown communities.
A lot of the systems around the country, these are arrest databases, not conviction databases, meaning that if you're arrested and the charges are dropped, or you're adjudicated not guilty, your face still remains in this database.
Black people, particularly young black men are disproportionately enrolled in these databases.
- [Narrator] Many AI systems aren't adequately trained on enough black and brown faces, which leads to more frequent misidentifications of people of color from the arrest databases.
In the case of Michael Oliver, he's filed a $12 million lawsuit against the city of Detroit, alleging there wasn't proper investigative follow through by the police to corroborate his identification.
- He didn't even know until he was at court and saw the image that they were using, and he said, "That's not me."
And the judge was like, oh that's not you.
- The guy didn't look nothing like me, so it really shocked me.
I have multiple tattoos that I had for 10 years plus, and he, the guy, have no tattoos at all.
- There was this overwhelming, apparently, reliance on this flawed facial recognition system.
- A human, an officer, looking at those two photos should have been able to say "Hey, this is not the same person," but research shows that we are more likely to agree with what a machine tells us and override our common sense evaluation of whether two people are or are not the same person.
- [Narrator] Detroit police won't comment on Michael Oliver's case because of the pending litigation.
But his is one of two publicized false arrests of black men by Detroit police because of incorrect identification by facial recognition software.
Like Oliver, the other man was arrested for the non-violent crime of larceny.
At the urging of the public and activists, Detroit police instituted a facial recognition policy in September 2019 to prevent future misidentifications.
- Those cases were larceny cases, a crime that predates our policy, and would not be acceptable with our current policy.
What we have done in Detroit is built in safeguards to help overcome implicit bias.
We use it for the most violent crimes only.
It's not one analyst who's deciding that this person is the one who is the candidate for the probe photo, but that analyst has to have his or her decision corroborated by another analyst, and then once that's done, prior to going to the detective as a lead, the supervisor has to sign off on it.
So prior to policy none of that existed.
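As a rough illustration of the gate this policy describes, the sketch below models a candidate match that becomes an investigative lead only after a second analyst independently concurs and a supervisor signs off. The class and field names are hypothetical; this is not Detroit's actual system.

```python
# Hypothetical model of the corroboration workflow described above:
# two independent analyst confirmations plus supervisor sign-off are
# required before a match may go to a detective, and only as a lead.
from dataclasses import dataclass, field

@dataclass
class CandidateMatch:
    probe_photo: str        # the image being searched
    candidate_id: str       # person suggested by the face-recognition search
    analyst_confirmations: set[str] = field(default_factory=set)
    supervisor_approved: bool = False

    def confirm(self, analyst_name: str) -> None:
        """Record one analyst's independent agreement with the match."""
        self.analyst_confirmations.add(analyst_name)

    def is_investigative_lead(self) -> bool:
        """Eligible to go to a detective only as a lead, never as proof."""
        return len(self.analyst_confirmations) >= 2 and self.supervisor_approved
```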
- [Narrator] The larceny charges against Michael Oliver were dismissed at trial.
- You know, I lost a job, I couldn't pay rent.
I couldn't pay my car note.
I thought I was going to lose the trial and end up incarcerated.
- If this technology is not good enough to identify people accused of minor crimes, why is it good enough to be used to accuse people of the most serious crimes that can result in years or decades of incarceration?
- We have people that are suffering out there because they're a victim of robbery, they're a victim of aggravated assault, they're a victim of a homicide, and we have to keep up and use and leverage all the technology that we can.
- [Narrator] Most jurisdictions around the country use some sort of facial recognition technology in policing, but many don't have standards or protocols to catch incorrect identifications and prevent wrongful arrests.
- The more information the public has about face recognition the more interest there is in just banning it outright, or placing a pause on the use of the technology, until we put very clear controls in place, or until we have a better understanding about why it exhibits bias, and whether we can remove that bias.
- [Narrator] And some cities, including Boston and San Francisco, have banned the use of facial recognition by law enforcement, a move applauded by activists.
- I am not an advocate for face recognition to any degree.
I think that it's a slippery slope, and I think that always it's going to be used in a way that's going to violate the civil liberties of black and brown communities, and I think that that is a very dangerous trajectory to be moving in.
(dark electronic music) (light piano music)