
AI has made plenty of silly mistakes, leading to chuckles and humorous comments. But it’s all fun and games until someone gets hurt. Sadly, this is exactly what happened to a pregnant woman in Detroit who was mistakenly arrested due to a flaw in facial recognition software.
Detroit police arrested 32-year-old Porcha Woodruff after AI facial recognition software mistook her for someone else. Woodruff, who was very visibly eight months pregnant, was falsely identified as a suspect in a robbery and carjacking. She was held for 11 hours, questioned, and had her iPhone confiscated. And you know what’s even crazier? The surveillance video that led to Woodruff’s false identification didn’t even show a pregnant woman!
How did this happen?
The story began when a robbery victim reported his case to the Detroit Police Department. Using a system called DataWorks Plus, police compared the surveillance video to a database of criminal mug shots. This is when a 2015 mug shot of Woodruff popped up as a match. A mug shot from an unrelated case, of course. To make matters worse, the victim mistakenly confirmed her identity from a lineup of photos.
Woodruff was arrested and released on a $100,000 bond after 11 hours in detention. The Wayne County prosecutor eventually dismissed her charges. However, Woodruff’s ordeal took a toll on her health, leading to hospitalization for dehydration and leaving her traumatized.
After the whole incident, Woodruff reportedly filed a lawsuit against the city of Detroit. The city’s police chief, James E. White, said that the “allegations are concerning” and that the police are taking the matter seriously.
Woodruff’s attorney Ivan L. Land warns about the risks of over-relying on facial recognition. “It’s scary,” he says, “someone always looks like someone else.”
The dangers of AI facial recognition
Needless to say, this isn’t the first time something like this has happened. According to The New York Times, this incident is the sixth recently reported case in which someone was falsely accused due to an AI facial recognition mistake. And all six of the falsely accused people have been Black. The Times notes that the Detroit Police Department uses the technology primarily on Black men, conducting around 125 searches annually.
A 2022 Georgetown Law report raised similar concerns about the technology’s reliability. “Despite 20 years of reliance on face recognition as a forensic investigative technique, the reliability of face recognition as it is typically used in criminal investigations has not yet been established,” the report states.
So why do these mistakes happen? There are several reasons. First, police forces use algorithms that haven’t been proven effective. Then there are biased training datasets, which we’ve discussed before. Varying photo angles and low-quality surveillance images make matches even less reliable. Adding to the concern is “automation bias,” the tendency to trust a machine’s decision even when the evidence suggests otherwise. Put all of this together and you get plenty of room for misidentification, poor decisions, and, ultimately, harm to someone who is completely innocent.
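To get a feel for why “someone always looks like someone else,” here is a minimal sketch of a one-to-many face search. It is not DataWorks Plus’s actual pipeline; it assumes toy 128-dimensional embeddings, random stand-in data, and cosine similarity. The probe person is never enrolled in the gallery, yet the search still returns a best “match” — and the bigger the database, the more convincing that lookalike’s score becomes.

```python
# Toy 1:N face search: compare one probe embedding against galleries of
# increasing size and report the best cosine-similarity "match".
# The probe subject is never enrolled, so every best match is a stranger.
import numpy as np

rng = np.random.default_rng(42)
DIM = 128  # assumed face-embedding dimensionality (illustrative only)

def unit(v):
    """L2-normalize so a dot product equals cosine similarity."""
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Embedding extracted from (hypothetical) low-quality surveillance footage
probe = unit(rng.normal(size=DIM)).astype(np.float32)

for gallery_size in (1_000, 50_000, 500_000):
    # Random stand-ins for enrolled mug-shot embeddings
    gallery = unit(rng.normal(size=(gallery_size, DIM))).astype(np.float32)
    best = float((gallery @ probe).max())
    # The larger the database, the closer the best lookalike appears,
    # even though the true subject is not in it at all.
    print(f"gallery of {gallery_size:>7,}: top match similarity = {best:.2f}")
```

The search always hands investigators its closest candidates, whether or not the real person is in the database. Combine that with automation bias, and a top-ranked lookalike with a confident-looking score can easily turn into an arrest.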
[via ArsTechnica]