Five months is a long time to sit in a jail cell, but for Angela Lipps, a 50-year-old grandmother from Tennessee, the nightmare began with a software match. It's a scenario no one wants to imagine: a warrant for bank fraud, an arrest, and the slow realization that the person sitting in the cell isn't actually a criminal. Lipps was arrested in July 2025, not for a crime she committed, but because an AI system thought she looked like someone else. She was in Tennessee, but the warrant came from North Dakota, a state she says she'd never even visited.
The Digital Mistake
Here's how it happened. Police in Fargo, North Dakota, were looking for someone involved in a bank fraud case. They turned to the West Fargo Police Department, which was using Clearview AI. The platform is controversial: it scrapes billions of photos from the internet, including social media, to build a database for law enforcement. The AI flagged Angela Lipps based on a photo from a fake ID used in West Fargo. The report was forwarded to Fargo detectives without a crucial detail: the identification was an AI match, not a confirmed sighting from surveillance footage.
Why It Happened
Fargo police, unaware that the identification had come from a facial recognition algorithm rather than confirmed surveillance footage, built their entire case on a "match" that wasn't there. They issued a warrant on July 1, 2025, and detained her on July 14. For five months, she sat in jail while prosecutors tried to build a case around an algorithm's output. It should have been simple. Basic police work would have checked whether Lipps had ever been to North Dakota, but the assumption was that if the AI said she was there, she was. No one verified the reality. Lipps' own bank records eventually showed she was in Tennessee during the crimes, a fact that should have been obvious to anyone looking at them, yet it took months to uncover.
The Fallout
It wasn't until December 12, 2025, that the State's Attorney's Office saw the evidence. By December 23, the charges were dismissed without prejudice, and she was released on Christmas Eve. The damage, however, was already done. Lipps' lawyers have called the detention "unnecessary" and pointed out that "the trauma, loss of liberty, and reputational damage cannot be easily fixed." It's a harsh reality of our digital age: your digital footprint, whether a 2015 vacation selfie or a professional headshot, can be used against you, even when the match is completely wrong.
Chief Dave Zibolski from the Fargo Police Department confirmed that his department doesn’t run the facial recognition system, but they relied on a report from a neighboring agency. He acknowledged that gaps in investigation contributed to the error, though he didn’t issue an apology, only pointing to “a few errors.” The department has since banned using the West Fargo AI information, added monthly oversight, and promised improved warrant procedures.
This case highlights a terrifying gap in modern law enforcement. When a tool like Clearview AI is used as the "only tool," without human verification, mistakes are inevitable. As one expert noted, "It's not just a technology problem, it's a technology and people problem." The system failed because it relied on the speed of code over the scrutiny of a human mind, a combination that can easily turn an innocent grandmother into a defendant.
