Horst D. Deckert

Indiana Cop Used Facial Recognition Scans to Perform Non-Work-Related Searches


The facial recognition company used by law enforcement represents an unofficial public-private partnership.

The use of Clearview’s facial recognition tech by US law enforcement is controversial in and of itself, and it turns out some police officers can use it “for personal purposes.”

One such case happened in Evansville, Indiana, where an officer had to resign after an audit showed the tech was “misused” to carry out searches that had nothing to do with his cases.

Clearview AI, which has been hit with fines and much criticism – only to see its business grow stronger than ever – is almost casually described in legacy media reports as “secretive.”

But that sits awkwardly alongside another description of the company: as peddling to law enforcement (and, in the US, the Department of Homeland Security) some of the most sophisticated facial recognition and search technology in existence.

However, the Indiana case is not about Clearview itself – the only reason the officer, Michael Dockery, and his activities were exposed was a “routine audit,” as reports put it. And the audit was necessary to get Clearview’s license renewed by the police department.

In other words, the focus is not on the company and what it does (and on how much of that citizens are allowed to know) but on the fact that audits exist, and that they end up smoking out some cops who performed “improper searches.” It’s almost a way to assure people that Clearview’s tech is okay and subject to proper checks.

But that remains hotly contested by privacy and rights groups, who point out that Clearview is to the surveillance industry what Google is to the internet.

And the two industries meet here (coincidentally?) because face searches on the internet are what got the policeman in trouble. The narrative is that all is well with using Clearview – there are rules, such as having to enter a case number before running a dystopian-style search.

“Dockery exploited this system by using legitimate case numbers to conduct unauthorized searches (…) Some of these individuals had asked Dockery to run their photos, while others were unaware,” said a report.

But – why is any of this “dystopian”?

This is why. Last March, Clearview CEO Hoan Ton-That told the BBC that the company had by then run nearly one million searches for US law enforcement, matching them against a database of 30 billion images.

“These images have been scraped from people’s social media accounts without their permission,” a report said at the time.

