If you or anyone else has ever uploaded a picture with your face in it to Facebook, Twitter, Instagram, or just about any website, there's a very good chance you're in Clearview AI's database. What does that mean? Clearview AI employees, millions of law enforcement agents, and anyone with access to the company's data (which was recently exposed in a massive breach) can identify you as easily as snapping a photo with a smartphone or uploading a pic to an app.

The Australian government determined that Clearview AI posed a threat to its citizens, per a report from Gizmodo.

Clearview AI founder Hoan Ton-That rebuked the Australian ruling in an email to Gizmodo, claiming his product is an important tool for justice. Per the same article, Ton-That said:

"My company and I have acted in the best interests of these two nations and their people by assisting law enforcement in solving heinous crimes against children, seniors, and other victims of unscrupulous acts. We only collect public data from the open internet and comply with all standards of privacy and law. I respect the time and effort that the Australian officials spent evaluating aspects of the technology I built."

It's perfectly reasonable to assume that Clearview AI's founder, employees, investors, and partners are all interested in the pursuit of justice. But if we're going to make assumptions, we should make sure they're informed by evidence. For example, Clearview AI has deep, long-standing connections to right-wing extremists.

Per a 2020 article by Luke O'Brien, Ton-That was a prominent figure in the alt-right movement as far back as 2015. The same article points out that Clearview AI's "secret co-founder," white nationalist and avowed racist Chuck Johnson, initially intended for the product to be used by ICE.

But none of this answers the question of why someone with nothing to hide should be concerned about facial recognition.
The answer is: for the same reason people in the US with no intention of shooting anyone should have a problem with being told they can't keep and bear arms. Freedom from unreasonable government intrusion is in our Constitution because it's necessary for democracy to thrive.

When the US withdrew from Afghanistan earlier this year, some of its biometric equipment was left behind. As a result, the Taliban gained access not only to hardware capable of scanning fingerprints, irises, and faces, but also to the databases containing information on local civilians. Anyone who'd been scanned by US military forces was immediately identifiable to the Taliban.

We've since learned that the group was able to intercept a list of LGBTQPIA+ people in Afghanistan who'd reached out to aid organizations for fear they'd be discovered. Thanks to the US military's biometric equipment and databases, the Taliban doesn't have to rely on painstaking detective work to find individual members of the minority groups it hopes to kill; it can simply point a camera at everyone it sees and shoot the ones the algorithm matches to its list.

But what if you're in the US, you're not queer, and you have nothing to hide? I'll point you to something TNW's co-founder, Boris, wrote in one of his newsletters a while back:

"As World War II and the Vietnam and Gulf War conflicts taught us, there are limits to what should be allowed in the pursuit of safety and peace. The first thing dictators tend to do is take away people's privacy and the freedom to think what they want. The fact that you don't feel like you have secrets means you live in a society where you can be who you want to be. That's great for you, but it should be a reason to care even more. The fewer secrets you have, the more you should value privacy."

Right now, the people deciding whether Clearview AI should be allowed to operate are its own executives and the law enforcement community.
Those might not be the right people to determine the rules of engagement when matters of grave consequence, such as the constitutionally protected right to privacy every US citizen is guaranteed, are at stake.

Ultimately, none of us consented to Clearview AI's use of our images. Ton-That has lined his pockets selling a product built on our photos, and you and I haven't seen a penny of profit from it.

What Clearview AI is doing may not be illegal in the US yet, but it's clearly unethical. Just look at the great lengths the company has gone to in trying to erase any evidence of Ton-That's ties to right-wing conspiracy theorists, neo-Nazis, and white nationalists.

As it turns out, everyone has either something to hide or something to lose when their privacy is taken away.