
Anti-human technology is spreading to North America


Libby Emmons, Brooklyn, NY

Facial recognition tech is insidious and invasive. Users of the software claim it enhances safety, secures streets, and targets criminals, but in practice, it targets everyone and is anathema to liberty and individual rights. So it’s no wonder that the Chinese government is so adept at wielding this weapon. They use it against the Uyghur minority in Xinjiang, they’re rolling it out to track residents in public housing in Beijing, and it’s been deployed against protesters in Hong Kong. But the protesters are fighting back, and it’s spectacular.

Using laser beams to disrupt the cameras is a brilliant move and one that should be adopted by literally everyone whose face comes under the scrutiny of this tech. If you think this sounds paranoid, talk to artist Leonardo Selvaggio, who came up with the idea of combating the facial recognition tech used by the Chicago Police Department by handing out masks of his own face to everyone. That’s cute and all, but not practical. It’s fine to be recognized, but not to have your face data-mined.

Citizens across North America are speaking out, but law enforcement, border control, and travel security administrations don’t care. They’re far more enthralled with stories like this out of China, of a drug trafficker thwarted from escape by transit cams capturing his image and sending it along to authorities.

Parents love the idea that they can track their kids at summer camp, even while those kids are supposedly learning how to navigate independence. The Wall Street Journal reports that facial recognition tech is being used by schools for discipline. Once the tech is in place, there’s no limit on how it can be used. If we’ve discovered anything in this data age, it’s that once the data exists, it can be deployed to sell us things and to manipulate us.

Those in the business of thwarting crime would have us believe that only criminals need to fear, but it affects all of us. Once we’re afraid of our own image, we won’t move freely. We’ll be in prisons of our own manufacture. We don’t even know when it’s being used. It’s all over the place: in shopping malls, in sports venues, employed by local governments and private businesses, and much of it is purchased from companies like China’s ZTE. This raises concerns that there’s a back door in the tech that can be accessed by China’s authoritarian regime. The US government has already expressed these concerns.

We can try to stop it by speaking out, petitioning our elected representatives, and making clear our preference that individual citizens not be tracked. But so far, despite individual rights advocacy groups like the AI Now Institute sending up red flags, smoke signals, and reports about these dangers, it’s out there. On top of everything else, it often misidentifies people.

So how do we break out? Disruptive laser beams are a great option; wearing a “white man” mask is a less effective one, although if you really want to go for it, Eddie Murphy showed just how much you can get away with by fully embodying a white man.

There are a few ways to fool the tech. As the tech improves, and facial recognition is combined with gait identification, it will be harder and harder to move about freely. But until then, we can either obfuscate or confound it.

Obfuscation involves completely covering up, which is illegal in many Western nations, and is also not exactly inconspicuous, since you’d pretty much be the only one doing it. Balaclavas don’t tend to fool the tech, bandanas aren’t enough, and even religious modesty gear leaves enough of the eyes exposed for recognition to happen.

Confounding the tech is a little different, and involves makeup that makes your face look not like a face. Heavy contrast that disrupts the recognition points, like Juggalo makeup, can do it. The only problem is that the more you try to interfere with the cameras, the more you mess with your ability to move freely among actual human beings. The more invisible you are to the camera, the more you stand out in public.

The terrifying part is that there’s not much we can do to thwart this stuff from tracking us. There is literally no upside to facial recognition tech that is worth the cost of liberty and freedom of movement. We give away so many of our rights in the name of convenience. We give up our data to social media, our locations to any app that asks, we let Siri and Alexa listen just so they will be there when we reach out, forgetting about the long, digital tendrils that are reaching back. We give away our images to stupid apps that want to show us what we’ll look like when we’re old, when all we really have to do is look at our grandparents to see the truth of ageing.

It’s up to us to decide what we value more, our own liberty and a free society where we don’t have to obfuscate our images and confound computers, or being tracked, quantified, and categorized for our own safety. For my part, I’d rather be free than safe, and I’m pretty sure we can’t have both conditions simultaneously.
