Last Thursday, Apple announced new iPhone technology that will detect child sexual abuse imagery on customer devices in the United States. The material the system is designed to identify is referred to as CSAM – child sexual abuse material.
Before a photo is uploaded to iCloud, the new technology scans the image to check whether it matches known CSAM. If a match is found, the image is passed to a human reviewer and reported to law enforcement.
According to the BBC, the system works by comparing photos against a database of known sexual abuse images compiled by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations.
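Apple's actual system reportedly uses a perceptual hashing scheme designed to match images even after resizing or recompression, combined with cryptographic safeguards. As a rough illustration only, and not Apple's real algorithm, the sketch below shows the general idea of checking photos against a database of known-image hashes; the hash values and file paths are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical hash database of known abusive images, standing in for the
# kind of list a child-safety organization would supply (illustrative only).
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def photos_to_review(photo_dir: Path) -> list[Path]:
    """Return photos whose hashes match entries in the known-image database."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_IMAGE_HASHES]
```

Unlike this simplified sketch, which only catches byte-for-byte identical files, a perceptual hash is meant to survive common image edits, and reported matches are surfaced to a human reviewer before any report is made, as described above.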
The new technology has nonetheless drawn criticism over privacy concerns. Experts fear it could later be used to track political speech, scan personal content, or even spy on a country's residents.