
Apple abandons controversial plan to check iOS devices and iCloud photos for child abuse imagery

CNN —

Apple is abandoning its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material (CSAM) following backlash from critics who decried the feature’s potential privacy implications.

Apple first announced the feature in 2021, with the goal of helping combat child exploitation and promoting safety, issues the tech community has increasingly embraced. But it soon put the brakes on implementing the feature amid a wave of criticism, noting it would “take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

In a public statement Wednesday, Apple said it had “decided to not move forward with our previously proposed CSAM detection tool for iCloud Photos.”

“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired. (Apple did not respond to CNN’s request for comment.)

Instead, the company is refocusing its efforts on growing its Communication Safety feature, which it first made available in December 2021 after consulting experts for feedback on its child protection initiatives. Communication Safety is an opt-in parental control feature that warns minors and their parents when incoming or sent image attachments in iMessage are sexually explicit and blurs those images.

Apple was criticized in 2021 for its plan to offer a different tool that would start checking iOS devices and iCloud photos for child abuse imagery. At the time, the company said the tool would turn photos on iPhones and iPads into unreadable hashes — or complex numbers — stored on user devices. Those numbers would be matched against a database of hashes provided by the National Center for Missing and Exploited Children (NCMEC) once the pictures were uploaded to Apple’s iCloud storage service.
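As a rough illustration of the hash-list matching the article describes, the sketch below hashes an image's bytes and checks the result against a set of known hashes. It uses an ordinary SHA-256 digest as a stand-in for Apple's perceptual NeuralHash, and the placeholder hash values and function names are hypothetical, not Apple APIs.

```swift
import Foundation
import CryptoKit

// Minimal sketch of hash-list matching. SHA-256 is only a stand-in for
// Apple's perceptual NeuralHash; the values and names below are
// hypothetical illustrations, not part of any Apple API.
func imageHash(_ imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// In Apple's described design, the known hashes would be supplied by NCMEC
// and matching would happen under additional cryptographic protections.
let knownHashes: Set<String> = [
    "placeholder-hash-1",
    "placeholder-hash-2"
]

func matchesKnownList(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}
```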

Many child safety and security experts praised the attempt, recognizing the ethical responsibilities and obligations a company has over the products and services it creates. But they also called the effort “deeply concerning,” largely because part of Apple’s process for checking child abuse images would run directly on user devices.

In a PDF published to its website outlining the technology, which it called NeuralHash, Apple attempted to address fears that governments could also force Apple to add non-child abuse images to the hash list. “Apple will refuse any such demands,” it stated. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.”

Apple’s announcement about killing its plans for the tool came around the same time the company announced a handful of new security features.

Apple plans to expand end-to-end encryption of iCloud data to cover backups, photos, notes, chat histories and other services, a move that could further protect user data but also add to tensions with law enforcement officials around the world. The tool, called Advanced Data Protection, will allow users to keep certain data more secure from hackers, governments and spies, even in the case of an Apple data breach, the company said.
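For readers unfamiliar with the term, “end-to-end” here means data is encrypted on the device with keys only the user holds, so the servers store only ciphertext. The sketch below illustrates that idea with AES-GCM from CryptoKit; it is an assumption for illustration only, not Apple’s Advanced Data Protection implementation, and the names in it are hypothetical.

```swift
import Foundation
import CryptoKit

// Sketch of client-side ("end-to-end") encryption: the symmetric key stays
// with the user, so the storage service only ever holds ciphertext.
let userKey = SymmetricKey(size: .bits256) // kept on the user's devices only

func encryptForUpload(_ plaintext: Data) throws -> Data {
    let sealedBox = try AES.GCM.seal(plaintext, using: userKey)
    // combined = nonce + ciphertext + authentication tag, safe to store remotely
    return sealedBox.combined!
}

func decryptAfterDownload(_ blob: Data) throws -> Data {
    let sealedBox = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(sealedBox, using: userKey)
}
```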
