Thousands sign iPhone petition demanding Apple STOPS scanning private photos – even though it’s to catch paedos
THOUSANDS OF people have signed a petition demanding that Apple halt plans to scan iPhone users’ photos for child sex abuse imagery.
The open letter penned by a group of technology, security and legal experts warns that the proposal poses a serious threat to people’s privacy.
Apple announced last week that it was planning to scan photos on people’s iPhones for child sex abuse imagery (Credit: Alamy)
It garnered more than 6,000 digital signatures over the weekend from supporters across the globe.
Apple on August 5 unveiled plans to inspect U.S. iPhones for images of child sexual abuse before they’re uploaded to iCloud.
The move has drawn applause from child protection groups but raised concerns among security researchers and tech experts.
Those concerned claim the system could be misused – particularly by governments who may be looking to spy on their citizens.
In an open letter to Apple released following its announcement, experts warned that the scheme threatened to undo decades of work to keep users’ privacy safe from the technology they use.
“Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases,” the letter reads.
“We ask that Apple reconsider its technology rollout, lest it undo that important work.”
NEURALMATCH
Apple’s proposed technology works by continuously monitoring photos saved or shared on the user’s iPhone, iPad, or Mac.
The tool, called neuralMatch, is designed to detect known images of child sexual abuse by scanning photos before they are uploaded to iCloud.
If the system finds a match, the image will be reviewed by a human.
Once child sex abuse content has been confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.
Separately, Apple plans to scan users’ encrypted messages for sexually explicit content as a child safety measure, which also alarmed privacy advocates.
The detection system will, however, only flag images that are already in the center’s database of known child sex abuse images.
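For readers curious how this kind of hash-based matching works in principle, here is a minimal Python sketch. It is not Apple’s algorithm: it uses a simple “average hash” where Apple describes an undisclosed neural perceptual hash, the match threshold is an arbitrary number chosen for the example, and the hash database shown here is an empty placeholder.

```python
# A toy illustration of hash-based image matching, loosely in the spirit of
# systems like neuralMatch. This is NOT Apple's algorithm: it uses a simple
# "average hash", whereas Apple describes a neural perceptual hash, and the
# threshold below is an arbitrary assumption for demonstration only.
from PIL import Image


def average_hash(path: str) -> int:
    """Shrink the image to 8x8 greyscale and build a 64-bit hash:
    each bit records whether that pixel is brighter than the average."""
    pixels = list(Image.open(path).convert("L").resize((8, 8)).getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")


# Hypothetical database of fingerprints of known abuse imagery (in reality
# supplied as opaque hashes by a child-protection body, not raw images).
KNOWN_HASHES: set[int] = set()


def should_flag_for_human_review(photo_path: str, threshold: int = 5) -> bool:
    """Flag the photo if its hash is close to any known hash.
    Only flagged matches would go on to human review."""
    h = average_hash(photo_path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

A crude hash like this will match lightly edited copies of the same photo, but it also collides far more easily than a production system would tolerate; it is only meant to show the flag-then-review flow described above.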
BACKDOOR
The writers of the open letter said that, while child exploitation is a serious problem, Apple’s proposal “introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.”
They said that the technology “sets a precedent where our personal devices become a radical new tool for invasive surveillance”.
They also blasted the project for lacking the oversight needed to prevent eventual abuse and an “unreasonable expansion” of the scope of surveillance.
The group requests that Apple halt the deployment of its monitoring technology immediately and issue a statement reaffirming its commitment to user privacy.
Signatories include major organisations such as The New York Public Library and privacy groups including the Privacy Foundation.
GOVERNMENT PRESSURE
Tech companies including Microsoft, Google, Facebook and others have for years been sharing digital fingerprints of known child sexual abuse images.
Apple has used those to scan user files stored in its iCloud service – which is not as securely encrypted as its on-device data – for child sex abuse imagery.
The company has been under government pressure for years to allow for increased surveillance of encrypted data.
Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.
Apple said the latest changes will roll out this year as part of updates to its operating software for iPhones, Macs and Apple Watches.
“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement.
“With so many people using Apple products, these new safety measures have lifesaving potential for children.”
Meanwhile the Electronic Frontier Foundation, the online civil liberties pioneer, called Apple’s compromise on privacy protections a shocking about-face for users who have relied on the company’s leadership in privacy and security.
SPYING CONCERNS
Matthew Green, a top cryptography researcher at Johns Hopkins University, warned that the system could be used to frame innocent people by sending them seemingly innocuous images designed to trigger matches for child pornography.
That could fool Apple’s algorithm and alert law enforcement, Green said.
He added that researchers have been able to trick such systems pretty easily.
Other abuses could include government surveillance of dissidents or protesters.
“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked.
“Does Apple say no? I hope they say no, but their technology won’t say no.”
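Green’s point that the technology “won’t say no” can be seen in the toy sketch above: the matcher is indifferent to where its hash list comes from. In this purely hypothetical Python illustration (the file name and format are invented for the example), swapping in a different list of fingerprints changes what gets flagged without any change to the scanning code itself.

```python
# Purely hypothetical: the same matcher flags whatever hash list it is fed.
# Nothing in the scanning logic distinguishes a child-safety database from
# any other list of fingerprints a government might supply.
def load_hash_list(path: str) -> set[int]:
    """Read one hexadecimal 64-bit hash per line from a plain-text file."""
    with open(path) as f:
        return {int(line.strip(), 16) for line in f if line.strip()}


KNOWN_HASHES.clear()
KNOWN_HASHES.update(load_hash_list("some_other_list.txt"))  # the code cannot "say no"
```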
WhatsApp chief Will Cathcart has also chimed in, saying that his messaging app would not be adopting the safety measures and calling Apple’s approach “very concerning”.
In other news, a Google Maps fan has spotted a “secret” military base tucked away in the middle of the Sahara desert.
Samsung has teased a glimpse of the design for its highly anticipated Galaxy Z Fold 3 smartphone.
And, the next iPhone will come in a new pink colour and start at just under £800, according to recent rumours.