Apple changes course on plan to scan users’ iPhones for child abuse pics as exec admits company ‘jumbled’ announcement

APPLE announced on Friday that it will only search devices for child sex abuse images that have been flagged in multiple countries, after the tech giant received backlash over its planned new scanning system.

In early August, the company announced plans to scan some photos on iPhones, iPads and Mac computers for images depicting child sex abuse.

Apple gave more details about its planned device-scanning program on Friday

The move angered privacy advocates and security researchers, who claimed the tool could be used for surveillance or government censorship.

On Friday, however, the company – which admitted its communications around the program had been “jumbled” – appeared to be opening up about its plans.

Having previously declined to say how many images would need to be detected on a phone or computer before the operating system notifies Apple for a human review, executives said on Friday that the threshold would start at 30, according to Reuters.

As the system improves, execs added, that number could be lowered.

CLEARER PLANS

Apple SVP Craig Federighi told The Wall Street Journal in an interview Friday: “If and only if you meet a threshold of something on the order of 30 known child pornographic images matching, only then does Apple know anything about your account and know anything about those images, and at that point, only knows about those images, not about any of your other images.” 

To address concerns that the new system could be used to target individuals, Apple said that the list of image identifiers being sought on one iPhone would be the same on all phones.

Apple announced at the beginning of August that its system would match photos against ones provided by the National Center for Missing and Exploited Children to detect child pornography.

The center is the only clearinghouse to have signed an agreement with Apple so far – though the company said it will only hunt for images that have been flagged by clearinghouses in multiple countries.
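
In rough terms, the mechanism Apple has described is a tally: the device compares image identifiers against the flagged list, and nothing reaches Apple until at least 30 match. The sketch below is purely illustrative – the names are hypothetical, and Apple’s actual system performs the matching on-device with cryptographic techniques rather than a plain set lookup.

```python
# Purely illustrative sketch of the 30-match threshold described above.
# All names are hypothetical; Apple's actual system matches image
# identifiers on-device with cryptographic techniques, not a plain set lookup.

REVIEW_THRESHOLD = 30  # matches required before Apple is notified for human review


def count_flagged_matches(photo_hashes, flagged_hashes):
    """Count how many of a device's photo identifiers appear on the flagged list."""
    return sum(1 for h in photo_hashes if h in flagged_hashes)


def should_notify_apple(photo_hashes, flagged_hashes):
    """Below the threshold, Apple learns nothing about the account or its images."""
    return count_flagged_matches(photo_hashes, flagged_hashes) >= REVIEW_THRESHOLD
```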


The company published a 14-page guide explaining the reasoning behind the system and its defenses against potential attacks.

Friday’s move comes after some Apple employees expressed concerns that the new tool could jeopardize the company’s reputation for protecting users’ privacy.

Apple declined to say whether criticism had changed any of its policies or the software itself.

ADVOCATES ANGERED

By providing more information, Apple hopes to win over critics who have been hostile to the plans.

“Our pushing is having an effect,” tweeted Riana Pfefferkorn, an encryption and surveillance researcher at Stanford University, according to Reuters.

Apple said last week that it will only check photos that are about to be stored on its iCloud online service.

The company later said that it would begin by only using the new tool in the United States.

Apple SVP Craig Federighi said that 30 images would have to be flagged before the operating system notifies Apple for a human review
