Apple’s Plan to Search for Child Sexual Images Concerns Privacy Activists


This May 21, 2021 photo shows the Apple logo displayed on a Mac Pro desktop computer in New York. (AP Photo/Mark Lennihan)

Apple recently announced plans to use a tool designed to identify known images of child sexual abuse on iPhones.

The decision was praised by child protection groups. But some privacy activists and security researchers have raised concerns. They warn that the system could be misused to search for other kinds of information or be used by governments to watch citizens.

How does it work?

Apple says the tool, called “NeuralHash,” will scan images on a device that are set to be sent to iCloud, the company’s online storage system. iPhone users can choose in their settings whether to send photos to iCloud or keep them only on the device. Apple says images that are not sent to iCloud will not be scanned by the new tool.
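
A note for readers who know some programming: the rule Apple describes can be pictured as a simple gate. The short sketch below is only an illustration; the setting and field names are invented, not Apple’s real code.

```python
# Illustrative gate only; the setting and field names are made up.
def photos_to_scan(photos, icloud_photos_enabled):
    """Return only the photos the new tool would ever examine."""
    if not icloud_photos_enabled:
        return []  # photos kept on the device are never scanned
    return [p for p in photos if p["queued_for_icloud"]]

library = [{"name": "a.jpg", "queued_for_icloud": True},
           {"name": "b.jpg", "queued_for_icloud": False}]
print([p["name"] for p in photos_to_scan(library, True)])   # ['a.jpg']
print([p["name"] for p in photos_to_scan(library, False)])  # []
```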

The system searches for photos included in a database of known child sexual abuse images collected by law enforcement. Apple’s scanning system will change the images into a “hash.” This is a numerical piece of data that can identify the images but cannot be used to recreate them. This hash will be uploaded and compared against the law enforcement image database.
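
Apple has not published the NeuralHash algorithm itself. The sketch below uses a toy “average hash” as a stand-in to show the idea: similar pictures produce the same small fingerprint, and the fingerprint cannot be turned back into the picture. The 8-by-8 grid and all function names are illustrative assumptions, not Apple’s design.

```python
# Toy sketch of perceptual-hash matching. NeuralHash itself is a
# neural-network hash Apple has not published; this stand-in uses a
# simple "average hash". All names and parameters are illustrative.

def average_hash(gray_pixels):
    """Hash an 8x8 grid of grayscale values (0-255) into a 64-bit integer.

    Each bit records whether a pixel is brighter than the grid's mean,
    so visually similar images yield identical or nearby hashes.
    """
    flat = [p for row in gray_pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_known_database(image_hash, known_hashes):
    """Compare a device-side hash against the known-image hash database.

    The hash identifies an image but cannot be used to reconstruct it.
    """
    return image_hash in known_hashes

# Example: hash a small synthetic image and check it against a database.
img = [[40 + (r + c) % 3 for c in range(8)] for r in range(8)]
h = average_hash(img)
print(matches_known_database(h, {h}))  # True: the hash is in the database
```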

If the system matches an image with one in the database, it will be examined by a human. If the person confirms the image as a match, the device user’s account will be locked and the National Center for Missing and Exploited Children (NCMEC) will be contacted.
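
Put together, the reporting steps the article describes look roughly like the sketch below. Every name in it is hypothetical; Apple has not released this code.

```python
# Illustrative sketch of the match -> human review -> report flow
# described above. All names here are hypothetical assumptions.

class Account:
    def __init__(self, user_id):
        self.user_id = user_id
        self.locked = False

def human_reviewer_confirms(image_hash):
    # Stand-in for the human review step: a person, not the software,
    # makes the final call before anything is reported.
    return True  # assume the reviewer confirms the match in this example

def notify_ncmec(account):
    print(f"Report filed with NCMEC for account {account.user_id}")

def handle_match(image_hash, known_hashes, account):
    """Only hashes already in the known database ever reach a human."""
    if image_hash not in known_hashes:
        return                        # no match: the photo is never flagged
    if human_reviewer_confirms(image_hash):
        account.locked = True         # the user's account is locked...
        notify_ncmec(account)         # ...and NCMEC is contacted

acct = Account("user-123")
handle_match(0xABCD, {0xABCD}, acct)
print(acct.locked)  # True
```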

The system is designed to only identify images already included in the existing database. Apple says parents taking innocent photos of unclothed children need not worry about such images being identified.

FILE - A salesman uses his iPhone at a mobile phone store in New Delhi, India, July 27, 2016. (REUTERS/Adnan Abidi)

Concerns about possible abuse

Some security researchers have criticized the way NeuralHash “sees” images, saying the system could be used for dangerous purposes.

Matthew Green is a top cryptography researcher at Johns Hopkins University. He told the Associated Press that he fears the system could be used to accuse innocent people: bad actors could send users images that seem harmless but that the system would report as child sexual material. Green said researchers have been able to easily fool similar systems in the past.

Another possible abuse could be a government seeking to watch dissidents or protesters. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green asked. “Does Apple say no? I hope they say no, but their technology won’t say no.”

In an online explanation of its system, Apple said it “will refuse any such (government) demands.”

Apple has been under pressure from governments and law enforcement to permit increased observation of data that it encrypts on its devices. The company said its new tool was designed to operate “with user privacy in mind.” It also claimed the system was built to reduce the chance of misidentification to one in one trillion each year.
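
Apple did not explain how it arrived at the one-in-one-trillion figure. One common way to reach such a small number is to require several independent matches before an account is flagged. The arithmetic below is only an illustration with made-up rates, not Apple’s published error model.

```python
from fractions import Fraction

# Hypothetical arithmetic only: Apple has not published its error model.
# If each comparison false-matches independently with probability p, then
# requiring k independent matches multiplies the per-account error rate.
p = Fraction(1, 10_000)   # assumed per-image false-match rate (illustrative)
k = 3                     # assumed number of matches required (illustrative)
print(p ** k)             # 1/1000000000000 -- one in one trillion
```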

However, some privacy researchers said the system represents a clear change for a company that has been praised for its leadership on privacy and security.

In a joint statement, India McKinney and Erica Portnoy of the Electronic Frontier Foundation warned that Apple’s new tool “opens a backdoor to your private life.” The two noted that it may be impossible for outside researchers to confirm whether Apple is operating the system as promised.

Apple’s system was also criticized by former U.S. National Security Agency contractor Edward Snowden. Snowden lives in exile because he is wanted in the U.S. on spying charges linked to his release of information on secret government programs for gathering intelligence.

He tweeted that with the new tool, Apple was offering “mass surveillance to the entire world.” Snowden added: “Make no mistake, if they can scan for kiddie porn today, they can scan for anything tomorrow.”

Separately, Apple announced it was adding new tools to warn children and parents when sexually explicit images are received or sent. This system is designed to identify and blur such images and warn children and parents about the content. Apple says the tool will only work for messages in child accounts registered in the company’s Family Sharing system.
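
This messages feature is separate from the hash matching: it relies on an on-device classifier rather than a database of known images. The sketch below shows only the decision logic the article describes; the classifier and all names are assumptions.

```python
# Sketch of the separate messages feature: flag, blur, and warn, but only
# for child accounts enrolled in Family Sharing. All names are assumptions;
# Apple's on-device classifier is not public.

class MessagesAccount:
    def __init__(self, is_child, in_family_sharing):
        self.is_child = is_child
        self.in_family_sharing = in_family_sharing

def looks_sexually_explicit(image):
    # Placeholder for the on-device image classifier.
    return image.get("flagged", False)

def should_blur_and_warn(image, account):
    """Apply the warning flow only where the article says it operates."""
    if not (account.is_child and account.in_family_sharing):
        return False                       # other accounts are untouched
    return looks_sexually_explicit(image)  # classifier decides for children

print(should_blur_and_warn({"flagged": True}, MessagesAccount(True, True)))   # True
print(should_blur_and_warn({"flagged": True}, MessagesAccount(False, True)))  # False
```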

Apple said the changes will come out later this year with new releases of its device operating systems.

I’m Bryan Lynn.

The Associated Press, Reuters and Apple reported on this story. Gregory Stachel and Bryan Lynn adapted the reports for VOA Learning English. Mario Ritter, Jr. was the editor.



___________________________________________

Words in This Story

scan v. to look at (something) carefully usually in order to find someone or something

match n. a person or thing that is equal to another

cryptography n. the use of special codes to keep information safe in computer networks

encrypt v. to change (information) from one form to another especially to hide its meaning

surveillance n. the act of carefully watching activities of people especially in order to control crime or the spread of disease

porn (pornography) n. movies, pictures, magazines, etc., that show or describe naked people or sex in an open and direct way in order to cause sexual excitement

explicit adj. showing or talking about sex or violence in a very detailed way

blur v. to make (something) unclear or difficult to see or remember
