Apple to scan U.S. iPhones for child sexual abuse imagery


FILE – An Apple logo adorns the facade of the downtown Brooklyn Apple store in New York, Saturday, March 14, 2020. (AP Photo/Kathy Willens, File)

(NEXSTAR) – Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse, drawing applause from child protection groups but raising concern among some security researchers that the system could be misused by governments looking to surveil their citizens.

Apple said its Messages app will use on-device machine learning to warn about sensitive content without making private communications readable by the company. A separate tool, which Apple calls “neuralMatch,” will detect known images of child sexual abuse by checking photos against a database of digital fingerprints before they are uploaded to iCloud Photos, rather than by decrypting people’s messages. If it finds a match, the image will be reviewed by a human, who can notify law enforcement if necessary.
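
In broad strokes, hash-based matching of this kind reduces each image to a compact fingerprint and compares it against fingerprints of known abuse imagery. The sketch below is a simplified illustration only – it uses a toy “average hash” as a stand-in for Apple’s unpublished algorithm, and every hash value in it is invented:

```python
# Simplified sketch of perceptual-hash matching. The "average hash"
# below is a toy stand-in for proprietary systems such as Apple's
# neuralMatch or Microsoft's PhotoDNA; the known-hash values are
# invented for illustration.

def average_hash(pixels: list[list[int]]) -> int:
    """64-bit average hash of an 8x8 grayscale image: each bit is 1
    if that pixel is brighter than the image's mean brightness.
    Visually similar images tend to get identical or nearby hashes."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of bits on which two 64-bit hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical shared list of fingerprints of known abuse images.
KNOWN_HASHES = {0x8F3C0000_00FFA512, 0x12345678_9ABCDEF0}

def flag_for_human_review(pixels: list[list[int]], threshold: int = 4) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known
    fingerprint. A flag triggers human review, not automatic reporting."""
    h = average_hash(pixels)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```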

But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.

Matthew Green, a top cryptography researcher at Johns Hopkins University, was concerned that the system could be used against innocent people by sending them harmless images engineered to register as matches for child sexual abuse imagery, fooling Apple’s algorithm and alerting law enforcement – essentially framing them. “Researchers have been able to do this pretty easily,” he said.
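
Green’s concern is concrete: for simple perceptual hashes, constructing an input that collides with a target fingerprint is nearly trivial, and researchers have demonstrated harder versions of the same attack against neural hashes. A toy demonstration against the average hash from the sketch above:

```python
# Toy demonstration of Green's point: forging an input that collides
# with a target perceptual hash. Against the simple average hash this
# is trivial; attacks on neural perceptual hashes are harder but have
# been demonstrated by researchers.

def average_hash(pixels):  # same toy hash as in the earlier sketch
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def forge_colliding_image(target_hash: int) -> list[list[int]]:
    """Build an 8x8 grayscale image whose average hash equals
    `target_hash` exactly: bright pixels (200) where the target bit
    is 1, dark pixels (100) where it is 0."""
    flat = [200 if (target_hash >> (63 - i)) & 1 else 100 for i in range(64)]
    return [flat[r * 8:(r + 1) * 8] for r in range(8)]

target = 0x8F3C0000_00FFA512           # invented "known image" hash
forged = forge_colliding_image(target)
assert average_hash(forged) == target  # collides bit-for-bit
```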

Tech companies including Microsoft, Google and Facebook have for years been sharing “hash lists” of known images of child sexual abuse. Apple has also been scanning user files stored in its iCloud service, which is not as securely encrypted as its messages, for such images.

Some say this technology could leave the company vulnerable to political pressure in authoritarian states such as China. “What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for,’” Green said. “Does Apple say no? I hope they say no, but their technology won’t say no.”

The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data. Coming up with the new security measures required Apple to perform a delicate balancing act between cracking down on the exploitation of children and keeping its high-profile commitment to protecting the privacy of its users.

Apple believes it pulled off that feat with technology that it developed in consultation with several prominent cryptographers, including Stanford University professor Dan Boneh, whose work in the field has earned the Gödel Prize, one of theoretical computer science’s top honors.

The computer scientist who more than a decade ago invented PhotoDNA, the technology used by law enforcement to identify child pornography online, acknowledged the potential for abuse of Apple’s system, but said it was far outweighed by the imperative of battling child sexual abuse.

“Is it possible? Of course. But is it something that I’m concerned about? No,” said Hany Farid, a researcher at the University of California, Berkeley, who argues that plenty of other programs designed to secure devices from various threats haven’t seen “this type of mission creep.” For example, WhatsApp provides users with end-to-end encryption to protect their privacy, but employs a system for detecting malware and warning users not to click on harmful links.

Apple also described on its Child Safety page how the company is instituting new tools for parents in its messaging app:

When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.


“Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit,” Apple continues. “The feature is designed so that Apple does not get access to the messages.”
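
Expressed as simple decision logic, the flow Apple describes looks roughly like the sketch below. The names here are hypothetical, and the `is_explicit` flag stands in for the verdict of Apple’s on-device classifier, which has not been published:

```python
from dataclasses import dataclass

@dataclass
class FamilyAccountSettings:
    """Hypothetical opt-in settings for a child account in an iCloud
    family; the real configuration surface may differ."""
    communication_safety_enabled: bool = True
    notify_parents_on_view: bool = True  # child is told this up front

def handle_photo(is_explicit: bool, settings: FamilyAccountSettings) -> dict:
    """Sketch of the Messages flow Apple describes: blur the photo,
    warn the child with resources and reassurance, and (if enabled)
    tell the child that viewing or sending it will message their
    parents. `is_explicit` stands in for the on-device ML classifier's
    verdict, which never leaves the phone; the same warn-then-notify
    flow applies to both received and about-to-be-sent photos."""
    if not (is_explicit and settings.communication_safety_enabled):
        return {"blurred": False, "warned": False, "parent_alert_armed": False}
    return {
        "blurred": True,   # photo shown blurred by default
        "warned": True,    # resources shown; reassured it's okay not to view
        "parent_alert_armed": settings.notify_parents_on_view,
    }
```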

The communication safety tool will be available for U.S. accounts set up as families in iCloud, arriving in updates to iOS 15, iPadOS 15, and macOS Monterey later this year.

Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them. Law enforcement, however, has long pressed for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.

“Apple’s expanded protection for children is a game changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

Julia Cordua, the CEO of Thorn, said that Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.

The Associated Press contributed to this report.

Copyright 2021 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
