August 26, 2021

Can Apple Right its Privacy and Security Cart?


For years, Apple was hailed as the company that “got” security and privacy. After all, it allowed users to engage in conversations protected by end-to-end encryption, and it rebuffed US Government requests to unlock the iPhones of domestic terrorists. However, Apple’s new plan to use machine learning techniques to search individuals’ iPhones for illegal content is raising questions about the company’s long-term commitment to individual privacy and security.

The change Apple announced on August 5 is designed to prevent the spread of Child Sexual Abuse Material (CSAM). When a user attempts to upload an image to iCloud, the iOS operating system will trigger machine learning algorithms that determine whether its hash matches any of the images logged in the CSAM database, a copy of which will be stored on the user’s device. In the event that enough images are matched to the CSAM database, law enforcement will ultimately be notified.

In its announcement, Apple says it developed its method “with user privacy in mind.” “Instead of scanning images in the cloud,” the company says, “the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

“Before an image is stored in iCloud Photos,” the company continues, “an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image.”
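
The general flow Apple describes can be sketched in a few lines of Python. The snippet below is purely an illustration of the idea, not Apple’s implementation: SHA-256 stands in for Apple’s perceptual NeuralHash, the hash database is an ordinary Python set rather than a blinded one, and the “safety voucher” is a plain record instead of an encrypted blob produced by private set intersection.

```python
# Illustrative sketch of on-device hash matching, loosely modeled on Apple's
# published description. NOT Apple's code: SHA-256 stands in for the
# perceptual NeuralHash, and the "voucher" is readable rather than encrypted.
import hashlib
from typing import Dict, Set


def image_hash(image_bytes: bytes) -> str:
    # Placeholder for a perceptual hash; a cryptographic hash only matches
    # byte-identical files, which is far weaker than a perceptual match.
    return hashlib.sha256(image_bytes).hexdigest()


def make_safety_voucher(image_bytes: bytes, known_hashes: Set[str]) -> Dict:
    # In Apple's design the match result is hidden inside an encrypted
    # voucher the device cannot read; here it is exposed for clarity.
    h = image_hash(image_bytes)
    return {"hash": h, "matched": h in known_hashes}


if __name__ == "__main__":
    known = {image_hash(b"known-contraband-sample")}               # stand-in database
    print(make_safety_voucher(b"vacation-photo", known))           # matched: False
    print(make_safety_voucher(b"known-contraband-sample", known))  # matched: True
```

Notably, nothing in a matching scheme like this is specific to CSAM: pointing the hash set at a different database repurposes the same mechanism, which is the crux of the criticism described below.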

A second system was also announced that detects “sexually explicit” images sent via iMessage. When a sexually explicit image is sent, the receiving phone will blur the image and show the child a notification asking whether they really want to view the unblurred image. If the child says yes, the child’s parents can optionally receive a notification. Apple also announced related changes to Siri and Search. These updates are set to be rolled out later this year in updates to iOS 15, iPadOS 15, and macOS Monterey, Apple said.
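
Apple has not published the logic behind the iMessage feature, but the decision flow it describes can be summarized in a short, hypothetical Python sketch. The classifier flag, the confirmation prompt, and the parental-notification setting are all assumptions made for illustration.

```python
# Hypothetical sketch of the iMessage decision flow Apple describes; the
# classifier and settings are assumptions, not Apple's published design.
from dataclasses import dataclass
from typing import List


@dataclass
class IncomingImage:
    flagged_explicit: bool  # output of an assumed on-device classifier


def handle_incoming(image: IncomingImage,
                    child_confirms_view: bool,
                    parent_notifications_enabled: bool) -> List[str]:
    """Return the sequence of UI actions taken for one incoming image."""
    if not image.flagged_explicit:
        return ["show image"]
    actions = ["blur image", "warn child"]
    if child_confirms_view:
        actions.append("show image")
        if parent_notifications_enabled:
            actions.append("notify parents")
    return actions


if __name__ == "__main__":
    print(handle_incoming(IncomingImage(flagged_explicit=True),
                          child_confirms_view=True,
                          parent_notifications_enabled=True))
    # ['blur image', 'warn child', 'show image', 'notify parents']
```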

The uproar from privacy and security advocates was immediate, and it was loud.

Apple’s CSAM architecture (from Apple’s “Expanded Protections for Children Technology Summary”)

“Apple can explain at length how its technical implementation will preserve privacy and security in its proposed backdoor, but at the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” wrote India McKinney, the Electronic Frontier Foundation’s Director of Federal Affairs, and Erica Portnoy, a senior staff technologist, on the organization’s blog.

“Apple’s compromise on end-to-end encryption may appease government agencies in the U.S. and abroad,” they continued, “but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

Jonathan Mayer and Anunay Kulshrestha, two Princeton academics who built a system similar to Apple’s, also took issue with Apple’s casual approach to snooping.

“We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous,” Mayer and Kulshrestha wrote in an August 19 Washington Post opinion piece.

“We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption,” the researchers wrote. “The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted.”

“But we encountered a glaring problem,” they continued. “Our system could be easily repurposed for surveillance and censorship. The design wasn’t restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.”

There would be nothing to stop a government, such as that of the People’s Republic of China, from using Apple’s backdoor to further its own ends. “What stops the Chinese government from demanding Apple scan those devices for pro-democracy materials?” they asked. “Absolutely nothing, except Apple’s solemn promise.”

On August 19, a coalition of more than 90 organizations, including the EFF, the Center for Democracy & Technology, and the ACLU, wrote a letter to Apple CEO Tim Cook asking him to reconsider the new feature.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter. “We urge Apple to abandon those changes and to reaffirm the company’s commitment to protecting its users with end-to-end encryption.”

Apple claims its private set intersection approach will set a strong threshold to prevent inadvertent decryption of legitimate content (from Apple’s “Expanded Protections for Children Technology Summary”)
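
Apple’s technical summary says the encrypted vouchers can only be opened once an account crosses a threshold number of matches, a property usually obtained with threshold secret sharing. The toy Shamir-style sketch below, written under assumed parameters rather than Apple’s published protocol, shows why holding fewer shares than the threshold reveals nothing useful.

```python
# Toy Shamir-style threshold secret sharing (illustration only, not Apple's
# protocol): a key split into shares cannot be recovered below the threshold.
import random

P = 2**61 - 1  # prime modulus for a small toy field


def split(secret: int, threshold: int, shares: int):
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]

    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

    return [(x, poly(x)) for x in range(1, shares + 1)]


def reconstruct(points):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for j, (xj, yj) in enumerate(points):
        num, den = 1, 1
        for m, (xm, _) in enumerate(points):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, -1, P)) % P
    return secret


if __name__ == "__main__":
    key = 123456789
    shards = split(key, threshold=3, shares=5)  # e.g., one share per matching image
    print(reconstruct(shards[:3]) == key)       # True: threshold reached
    print(reconstruct(shards[:2]) == key)       # False: below the threshold
```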

Technology firms are under no obligation to host unlawful materials on their servers. But Apple’s new surveillance system goes beyond that, argues Edward Snowden, the exiled former NSA contractor who blew the whistle on the Federal Government’s massive snooping apparatus.

“The task Apple intends its new surveillance system to perform–preventing their cloud systems from being used to store digital contraband, in this case unlawful images uploaded by their customers–is traditionally performed by searching their systems,” Snowden wrote on his blog yesterday. “While it’s still problematic for anybody to search through a billion people’s private files, the fact that they can only see the files you gave them is a crucial limitation.

“Now, however, that’s all set to change,” Snowden continued. “Under the new design, your phone will now perform these searches on Apple’s behalf before your photos have even reached their iCloud servers, and—yada, yada, yada—if enough ‘forbidden content’ is discovered, law-enforcement will be notified.”

The line separating data stored in the cloud from data stored on an individual’s device has been blurring for a while. With the ready availability of streaming content and fast wireless connections, many users are happy to let content providers stream music, videos, TV shows, and movies rather than store them locally.

While Apple has a noble goal in mind with its new CSAM detection system, the distinction between the rights and responsibilities of the individual and those of global tech firms and state actors is becoming thin.

“Neither private companies nor governments should search one’s personal effects without following due process of law, which bars the suspicion-less searches Apple now intends to conduct,” tweeted Tim Sweeney, the CEO of video game maker Epic Games, which is currently suing Apple in Federal Court over allegedly unfair practices in its App Store.

“Governments want this search capability but many, including America, are constitutionally prohibited from it. Perhaps Apple thinks that if they give governments this massive surveillance gift at this critical time, regulators will look the other way on their antitrust abuses,” Sweeney continued on Twitter. “My fear is that what Apple is ultimately trying to backdoor here is not our iPhones, but democracy and rule of law itself.”

Related Items:

China Passes Strict New Data Privacy Law

Data Privacy in the Crosshairs

Apple Puts a ‘Neural Engine’ Inside the iPhone
