While Apple says it has previously rejected government requests that compromise user privacy, the company’s track record in China suggests otherwise.

Apple will reject requests by governments to use its child sexual abuse material (CSAM) detection system on non-CSAM images, the company said in a supporting document released this week.

Why is it important? Last Thursday, Apple announced a controversial plan to proactively scan iPhone users’ photos uploaded to iCloud for known CSAM and alert law enforcement if a user’s iCloud Photos library contains a collection of CSAM content. Apple’s plans have been heavily criticized by privacy advocates, many of whom argue that governments could ask Apple to use the same technology to censor other types of content, including voices critical of the government. Apple’s assurance that it will not allow this offers some comfort but does not go far enough to guarantee it.

“This is a surveillance system built and operated by Apple that could very easily be used to scan private content for anything they or a government want to control. Countries where iPhones are sold have different definitions of what is acceptable.” – WhatsApp CEO Will Cathcart

To learn more about how the CSAM detection technology works and what some of the most common concerns and questions are, read here. Read what industry leaders, technical experts, and civil society have said here.

What exactly did Apple say?

Could governments force Apple to add non-CSAM images to the hash list?

“Apple will reject such requests. Apple’s CSAM detection feature is designed solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC [National Center for Missing and Exploited Children] and other child safety groups. We have faced demands to build and deploy government-mandated changes that compromise user privacy before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear: this technology is limited to detecting CSAM stored in iCloud, and we will not accede to any government’s request to expand it. Additionally, Apple conducts a human review before reporting to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed with NCMEC.” – Apple

Can non-CSAM images be “injected” into the system to identify accounts for things other than CSAM?

“Our process is designed to prevent that from happening. The set of image hashes used for matching comes from known, existing CSAM images that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design.” – Apple
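The matching model Apple describes above — a single fixed list of known image hashes shipped to every device, compared against the hashes of a user’s photos, with human review triggered only past a match threshold — can be sketched roughly as follows. This is an illustrative simplification only: the hash values and threshold below are invented, and Apple’s actual system relies on NeuralHash and cryptographic threshold techniques rather than plain set lookups.

```python
# Hypothetical values for illustration; not Apple's real hashes or threshold.
KNOWN_HASHES = {"3f9a", "b07c", "d4e1"}   # fixed list shipped with the OS
REVIEW_THRESHOLD = 2                      # matches needed before human review

def count_matches(photo_hashes, known_hashes=KNOWN_HASHES):
    """Count how many of a user's photo hashes appear in the known list."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def flag_for_human_review(photo_hashes, threshold=REVIEW_THRESHOLD):
    """Flag an account only when matches meet the threshold; per Apple,
    a human reviewer then checks the flagged content before any report."""
    return count_matches(photo_hashes) >= threshold

# One match stays below the threshold; two matches trigger review.
print(flag_for_human_review(["3f9a", "1111"]))          # False
print(flag_for_human_review(["3f9a", "b07c", "2222"]))  # True
```

Because the same hash list ships to every device, the design (as Apple describes it) cannot single out one user for a different list — but it also means whatever is in that list gets scanned for on every device, which is the crux of the critics’ concern.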

Does this resolve the concern?

Apple’s statements in an official document offer some assurance that the technology will not be misused, but they don’t go far enough, for one simple reason: the company has previously compromised user privacy to appease governments and law enforcement agencies.

Apple’s compromises in China: Although it positions itself as a privacy advocate, Apple has put its customers’ data at risk and supported state censorship in China to appease the authorities there. Some of these compromises include:

  • Storage of customer data on servers of the Chinese government
  • Use of different encryption technology for Chinese customers’ data
  • Passing on customer data to the Chinese government
  • Proactive removal of apps that could offend Chinese authorities
  • Approval of almost all app deactivation requests from the Chinese government

Apple dropped its iCloud backup encryption plan: Earlier this year, Reuters reported that Apple dropped plans to fully encrypt iCloud backups after the FBI complained that the move would hinder investigations. “It shows how much Apple has been willing to help US law enforcement and intelligence agencies, despite taking a harder line in high-profile legal disputes with the government and casting itself as a defender of its customers’ information,” the report said.

What if governments make it law? Many of the compromises Apple has made in China stem in part from local laws. India’s new Information Technology Rules, 2021, for example, require platforms to develop tools to proactively remove content that the government deems illegal, which could include content critical of the government. Platforms have argued that this harms privacy and freedom of expression, but if Apple can take proactive measures for CSAM, the Indian government could demand that the same technology be modified to meet its requirements. The government could also ask other platforms like WhatsApp to implement similar technology to find images and videos it has deemed illegal.

Do you have anything to add? Subscribe to MediaNama and post your comment