In its petition against the Indian government over the IT Rules 2021, WhatsApp detailed how it deals with child sexual abuse content without breaking end-to-end encryption.
Implementing end-to-end encryption means that only the sender and recipient can decrypt and see the content of messages.
It’s worth noting that an ad hoc committee of the Rajya Sabha had recommended that law enforcement agencies be allowed to break end-to-end encryption in order to track down abusers. As a result, the IT Rules 2021 mandate that all major social media intermediaries such as WhatsApp must enable tracing of the originator of information classified as child sexual abuse material.
However, WhatsApp has long claimed that it has zero tolerance for the sharing of child sexual abuse material on its platform, which raises the question of how WhatsApp deals with such material when it cannot access the content of a message.
In the case of child sexual abuse content, WhatsApp relies on available unencrypted information, including user reports, profile photos, group photos, and group subjects and descriptions, to identify and prevent abuse.
- WhatsApp has previously said that it uses photo-matching technology called PhotoDNA to proactively scan profile photos for images of child abuse.
- When WhatsApp identifies a picture of child sexual abuse on its unencrypted surfaces, it removes the image and bans the user and associated accounts within a group.
- WhatsApp also shares the picture, along with the associated account details, with the National Center for Missing and Exploited Children (NCMEC). NCMEC in turn provides India’s National Crime Records Bureau with access to India-specific reports over a secure VPN (Virtual Private Network) connection.
- WhatsApp also provides the National Crime Records Bureau of India with a monthly report containing the NCMEC report IDs for Indian users.
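The matching workflow described above can be sketched in a greatly simplified form. Note that PhotoDNA itself is proprietary and uses robust perceptual hashes that survive resizing and re-encoding; the cryptographic hash and the `KNOWN_HASHES` set below are purely illustrative stand-ins for the real fingerprinting scheme and hash database.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images.
# In a real system (e.g. PhotoDNA), these would be perceptual hashes,
# not exact cryptographic digests.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()


def is_known_abuse_image(image_bytes: bytes) -> bool:
    """Check an unencrypted image (e.g. a profile photo) against the database.

    Only unencrypted surfaces can be scanned this way; message content
    itself remains end-to-end encrypted and inaccessible to the platform.
    """
    return image_fingerprint(image_bytes) in KNOWN_HASHES
```

A match would then trigger the actions listed above: removal of the image, banning of the associated accounts, and a report to NCMEC.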
What do the IT rules say?
Regarding child sexual abuse material, the IT Rules 2021, which went into effect on May 25, require all social media intermediaries to:
- Instruct users not to disclose information that is harmful to a child
- Remove child sexual abuse content within 36 hours of receiving a court order or being notified by the appropriate government agency
- Remove content of child sexual abuse within 24 hours if reported by the victim or by an individual on behalf of the victim as part of the complaints process
Major social media intermediaries like WhatsApp must also:
- Enable tracing of the originator of information classified as child sexual abuse material
- Build automated tools to proactively detect child sexual abuse content and notify users who try to access or share it
- Publish a regular monthly compliance report detailing the complaints received and the actions taken to address them