WhatsApp suspended over 3 million Indian accounts from its platform between June and July 2021, a new compliance report from the social media platform reveals. Facebook, Instagram, and WhatsApp all released their monthly compliance reports on August 31, as required under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

Overall, the reports show an increase both in the amount of content that the platforms flagged and took action against (via their automated systems) and in the number of user complaints received, compared to their previous reports covering May and June 2021. These monthly reports show that major social media platforms like Facebook and WhatsApp are trying to comply with the requirements of the IT Rules, since failure to do so could lead to platforms losing their safe harbour immunity under the IT Act, 2000.

Content moderation on WhatsApp

The 3 million accounts WhatsApp blocked were identified through automated detection and in-app reports from users. However, according to WhatsApp, that number does not include user complaints received directly by its Grievance Officer for India, Paresh B Lal.

WhatsApp received 594 complaints from users relating to account support, appeals against bans, product support, safety, and other issues. Of these 594 complaints, action was taken on 74; the report said an action could mean blocking an account or restoring a previously blocked account.

The remaining 520 complaints were not actioned for one or more of the following reasons:

  • The user needed help to access their account
  • The user needed assistance to use any of WhatsApp’s features
  • The user wrote to give feedback
  • The user requested restoration of a blocked account and the request was denied
  • The reported account did not violate Indian law or WhatsApp’s Terms of Service

Content moderation on Facebook and Instagram

Facebook and Instagram disclosed their user complaint and content actioned figures jointly in a separate report. In almost all categories, the two platforms identified and acted against more pieces of problematic content than in their previous report, and received a higher number of user complaints.

Content actioned by Facebook

Here is a breakdown of all the problematic content that Facebook took action against, along with the percentage of such content that was flagged proactively by its automated systems:

Source: Facebook’s monthly compliance report

Content actioned by Instagram

Unlike Facebook, Instagram does not yet have a metric for spam content. Here is a breakdown of all the other content actioned:

Source: Facebook’s monthly compliance report

User complaints received on Facebook

In total, Facebook said it received 1,504 complaints from users via the contact form on its website and its Grievance Officer for India, Spoorthi Priya. Facebook responded to all of them; in 1,326 cases, it provided users with tools to resolve their issues themselves, including channels for reporting content for specific violations, self-remediation flows where users can download their data, and ways to address problems with hacked accounts.

The remaining 178 complaints were given a specialised review by Facebook, which resulted in action being taken on 44 of them. These actions include removing a post, covering it with a warning, or disabling the account.

Here is a breakdown by subject of these complaints:

Source: Facebook’s monthly compliance report

User complaints received on Instagram

Instagram received 265 complaints from users – a sharp increase from the 36 complaints noted in its previous report. Of these, 181 users were given tools to resolve their issues themselves, while the remaining 84 complaints received a specialised review, with action taken on 18 of them.

Source: Facebook’s monthly compliance report

What the IT Rules, 2021 require

The IT Rules require social media intermediaries to:

  • Proactively identify and remove content: This includes moderation (through automated mechanisms) of posts that are defamatory, obscene, pornographic, paedophilic, invasive of another’s privacy, or insulting or harassing on the basis of gender, among other categories.
  • Publish regular compliance reports: These reports should be published every month and include details of complaints received, actions taken and “other relevant information”.
  • Appoint key leadership roles: Significant social media intermediaries (with more than 50 lakh registered users) must appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer, all of whom must be based in India and be employees of the platform.
  • Disable content within 36 hours of an official order: The rules also require intermediaries to provide identity verification information, or to assist a government agency with crime prevention and investigation, no later than 72 hours after receiving a lawful order. They must also keep a record of disabled content for 180 days.
