“This directive marks the moment when China’s technology regulation not only keeps pace with data regulations in the EU, but goes beyond them,” said Kendra Schaefer, a researcher at Trivium China, in a tweet responding to the draft recommendation algorithm regulations published by the Cyberspace Administration of China on August 27.

China has heavily regulated large parts of its economy over the past year, particularly the technology sector. It has launched investigations against and reprimanded large corporations like Alibaba, Tencent, Didi, ByteDance, and others, and recently passed the Personal Information Protection Law (PIPL), one of the toughest privacy laws in the world. China’s crackdown is affecting not just the country but the world as a whole: its approach to regulating the tech industry will shape the approach of other countries.

Here are the key takeaways from the country’s latest draft regulations on recommendation algorithms.

The following content is based on unofficial translations of the original Chinese version provided by Stanford’s DigiChina Center and China Law Translate.

What are the goals of the new regulations?

According to Article 2, the rules apply to recommendation algorithms used within China, including algorithms of the following types: generative or synthetic, personalised recommendation, ranking and selection, search filtering, and dispatch and decision-making.

Responsibilities of companies using recommendation algorithms

  • Should be used to spread positive energy and for good: Article 6 of the draft states that companies must employ recommendation algorithms to uphold mainstream value orientations, optimise algorithmic recommendation service mechanisms, vigorously spread positive energy, and promote the use of algorithms for good.
  • Shouldn’t use illegal or discriminatory user tags: Article 10 states that companies should not enter unlawful or harmful information as user interest keywords or turn it into user tags for use as a basis for recommending information content, and they must not set up discriminatory or biased user tags.
  • Businesses need to be able to detect and stop harmful or illegal content: Article 9 requires companies to improve their means of identifying illegal and harmful content and to stop broadcasting such content as soon as it is discovered. Companies should also keep relevant records and report them to appropriate government agencies.
  • Labeling of algorithmically generated content: Article 9 requires that algorithmically generated or synthetic information may only be disseminated if it is marked as such.
  • Content in important sections such as front pages should reflect mainstream value orientations: In addition to establishing and perfecting mechanisms for manual intervention and autonomous user selection, Article 11 requires companies to present information that “reflects mainstream value orientations in key segments such as front pages and main screens, hot search terms, selected topics, topic lists, pop-up windows, etc.”
  • Should be transparent and explainable: Article 12 states that companies must make recommendation algorithms for search results, rankings, selections, push notifications, and other such use cases transparent and explainable, so as to avoid exerting a harmful influence on users or sparking controversies or disputes.
  • Be clear about why and how recommendation algorithms are used: Article 14 states that companies must clearly inform users of the circumstances of the algorithmic recommendation services they provide and publicize the rationale, purposes, operating mechanisms, etc. of those algorithmic services.
  • Algorithms serving minors must be adjusted accordingly: Article 16 states that companies providing recommendation algorithm services to minors must protect them in accordance with the law and enable minors to obtain information beneficial to their physical and mental health through models suitable for minors.
  • Algorithms must protect workers’ rights: Article 17 is aimed at algorithms that determine work schedules, such as those for delivery workers on food delivery platforms, and states that such algorithms must function to protect workers’ rights and interests.
  • Algorithms must protect consumer rights and must not apply unreasonable differential treatment: Article 18 states that companies using recommendation algorithms to sell products or provide services to consumers must protect the legitimate rights and interests of consumers, and must not use algorithms to unreasonably differentiate conditions such as transaction prices based on consumers’ preferences, trading habits, and other such characteristics.
  • Obligations of providers of recommendation algorithms with public opinion properties or social mobilisation capabilities: Article 20 states that such providers must file the provider’s name, the type of service, the application domain, the algorithm type, the algorithm self-assessment report, the content intended for publication, and other such information within 10 working days of beginning to provide the services. Article 23 states that these providers must also carry out a security assessment in accordance with the relevant government regulations.
  • Personal data of users must be protected: Article 25 states that organisations and personnel involved in the security assessment, monitoring, and inspection of recommendation algorithms must keep the personal data, private information, and trade secrets they learn of strictly confidential, and must not disclose, sell, or unlawfully provide them to others.

What should recommendation algorithms not be used for?

  • Should not lead users to addiction or excessive consumption: Article 8 states that companies must regularly review their algorithmic mechanisms, models, data, application results, etc., and must not set up algorithmic models that induce users into addiction, excessive consumption, or other such behaviour.
  • Should not be used to harm national security or social order: Article 6 also states that companies using recommendation algorithms must not engage in activities that harm national security, disrupt economic and social order, infringe on the lawful rights and interests of others, or in other such acts forbidden by laws and administrative regulations.
  • Algorithms should not be used to create fake accounts, likes, or comments: Article 13 states that companies must not use algorithms to create fake accounts or fake likes, comments, re-shares, etc.
  • Algorithms should not be used to manipulate results: Article 13 also states that companies must not use algorithms to block information, over-recommend, manipulate related lists or search results, or control hot search terms or selections. Companies must also refrain from using algorithms for preferential treatment of themselves, unfair competition, influencing online public opinion, or evading supervision and management.
  • Should not lead minors to harmful tendencies or online addiction: Article 16 states that algorithms aimed at minors must not encourage minors to mimic unsafe behaviour, engage in acts that violate social norms, or develop harmful tendencies that affect their physical or mental health. Algorithms should not lead minors to online addiction either.

What rights are granted to users by the draft ordinances?

  • Users must be given control over what information algorithms use: Article 15 states that users must be given the option to turn off recommendations targeting their individual characteristics, and must also be able to select, revise, or delete the user tags these algorithms use.
  • Users must be able to opt out of algorithmic recommendation services: Article 15 states that companies must provide users with a convenient way to turn off algorithmic recommendation services and immediately stop providing such services if the user so chooses.
  • Users can seek remedies if algorithms significantly affect their rights and interests: Article 15 states that if recommendation algorithms have a major influence on users’ rights and interests, those users have the right to request an explanation from the company using these algorithmic services, as well as measures to improve or remedy the situation.
  • Users must be able to conveniently file complaints: Article 26 states that companies must accept social oversight and put in place convenient channels for users to submit complaints, which must be dealt with promptly. Users must also be able to raise objections and receive feedback.

Role of government

  • Tiered classification of companies that provide recommendation algorithms: Article 18 states that the government will put in place a categorised and tiered management system for companies providing recommendation services, based on the algorithms’ public opinion properties or social mobilisation capability, the types of content, the scale of users, the sensitivity of the data handled by the algorithms, the degree of intervention in user behaviour, etc.
  • Companies must cooperate with government assessments: Article 24 states that relevant competent government agencies will conduct security assessments, monitoring, and inspections of recommendation algorithms, suggest corrections for any problems discovered, and set a deadline for rectification. Companies must cooperate with the departments responsible for this work and provide the necessary technical and data support and assistance. Article 23 also states that companies must retain logs for a period of at least six months.
  • Penalties for violations: Depending on the article violated, penalties range from a warning and an order to rectify, to a fine of up to 30,000 yuan and suspension of services. Violations of other laws, such as the Personal Information Protection Law, will be prosecuted under those laws.
