• 28-Apr-2023

Google, Meta, Apple on the radar in EU’s online content rules. What are they?


The rules, notified under the Digital Services Act, aim to tightly regulate the way big technology platforms moderate user content. Here is what the rules are, and how they compare with India's laws.

 

  • The European Union (EU) has confirmed the names of 19 platforms that will be subject to its landmark online content rules. Five subsidiaries of Google’s parent Alphabet, two Meta units, two Microsoft businesses, Apple’s App Store, Twitter, and Alibaba’s AliExpress are among the entities that the EU has identified.
  • The rules, notified under the Digital Services Act (DSA), aim to overhaul the EU’s social media and e-commerce rules and to tightly regulate the way big technology platforms moderate user content.

Which entities has the EU identified for regulation under the DSA?

  • The first set of entities identified by the EU for the strictest level of regulation under the DSA comprises 1) Alibaba AliExpress, 2) Amazon Store, 3) Apple App Store, 4) Microsoft Bing, 5) Booking.com, 6) Facebook, 7) Google Play, 8) Google Maps, 9) Google Search, 10) Google Shopping, 11) Instagram, 12) LinkedIn, 13) Pinterest, 14) Snapchat, 15) TikTok, 16) Twitter, 17) Wikipedia, 18) YouTube, and 19) Zalando.

 

What are the key features of the Digital Services Act?

  • Faster removals and provisions to challenge: As part of the overhaul, social media companies will have to add “new procedures for faster removal” of content deemed illegal or harmful. They will also have to explain to users how their content takedown policy works. The DSA allows users to challenge takedown decisions made by platforms, and to seek out-of-court settlements.
  • Bigger platforms have greater responsibility: The legislation does not take a one-size-fits-all approach and places increased accountability on Big Tech companies. Under the DSA, ‘Very Large Online Platforms’ (VLOPs) and ‘Very Large Online Search Engines’ (VLOSEs), that is, platforms with more than 45 million users in the EU, will face more stringent requirements.
  • Direct supervision by the European Commission: These requirements and their enforcement will be centrally supervised by the European Commission itself, an important safeguard against companies sidestepping the legislation at the member-state level.
  • More transparency on how algorithms work: VLOPs and VLOSEs will face transparency measures and scrutiny of how their algorithms work, and will be required to conduct systemic risk analysis and mitigation to drive accountability for the societal impacts of their products. VLOPs must allow regulators to access their data to assess compliance, and allow researchers to access their data to identify systemic risks of illegal or harmful content.
  • Clearer identifiers for ads and who’s paying for them: Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for them. They must not display personalised advertising directed at minors or based on sensitive personal data.

 

How does the EU’s DSA compare with India’s online laws?

  • In February 2021, India notified extensive changes to its social media regulations in the form of the Information Technology Rules, 2021 (IT Rules), which placed significant due-diligence requirements on large social media platforms such as Meta and Twitter.
  • These included appointing key personnel to handle law enforcement requests and user grievances, enabling identification of the first originator of information on their platforms under certain conditions, and deploying technology-based measures on a best-effort basis to identify certain types of content.
  • Social media companies have objected to some of the provisions in the IT Rules, and WhatsApp has filed a case against the requirement to trace the first originator of a message. One scenario in which the platform may be required to trace an originator is when a user shares child sexual abuse material on it.
  • However, WhatsApp has alleged that the requirement will dilute the encryption security on its platform and could compromise personal messages of millions of Indians.
  • Earlier this year, with a view to making the Internet “open, safe and trusted, and accountable”, the IT Ministry notified a contentious measure creating government-backed grievance appellate committees with the authority to review and revoke content moderation decisions taken by large tech platforms.