After years of lenient regulations, major tech giants such as Meta, X (formerly Twitter), TikTok, and Google may soon find themselves facing stringent investigations and substantial fines within the European Union (EU) if they fail to adequately combat harmful content.
The European Commission will gain the authority to probe and impose fines, potentially up to 6% of their global annual revenue, on some of the most prominent social media and e-commerce platforms for non-compliance with the Digital Services Act (DSA). Additionally, the Commission could employ temporary bans against tech companies in exceptional circumstances.
A total of 19 companies had until late August to meet the comprehensive obligations set out in the EU’s content moderation law. Several of these digital firms had already attracted the Commission’s attention, drawing criticism from NGOs, politicians, and the general public over how they run their widely used platforms.
Here are some tech companies potentially subject to the European Commission’s scrutiny:
X (formerly Twitter):
With over 112 million European users, including influential figures, Twitter, now known as X, has long been on the Commission’s radar as a vital network. Elon Musk’s acquisition of the platform in 2022 drew further attention, as his actions and decisions sparked controversy and chaos.
In May, X withdrew from the EU’s voluntary code of practice against disinformation after struggling to meet its standards. The move drew condemnation from high-ranking Commission officials and from EU countries such as France. Musk has since said the platform will comply with the DSA, even volunteering it for a stress test, though doubts persist following staff cuts that hit content moderation teams.
Meta: Instagram and Facebook:
Following Facebook whistleblower Frances Haugen’s revelations about unsafe content on Facebook and Instagram, the two platforms are likely to face rigorous scrutiny given their vast user bases: 257 million and 258 million monthly EU users, respectively. Haugen’s insights, backed by internal documents, influenced the rules recommended to the European Parliament and Commission.
Meta was previously singled out by European Commissioner for Internal Market Thierry Breton for potential harm to children and misinformation surrounding elections. The company has allocated significant resources for DSA compliance.
TikTok:
TikTok has faced government restrictions over data privacy concerns and Beijing’s potential influence. Those allegations prompted lawmakers to urge that the DSA be used to investigate TikTok’s algorithms. The platform’s response, including a charm offensive, has not fully allayed concerns about harmful content.
Breton criticized TikTok’s practices and urged the company to accelerate DSA compliance. The platform has made efforts to align with the DSA, including measures to remove targeted advertising for teenage users.
Snapchat:
While less spotlighted, Snapchat’s popularity among young users could make it a relevant case for the Commission. The platform’s AI chatbot has attracted scrutiny amid concerns it could give self-harm advice. Snapchat conducted a voluntary compliance test and introduced safeguards for the chatbot.
Google: YouTube, Search, Play Store, Google Maps, Shopping:
Google’s wide-reaching platforms may face in-depth scrutiny. YouTube’s struggle to combat disinformation, and its algorithm’s tendency to amplify extreme content, present particular challenges. Google Search’s central role in how people find information underscores the need for effective content management.
Google is stepping up its DSA preparations, as evidenced by the opening of its “transparency center.”
Amazon and Zalando:
Amazon and Zalando are challenging their designations as very large online platforms, contesting the additional obligations that come with the label. Both face complexities tied to their hybrid roles as marketplaces.
Several platforms fall below the DSA’s user threshold of 45 million monthly EU users and are therefore exempt from its strictest content obligations. While some platforms’ user numbers remain undisclosed, the Commission is actively discussing possible further designations.
In essence, the EU’s regulatory landscape is evolving to hold tech giants accountable for harmful content. Companies’ compliance efforts and the Commission’s actions will shape the future of digital content management within the EU.