The European Union has opened formal proceedings against social media giants Meta and TikTok, accusing them of widespread transparency breaches under its landmark Digital Services Act (DSA). This move signals a significant escalation in the EU’s efforts to hold big tech accountable, with potential fines reaching billions of dollars and profound implications for user privacy, content moderation, and platform design across the bloc.
On October 24, 2025, the European Union delivered a clear message to the world’s largest social media platforms: transparency is not optional. The European Commission officially accused Meta (operator of Facebook and Instagram) and TikTok of failing to comply with key provisions of the Digital Services Act (DSA). This legal action could lead to substantial financial penalties and reshape how these platforms operate, particularly concerning user data, content moderation, and advertising practices.
Understanding the Digital Services Act (DSA)
The Digital Services Act, which became fully applicable to all platforms in early 2024, represents the EU’s pioneering framework for online transparency and accountability. It’s designed to ensure that major online platforms take greater responsibility for the content distributed on their services and how their systems impact users. The DSA’s core objectives include:
- Combating illegal and harmful content effectively.
- Opening algorithms to public scrutiny.
- Ensuring researchers can access data to study platform effects.
- Protecting fundamental user rights online.
As Henna Virkkunen, the EU’s executive vice president for tech sovereignty, security, and democracy, stated on X, “Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.” For more details on the legislation, you can refer to an Associated Press overview of the Digital Services Act.
Key Accusations Against Meta and TikTok
The European Commission’s investigation, which began in 2024, highlighted several critical areas where Meta and TikTok allegedly fell short of DSA requirements:
- Inadequate Data Access for Researchers: Both companies are accused of failing to provide independent experts with “adequate access to public data.” This crucial requirement allows researchers to monitor issues like the spread of misinformation and children’s exposure to harmful content, which is vital for understanding the societal impact of these platforms.
- “Dark Patterns” and Content Reporting: For Meta’s Facebook and Instagram, regulators pointed to “deceptive interface designs,” commonly known as “dark patterns.” These designs make it confusing or discouraging for users to flag illegal content, such as child sexual abuse or terrorist material, or to challenge content moderation decisions effectively. This obfuscation makes user action “ineffective,” according to the Commission.
- Lack of Detailed Appeals Processes: Users whose content is removed or whose accounts are suspended on Meta’s platforms reportedly lack a clear and detailed way to appeal these moderation decisions. As EU digital spokesman Thomas Regnier emphasized, this undermines users’ ability to push back against unilateral actions by big tech.
- Unclear Advertising and Data Collection Practices: The EU also expressed concerns that neither TikTok nor Meta has adequately disclosed how it targets advertisements to specific audiences, particularly children and teenagers. This lack of transparency extends to their data collection methods, making it difficult for users to understand how their personal information is handled and raising significant privacy and data security concerns.
The Companies’ Responses and the GDPR Conundrum
Both companies have issued statements in response to the EU’s accusations. A Meta spokesperson, Ben Walters, stated that the company “disagrees with any suggestion that we have breached the DSA,” pointing to recent updates made to content reporting tools, appeals processes, and data access systems since the law’s inception. Meta insists these changes meet EU requirements and will continue to engage with regulators.
TikTok, owned by Chinese parent company ByteDance, affirmed its commitment to transparency. However, a spokesperson, Paolo Ganino, raised a significant point of contention: the potential conflict between the DSA’s data sharing obligations for researchers and the EU’s stringent General Data Protection Regulation (GDPR). “If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled,” Ganino said. This highlights a complex regulatory challenge, as the EU’s privacy rules under GDPR are among the world’s toughest.
Long-Term Impact for Users and the Tech Landscape
For the millions of users across Europe, these developments signify a push toward a safer and more transparent online experience. The EU’s commitment is clear: “We want our citizens to feel safe while using digital services,” an EU spokesperson noted. The implications are profound:
- Enhanced User Control: If the EU’s actions succeed, users could see more straightforward ways to report harmful content and challenge moderation decisions, empowering them in disputes with platforms.
- Improved Data Privacy: Greater transparency around advertising targeting and data collection, particularly concerning children, could lead to better privacy safeguards and more informed consent.
- Accountability for Algorithms: Increased data access for researchers means a clearer understanding of how algorithms amplify certain content and affect mental health, potentially leading to safer platform designs.
- Massive Financial Penalties: Failure to comply could result in fines of up to 6% of a company’s global annual turnover, which could amount to billions of dollars for these tech giants. This financial pressure serves as a strong incentive for compliance.
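To make the scale of the penalty cap concrete, here is a minimal sketch of the 6%-of-turnover calculation. The turnover figure below is purely illustrative, not any company’s actual revenue:

```python
def max_dsa_fine(global_annual_turnover: float) -> float:
    """Return the DSA's maximum fine: 6% of a company's global annual turnover."""
    return 0.06 * global_annual_turnover

# Hypothetical turnover of $150 billion (illustrative only)
turnover = 150_000_000_000
print(f"Maximum fine: ${max_dsa_fine(turnover):,.0f}")  # Maximum fine: $9,000,000,000
```

At that hypothetical scale, a single maximum fine would reach $9 billion, which is why even a small percentage cap translates into the “billions of dollars” the Commission can wield.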
This situation also reflects a broader global trend of increasing scrutiny on major social media players. Governments worldwide are intensifying efforts to hold big tech accountable under stricter regulations concerning advertising, data collection, and content moderation. For advocates of open, responsible technology, the DSA’s enforcement against Meta and TikTok is a critical step toward digital environments that prioritize user safety and democratic values over unchecked corporate interests.