The European Commission said Friday that Meta (META.O) and TikTok violated their transparency obligations under the Digital Services Act (DSA) by failing to provide researchers with sufficient access to public platform data — a key requirement designed to allow oversight of harmful content.
The preliminary findings come as Brussels intensifies its scrutiny of Big Tech, enforcing the DSA’s mandate for large online platforms to provide robust mechanisms to mitigate illegal and harmful material, including misinformation, child exploitation, and extremist content.
According to the Commission, Meta’s Facebook and Instagram platforms lack a “user-friendly and easily accessible” system for reporting such content, while both Meta and TikTok have implemented “burdensome procedures” that discourage researchers from accessing public data. The Commission said Meta’s “deceptive interface designs” could confuse users and “dissuade” them from reporting problematic posts, weakening transparency and accountability.
Meta and TikTok disputed the allegations. A Meta spokesperson said the company had already made “substantial updates” to its reporting tools and data access systems since the DSA took effect, while TikTok said it remained committed to transparency but warned that relaxing data safeguards could conflict with the EU’s General Data Protection Regulation (GDPR), which governs privacy and data protection.
The Commission stressed that the companies have the opportunity to respond and remedy the alleged breaches before a final ruling is made. If confirmed, both could face fines of up to 6% of their annual global turnover — potentially billions of euros.
The case underscores the EU’s growing determination to test the reach of its new digital rules, which aim to make online platforms more accountable for how they handle user data, manage content, and protect public discourse.