The European Commission (EC) has issued preliminary findings that TikTok and Meta’s Facebook and Instagram have failed to meet transparency obligations required by the Digital Services Act (DSA).
Its assessment indicates both TikTok and Meta did not grant researchers adequate access to public platform data. Under the DSA, such access is required to support scrutiny of online risks and content moderation.
The EC stated that the processes put in place by Facebook, Instagram, and TikTok make it difficult for researchers to obtain reliable or complete data.
According to the EC, this hampers research into users’ exposure, including that of minors, to illegal or harmful content.
The DSA mandates platforms to provide access to public data for independent research as a mechanism for ensuring accountability in the digital ecosystem.
The EC has emphasised that access to such data is central “as it provides public scrutiny into the potential impact of platforms on our physical and mental health.”
Additionally, the EC’s findings address Meta’s compliance with obligations to provide straightforward mechanisms for users to notify platforms of illegal content.
The current systems on Facebook and Instagram reportedly require users to navigate complex processes and extra steps to report issues such as child sexual abuse material or terrorist content.
The EC also noted that both platforms have incorporated ‘dark patterns’ in their interfaces relating to content reporting, which it describes as potentially confusing and discouraging for users attempting to flag content.
As a result, the EC stated, Meta’s mechanisms for flagging and removing illegal content may be ineffective.
The DSA stipulates that such mechanisms are essential for enabling EU users and trusted flaggers to report content that may breach legal standards.
The EC has also outlined shortcomings in Meta’s appeal processes related to content moderation.
Current procedures for contesting content removals or account suspensions on Facebook and Instagram do not allow users to provide additional explanations or evidence in support of their appeals.
This restriction may limit users’ ability to challenge moderation decisions effectively.
These preliminary findings are based on ongoing investigations and collaboration with Ireland’s Digital Services Coordinator, Coimisiún na Meán.
In addition, the EC has clarified that these are initial conclusions and do not represent final determinations.
EC Tech Sovereignty, Security and Democracy executive vice president Henna Virkkunen said: “Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny.
“The DSA makes this a duty, not a choice. With today’s actions, we have now issued preliminary findings on researchers’ access to data to four platforms. We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society.”
TikTok and Meta have an opportunity to review the investigation files and respond in writing. They may also implement measures to address the identified breaches.
The European Board for Digital Services will be consulted as part of the process.
If the Commission confirms the breaches, it may issue a formal non-compliance decision. This could result in fines of up to 6% of the provider’s global annual turnover and the imposition of periodic penalty payments.
From 29 October 2025, researchers will gain new rights to access non-public data from major online platforms and search engines under a delegated act, expanding the scope of digital transparency and risk assessment.
