The European Commission said Friday that Meta Platforms and TikTok have breached transparency obligations under the European Union's Digital Services Act, accusing both companies of failing to give researchers adequate access to public data and warning that violations could result in fines of up to 6% of their global annual revenue.

In preliminary findings, the EU executive said Meta's Facebook and Instagram, along with TikTok, had imposed "burdensome procedures and tools" that left researchers with "partial or unreliable data," hampering efforts to assess whether users - including minors - are being exposed to illegal or harmful content. The Commission also found that Meta failed to offer users "simple mechanisms" to report illegal material or challenge moderation decisions, calling the company's interfaces "confusing and dissuading."

"Allowing researchers access to platforms' data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health," the Commission said in its statement.

Meta and TikTok pushed back on the findings. "We disagree with any suggestion that we have breached the DSA, and we continue to negotiate with the European Commission on these matters," Meta spokesperson Ben Walters said. "In the European Union, we have introduced changes to our content reporting options, appeals process, and data access tools since the DSA came into force and are confident that these solutions match what is required under the law."

A TikTok spokesperson said the company "is committed to transparency and values the contribution of researchers," noting that "almost 1,000 research teams have been given access to data through our Research Tools to date." The spokesperson added that the company was reviewing the Commission's findings but argued that "requirements to ease data safeguards place the DSA and GDPR in direct tension. If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled."

The preliminary findings form part of the EU's sweeping enforcement of the Digital Services Act, which took full effect in 2024 and holds major online platforms accountable for the spread of misinformation and harmful content, as well as for opaque data practices. The law requires social media giants and search engines to implement robust mechanisms for content moderation, transparency, and user protection.

If the Commission's view is upheld after consultation, both companies could face significant financial penalties. For Meta and for TikTok's parent company, ByteDance, a fine of up to 6% of global annual revenue could run into billions of euros. Both companies have been invited to examine the findings and submit written responses before the Commission issues a final decision.

The scrutiny adds to a series of European actions against Big Tech firms under the Digital Services Act, the Digital Markets Act, and the bloc's privacy rules. In April, the Commission fined Meta €200 million under the Digital Markets Act over its handling of user consent for data collection, while Ireland's Data Protection Commission fined TikTok €530 million earlier this year for transferring European user data to China.