The European Commission has accused TikTok and Meta Platforms of breaching transparency and accountability requirements set out in the Digital Services Act (DSA). According to the Commission's preliminary findings, both companies made it too difficult for researchers to access public data and failed to give users straightforward ways to report or appeal harmful content. If confirmed, the violations could result in fines of up to six percent of each company's global annual revenue.
The investigation marks another major step in Europe’s effort to hold large social media platforms accountable for how they manage user data, moderate content, and shape online discourse.
The European Commission stated that the internal systems at both Meta and TikTok do not meet the DSA’s standards for openness. Researchers and oversight bodies have reported that access to public data on these platforms is restricted by complex and inconsistent processes. Without adequate access, it becomes difficult to study how these services affect public health, mental well-being, and democratic participation.
The Commission also found problems with how users can report illegal or harmful content. The existing reporting tools are not user-friendly and often leave users uncertain about what happens after a complaint is submitted. In some cases, users who challenged moderation decisions were not given a transparent appeals process.
Meta and TikTok have received formal notification of the preliminary findings. Both companies now have the opportunity to respond in writing and explain what steps they plan to take to correct the issues.
The European Board for Digital Services will review the case before the Commission makes a final decision. If the companies are found to have violated the DSA, they could face heavy penalties, including the maximum fine of six percent of their annual global revenue. They could also be required to make structural changes to their data access systems and reporting processes.
Why transparency rules matter
The Digital Services Act is designed to create a safer and more accountable online environment in the European Union. By requiring large platforms to provide access to public data, the law allows researchers to evaluate how algorithms influence the spread of information and how moderation systems affect user rights.
When companies make that access difficult, it becomes harder to detect disinformation campaigns or patterns of harmful content. The Commission argues that such transparency is not optional but essential to maintaining public trust in digital platforms. For users, this means clearer reporting tools, greater insight into why content is removed, and more reliable ways to challenge moderation decisions.
Impact on the tech industry
The European Union’s investigation into TikTok and Meta could set a powerful example for the rest of the technology industry. The DSA requires designated very large online platforms operating in the EU to maintain clear procedures for data sharing, user protection, and risk mitigation. If these two companies are penalized, other firms may soon face similar scrutiny.
This case is being closely watched by both regulators and competitors. If the Commission enforces the maximum penalties, it could prompt significant changes in how global social media companies operate within Europe. It may also encourage new discussions around algorithmic transparency and user data protection worldwide.
What TikTok and Meta need to change
To comply with the DSA, Meta and TikTok must simplify how they allow independent researchers to access public data. Both companies could improve their compliance by creating standardized and secure data portals for verified research teams.
They also need to make reporting harmful content easier for users and ensure that appeal processes are clearly explained and consistently applied. Internal audits may be necessary to identify weaknesses in their moderation systems and to document how user reports are handled.
In addition, improving communication with regulators and civil society organizations could help rebuild public confidence. Transparency about how these companies meet their legal obligations will determine how quickly they can regain the trust of both users and policymakers.
The growing pressure for accountability
The investigation shows that even the most influential technology companies must adapt to the expectations of modern digital governance. As the European Union continues to strengthen enforcement of the Digital Services Act, Meta and TikTok now face a critical test of whether they can meet the legal and ethical responsibilities that come with their massive reach.
If they fail to comply, the financial consequences will be severe, but the reputational damage could be even greater. For users across Europe, this case represents more than just a regulatory dispute. It is a reminder that transparency, accountability, and access to information are becoming central values in how the digital world is governed.