The European Commission has issued preliminary findings against two of the world's largest social media platforms, Meta and TikTok, alleging significant breaches of the Digital Services Act (DSA). The findings underscore the European Union's assertive stance on regulating online content and platform accountability, signaling a new era of digital governance that demands greater transparency and user protection from tech giants. The preliminary assessment indicates that both companies have failed to provide independent researchers with adequate access to public data, and that Meta has additionally implemented user interfaces that complicate the reporting of illegal content and the appeal of moderation decisions.
The Core Allegations: Transparency and User Experience
At the heart of the Commission's concerns lies the accessibility of platform data for academic and independent researchers. The EC's investigation found that both Meta, the parent company of Facebook and Instagram, and TikTok have established "burdensome" procedures and tools for researchers attempting to access public data. These hurdles often leave researchers with incomplete or unreliable data, severely impeding their ability to conduct essential studies. Such research is crucial for understanding the systemic risks the DSA requires platforms to address, including the exposure of users, particularly minors, to illegal or harmful content. The inability to thoroughly investigate these areas leaves a critical gap in public understanding and oversight of platform impacts.
Beyond data access, Meta's platforms, Facebook and Instagram, face additional accusations related to user experience and content moderation. The Commission alleges that Meta has not provided EU residents with sufficiently simple and intuitive mechanisms to report illegal content. Instead, users are reportedly subjected to multiple unnecessary steps, a practice the EC characterizes as the use of "dark patterns": manipulative interface elements intended to steer users toward certain actions or deter them from others, in this case making it harder to flag problematic content. The Commission warned that such practices could be "confusing and dissuading," ultimately rendering Meta's mechanisms for flagging and removing illegal content largely ineffective.
Meta's appeal mechanisms for content moderation decisions were also cited as non-compliant. The preliminary findings suggest that these systems do not allow EU residents to fully explain their grievances or submit supporting evidence when appealing a content decision. This significantly curtails the effectiveness of the appeals process, making it difficult for users to contest platform rulings and seek redress, and undermining due process in the digital sphere.
The Digital Services Act: A New Regulatory Era
To understand the gravity of these findings, it is essential to grasp the foundational principles and ambitious scope of the Digital Services Act. Enacted by the European Union, the DSA is a landmark piece of legislation designed to create a safer and more accountable online environment. It came into full effect for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) in August 2023, applying to services with more than 45 million average monthly active users in the EU, a threshold that both Meta and TikTok comfortably exceed.
The DSA was born out of growing concerns over the unchecked power of digital platforms, which had increasingly become vectors for misinformation, hate speech, cyberbullying, and other forms of harmful content. Before the DSA, self-regulation by tech companies often proved insufficient, leading to calls for robust, legally binding oversight. The Act aims to achieve several key objectives:
- Protect Fundamental Rights: Safeguarding freedom of expression, privacy, and other fundamental rights online.
- Combat Illegal Content: Establishing clear rules for content moderation and removal.
- Enhance Transparency: Mandating platforms to be more open about their algorithms, content moderation practices, and data handling.
- Empower Users: Giving users more control over their online experience and clearer avenues for recourse.
- Mitigate Systemic Risks: Requiring VLOPs and VLOSEs to identify, analyze, and mitigate systemic risks arising from their services, such as the spread of disinformation or negative impacts on minors.
The requirement for researcher data access is a cornerstone of the DSA’s transparency mandate. Regulators believe that independent academic scrutiny is indispensable for understanding the real-world impact of these platforms, especially concerning vulnerable populations and democratic processes. Without reliable data, policymakers and civil society organizations struggle to formulate effective interventions or hold platforms truly accountable.
Context of the Investigations
The preliminary findings against Meta and TikTok are the result of ongoing investigations initiated by the European Commission earlier in 2024. These probes signify the EU’s proactive enforcement strategy, demonstrating a commitment to ensuring that the DSA’s provisions are not merely theoretical but practically applied and rigorously monitored.
The investigation into TikTok commenced with a broad focus on several critical areas, including advertising transparency, researcher data access, content moderation policies, and measures for the protection of minors. Concerns had been mounting regarding the platform’s potential impact on young users, the virality of harmful trends, and the opaque nature of its recommendation algorithms.
Similarly, the probe into Meta was launched amid suspicions that Facebook and Instagram were failing to uphold their obligations as large platforms concerning election integrity. This concern became particularly acute in the context of upcoming European elections, highlighting the DSA’s role in safeguarding democratic processes from online manipulation and disinformation campaigns. The current findings against Meta regarding reporting mechanisms and appeals processes contribute to this broader picture of alleged non-compliance in critical areas.
Industry Responses and Regulatory Tensions
Both Meta and TikTok have responded to the European Commission’s preliminary findings, largely disputing the allegations while acknowledging their commitment to compliance.
TikTok, through a spokesperson, stated that it has made "substantial investments" in data sharing and has reportedly granted data access to nearly 1,000 research teams through its dedicated tools. However, the company also articulated a perceived tension between the DSA’s demands for data access and the stringent data protection requirements of the EU’s General Data Protection Regulation (GDPR). "If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled," the spokesperson noted. This highlights a complex legal and technical challenge for platforms, as they navigate competing regulatory priorities: the public interest in transparent research versus the individual’s right to privacy. Reconciling these two powerful legislative frameworks will likely be a significant point of contention and discussion moving forward.
Meta likewise expressed disagreement with any suggestion that it has breached the DSA. A Meta spokesperson affirmed that the company has introduced changes to its content reporting options, appeals process, and data access tools since the DSA came into force in the European Union. "We are confident that these solutions match what is required under the law in the EU," the spokesperson stated, pointing to ongoing negotiations with the European Commission on these matters. Meta's assertion suggests a belief that its current measures are sufficient, despite the EC's preliminary assessment.
These responses underscore the intricate dance between powerful tech corporations and increasingly assertive regulators. Companies argue they are making efforts, while regulators insist on concrete, verifiable compliance that genuinely achieves the legislative intent.
Broader Implications: Market, Society, and Future of Digital Governance
The European Commission’s actions against Meta and TikTok carry significant implications across market, social, and cultural spheres, potentially shaping the future of digital governance globally.
Market Impact: For the tech giants, the financial stakes are enormous. Penalties for confirmed breaches of the DSA can reach up to 6% of a company's global annual revenue; given the scale of Meta's and TikTok's businesses, such fines could amount to billions of dollars, serving as a powerful deterrent. Beyond direct fines, compliance costs are substantial: building new tools, hiring more staff for moderation and data access, and redesigning interfaces. These regulatory pressures could also reshape market dynamics, potentially favoring smaller, more compliant platforms or forcing larger players to rethink their operational models not just in Europe but globally, since they often adopt uniform standards. The EU's proactive enforcement also sets a precedent that other jurisdictions, such as the UK with its Online Safety Act or US states exploring similar legislation, might follow.
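For a concrete sense of that 6% ceiling, the short sketch below computes the maximum theoretical DSA fine. The revenue figure is a hypothetical placeholder chosen for illustration, not a reported number for Meta or TikTok.

```python
# Illustrative only: the DSA caps fines at 6% of global annual revenue.
# The revenue used below is a hypothetical placeholder, not a reported figure.

DSA_FINE_CAP = 0.06  # 6% ceiling for confirmed DSA breaches

def max_dsa_fine(global_annual_revenue: float) -> float:
    """Return the maximum possible DSA fine for a given global annual revenue."""
    return global_annual_revenue * DSA_FINE_CAP

if __name__ == "__main__":
    hypothetical_revenue = 100e9  # assume $100 billion in global annual revenue
    fine = max_dsa_fine(hypothetical_revenue)
    print(f"Maximum possible fine: ${fine / 1e9:.1f} billion")  # -> $6.0 billion
```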
Social Impact: The allegations directly address core societal concerns. Inadequate data access for researchers means less independent scrutiny of how platforms influence public discourse, mental health, and democratic processes. If researchers cannot study how algorithms promote certain content or how "dark patterns" manipulate user behavior, the public remains largely in the dark about the true societal costs of digital platforms. Effective content reporting and appeal mechanisms are crucial for protecting users from illegal content—ranging from hate speech and child sexual abuse material to consumer fraud—and for ensuring that legitimate speech is not unfairly censored. The DSA’s enforcement is thus vital for fostering a healthier, more trustworthy online environment.
Cultural Impact: The push against "dark patterns" reflects a broader cultural shift towards valuing user autonomy and transparent digital interactions. For too long, platform design has often prioritized engagement and advertising revenue over user well-being and clear communication. The DSA seeks to rebalance this, pushing platforms to respect user agency. The debate over content moderation and appeals mechanisms also touches upon fundamental questions of free speech, censorship, and who ultimately decides what is acceptable online. The EU’s approach aims to establish a framework where platforms are accountable stewards of digital spaces, rather than unchecked arbiters of content.
Looking Ahead: Potential Penalties and the Path to Compliance
These findings are preliminary, meaning the process is still ongoing. Both Meta and TikTok will have the opportunity to review the European Commission’s investigation documents, formally challenge the findings, and propose concrete commitments to address the identified shortcomings. This phase allows for further dialogue and potential adjustments before any final decisions or penalties are imposed.
However, the EC’s clear communication signals a firm resolve to enforce the DSA rigorously. The message is unequivocal: compliance is not optional. The outcome of these investigations will not only determine the immediate financial and operational impact on Meta and TikTok but will also serve as a critical test case for the DSA’s effectiveness and the EU’s broader vision for a more regulated and responsible digital future. It underscores that the era of largely unfettered operation for tech giants is drawing to a close, at least within the European Union’s jurisdiction.