Meta and TikTok DSA Case: When Compliance on Paper Isn’t Enough

Meta and TikTok may face penalties of up to 6% of their global annual turnover for breaching the EU’s Digital Services Act (DSA), but the real significance lies not in the amount, but in what triggered the penalties. In this instance, the regulator did not penalize legal non-compliance; it punished operational failure: controls that existed on paper but failed in practice.

This case applies specifically to Very Large Online Platforms (VLOPs) under the DSA, but the underlying failure affects every digital business at scale. What tripped Meta and TikTok was not defiance but a familiar mistake: they built systems that technically complied with the law, yet failed when regulators actually tested them. This is the real danger for fast-growing companies: fintechs, SaaS, healthtech, edtech, cloud infrastructure, and security providers that treat compliance as merely a matter of documentation rather than an operational reality. When regulators come knocking, they don’t want to see policies; they want to see results. They want proof that those policies actually work.

TL;DR
  • Compliance that looks good on paper will not protect you when regulators test it
  • Regulators now care about usability and operation, not just intent
  • Fintech, SaaS, healthtech, cloud, and security companies are all at risk of the same operational blind spots

What actually happened

The European Commission has preliminarily found Meta and TikTok to be in breach of the Digital Services Act. Here’s the important part: both companies had documentation, certifications, and governance frameworks in place. The problem was operational.

Meta’s failure: Regulators tested Meta’s systems like real users would. They tried to report content. They attempted to appeal decisions. They tracked how long everything took. What they discovered was a gap between promise and reality. Meta’s documentation claimed “simple and accessible” reporting and appeals mechanisms. But using the actual system was confusing and slow. Regulators identified design choices throughout the platform that made it harder, rather than easier, to lodge complaints. These aren’t just UX problems—they’re compliance failures.

Arturo Béjar, a Meta whistleblower, put it plainly in the Molly Rose Foundation report: “Meta consistently makes promises about Teen Accounts… However, through testing, we found that most of Instagram’s safety tools are either ineffective, unmaintained, quietly changed, or removed. This is not about bad content on the internet, it’s about careless product design.”

TikTok’s failure: Regulators required TikTok to grant researchers access to platform data to fulfill its DSA transparency obligations. TikTok provided a process, but it was too complicated, and the data were incomplete. On paper, TikTok had a transparency mechanism in place. In practice, researchers were unable to use it effectively.

“Allowing researchers access to platforms’ data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health,” the European Commission press statement said. TikTok’s response indicated that the transparency obligations of the DSA conflict with those of the GDPR, according to a report in The Independent. “If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled,” said Paolo Ganino, a spokesperson for TikTok.

Both companies had policies. Both had frameworks. Both claimed compliance. None of it mattered because the controls didn’t work operationally. Specifically:

  • They did not build streamlined, user-friendly procedures for reporting and appeals.
  • They allowed design patterns that restricted user rights.
  • They lacked clear ownership and governance of controls.
  • They failed to continuously test whether controls worked as documented.

These are not legal oversights. These are operational and governance gaps.

Why this matters beyond social platforms

It’s tempting to think “we are nothing like Meta or TikTok.” But the underlying risk is universal: the failure to translate regulation into real operational design.

For Fintech: You live under constant regulatory scrutiny. A confusing consent screen, a hard-to-find dispute button, or a slow fraud-reporting workflow will raise red flags with financial regulators. Fintech regulators in the US and UK are increasingly testing systems in the same way: by simulating real users, looking for gaps between documented procedures and actual operations. When compliance processes are buried inside product flows, they stop being protective and start being liabilities.

For Cloud and Infrastructure Providers: You handle sensitive data and infrastructure information that underpins the operations of other businesses. The NIS2 and DORA frameworks now mandate that you continuously demonstrate operational compliance. If audit trails are incomplete, if data lineage cannot be traced end-to-end, or if incident records are inconsistent, regulators will view these gaps as failures of transparency and accountability. The same regulatory logic that caught Meta and TikTok is beginning to apply to infrastructure providers.

For Healthtech: Regulators require patient portals to make it easy to download or delete personal information. If the process is complex or incomplete, that’s not just a UX issue—it’s a compliance failure. When regulatory obligations overlap (HIPAA, state privacy laws, DSA for EU operations), weak coordination between compliance and engineering often leads to gaps: inadequate audit trails, missing consent documentation, and incomplete data access.

For SaaS and B2B Tech: You promise audit-ready systems and strong privacy controls. Those promises fall apart when a privacy setting is buried in menus, a data export function is broken, or a data subject access request takes weeks to fulfill. These are operational failures, not policy failures. They signal to regulators that what you claim and what you deliver are misaligned.

For Edtech: You collect and process data from minors, which creates complex privacy and consent obligations. A confusing consent flow or vague privacy settings create the same usability and transparency risks that landed Meta and TikTok in trouble.

The shift in regulatory enforcement

This enforcement pattern is spreading. The US Federal Trade Commission, state attorneys general, and sector-specific regulators are increasingly testing controls the same way: Can you demonstrate that your systems work as documented? Certification alone is no longer sufficient.

What to do about it

Operationalization means that every policy has an owner, every control has a corresponding test, and every process generates evidence. It’s the difference between claiming “we review access logs monthly” and having a workflow that produces, reviews, and archives that evidence on schedule. Here’s how to close the gap:

  1. Map regulatory requirements to operational evidence. Don’t just document policies. Define what “simple,” “accessible,” “secure,” and “transparent” mean in your actual systems. How many clicks? How many minutes? What does a user actually see? Then test it.
  2. Involve product and engineering in compliance design from the start. Compliance cannot be a legal layer bolted onto a finished product. It has to be baked in. This means shared accountability between compliance, product, and engineering.
  3. Test controls like you test software. Bring in people unfamiliar with your system and ask them to complete a core compliance workflow, such as submitting a data subject access request, lodging a complaint, or updating a consent preference. Measure completion rates, time taken, and errors. Document everything. This is what regulators will do during an investigation.
  4. Track compliance through system behavior, not just audit trails. Manual compliance fails at scale. Build workflows that generate evidence automatically: confirmation of data deletion, timestamps on access requests, and logs of consent interactions.
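Step 4 can be sketched in code. The following is a minimal illustration, not a real evidence API: `EvidenceRecord`, `run_control_test`, and the "DSA-20.1" control ID are all hypothetical names invented for this example. The idea is that every control exercise, whether a simulated complaint or a real DSAR, produces a timestamped, archivable artifact automatically.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

# Hypothetical evidence record: every control run produces one of these, so
# "we review access logs monthly" becomes a timestamped artifact, not a claim.
@dataclass
class EvidenceRecord:
    control_id: str          # e.g. a hypothetical "DSA-20.1: accessible reporting"
    action: str              # what was exercised (complaint lodged, DSAR fulfilled)
    passed: bool             # did the control behave as documented?
    duration_seconds: float  # how long the workflow actually took
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def run_control_test(control_id, action, workflow, max_seconds):
    """Run a compliance workflow the way a user would, and archive the evidence."""
    start = datetime.now(timezone.utc)
    ok = workflow()  # in practice: drive the real UI or API, not a stub
    elapsed = (datetime.now(timezone.utc) - start).total_seconds()
    record = EvidenceRecord(control_id, action, ok and elapsed <= max_seconds, elapsed)
    # In practice this would be appended to an evidence store; here we serialize it.
    return json.loads(json.dumps(asdict(record)))

# Example: simulate lodging a complaint with a 120-second service-level target.
evidence = run_control_test(
    "DSA-20.1", "lodge_complaint", workflow=lambda: True, max_seconds=120,
)
```

A real harness would replace the stub `workflow` with browser automation or API calls against the production flow, which is exactly how regulators tested Meta's reporting and appeals mechanisms.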

The real takeaway: Paper trails are insufficient if on-the-ground controls aren’t rock solid

This shift is already spreading beyond social platforms. Fintech regulators in the US, healthcare inspectorates across North America and Europe, and infrastructure auditors overseeing NIS2 are all adopting the same methodology: they will test whether your controls actually work, not whether they exist on paper.

For compliance leaders, this creates a clear imperative. Organizations that treat compliance as a governance artifact rather than day-to-day due diligence will face increasing scrutiny. Organizations that embed compliance into system design and operational workflows will pass regulatory testing. Skirting compliance laws with grey product and UX decisions might prove to be a costly mistake. 

The question is not whether your organization will face regulatory scrutiny. The question is: when tested, will your controls demonstrate that they actually work?

FAQs

1. Isn’t this just a “VLOP problem”? Why should smaller companies care?
Not at all. The DSA may apply to very large online platforms (VLOPs) today, but the underlying failure, i.e., poor translation of legal obligations into working controls, is universal. Smaller SaaS, fintech, and healthtech firms face the same structural risk when they scale or enter regulated markets. The DSA simply makes visible what regulators everywhere are already demanding: real-world proof that systems work as described.

2. How does this connect to frameworks like SOC 2, ISO 27001, or GDPR?
Each of these frameworks expects evidence that controls operate consistently. Meta and TikTok’s issues, such as weak process design, unclear ownership, and broken data-access flows, are equivalent to failing operational testing under SOC 2 or an internal audit under ISO 27001. In GDPR terms, they breached principles of transparency and accountability. In practice, this is what happens when controls exist on paper, but are not reflected in daily system behavior.

3. What does “operationalizing compliance” actually mean?
It means that every policy has an owner, every control has a test, and every process produces evidence automatically. It is the difference between claiming “we review access logs monthly” and having a workflow that generates, reviews, and archives that evidence on schedule. Sprinto’s entire approach is built around this idea: converting static frameworks into live operational programs.

4. Could these companies have defended themselves if they had documented exceptions or trade-offs (like TikTok’s GDPR conflict)?
Yes, but only if they had a documented “regulatory balancing” framework explaining why certain obligations were limited, who approved it, and how risks were mitigated. TikTok’s argument that transparency conflicts with GDPR might have held more weight if clear governance records and data-sharing impact assessments had backed it. Documentation without reasoning is not a defense; structured justification is.

5. What should companies do differently before the next regulatory cycle hits?

Companies should: 

  1. Translate every external requirement (DSA, AI Act, DPDP, DORA, etc.) into internal control language
  2. Align product, engineering, and compliance functions around shared accountability for UX, data, and transparency
  3. Audit usability of compliance workflows, including reporting, appeals, and data requests, just as rigorously as you audit code or security
  4. Automate evidence collection wherever possible, as manual compliance fails under scale
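The first step, translating external requirements into internal control language, can be as simple as a structured registry. The sketch below assumes a basic in-house mapping; every control ID, owner, test, and requirement label here is a hypothetical illustration, not a real framework artifact.

```python
# A minimal sketch of "control language": each external requirement maps to an
# internal control with an owner, a test, and the evidence that test produces.
# All identifiers below are hypothetical examples, not real control IDs.
CONTROL_MAP = {
    "DSA Art. 20 (complaint handling)": {
        "control_id": "CTRL-014",
        "owner": "trust-and-safety",
        "test": "quarterly usability run: lodge a complaint end to end",
        "evidence": "completion-rate and time-to-complete export",
    },
    "GDPR Art. 15 (right of access)": {
        "control_id": "CTRL-021",
        "owner": "privacy-engineering",
        "test": "submit a DSAR and verify fulfillment within 30 days",
        "evidence": "timestamped DSAR ticket log",
    },
}

def gaps(control_map):
    """Flag requirements whose control is missing an owner, a test, or evidence."""
    required = ("owner", "test", "evidence")
    return [
        requirement
        for requirement, control in control_map.items()
        if any(not control.get(key) for key in required)
    ]

print(gaps(CONTROL_MAP))  # an empty list means every requirement is operationalized
```

The value of the structure is the `gaps` check: any requirement without a named owner, a concrete test, and a defined evidence artifact is exactly the kind of paper-only control this case punished.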

6. What is the real takeaway for compliance leaders from the Meta and TikTok DSA case of October 2025?
The Digital Services Act investigation into Meta and TikTok shows that regulatory enforcement has entered a new phase. It is no longer about whether companies have compliance frameworks, but whether those frameworks work in practice.

Meta failed because its user-reporting and appeals systems were difficult to use, violating the DSA’s requirement for “simple and accessible mechanisms.” TikTok fell short because its researcher data-access tools were too complex and produced incomplete results, breaching the DSA’s transparency obligations.

The real takeaway for compliance leaders is that proof of function now outweighs policy existence. Regulators across the EU, UK, and US are converging on the same expectation: that companies can demonstrate their controls operating continuously and producing reliable evidence. Documentation alone is no longer sufficient.

For teams responsible for GDPR, ISO 27001, SOC 2, HIPAA, or DORA compliance, this means aligning governance, UX design, and engineering operations so that every policy is traceable to a measurable, verifiable control. In other words, compliance must be embedded within your systems, not just in your handbooks and policies.

Raynah

Raynah is a content strategist at Sprinto, where she crafts stories that simplify compliance for modern businesses. Over the past two years, she’s worked across formats and functions to make security and compliance feel a little less complicated and a little more business-aligned.
