Digital Services Act Enters Enforcement Phase as New Compliance Pressures Emerge

Two years on, the Digital Services Act is reshaping platform accountability, with its first major fine, new enforcement actions, and evolving guidance highlighting stricter obligations on content moderation, transparency, and user protection.

The Digital Services Act (the “DSA”) regulates online intermediaries and platforms, including social networks, marketplaces and content-sharing platforms (“Online Platforms”), and became fully applicable on 17 February 2024. The main objective of the DSA is to ensure a safer digital space by improving platform accountability, addressing illegal content and harmful activities online and stopping the spread of misinformation, thereby creating a fairer and safer online environment.

The DSA introduces a tiered regime, with enhanced obligations applying to “Very Large Online Platforms” and “Very Large Online Search Engines” (VLOPs / VLOSEs), including risk assessments, independent audits and additional transparency requirements.

The DSA has now been fully applicable for over two years, and during that period the following developments have occurred:

“Trusted Flagger”

On 2 April 2025, the Central Bank of Ireland (the “CBI”) was designated as Ireland’s first “trusted flagger” by Coimisiún na Meán. This designation enables the CBI to detect, identify and notify Online Platforms of illegal content such as online financial scams, financial fraud and illegal financial services.

Online Platforms must prioritise and process notices submitted by the CBI and remove the illegal content concerned without undue delay, increasing the speed at which fraudulent advertisements are taken down and better protecting users. Article 22 of the DSA provides that an entity may be designated as a trusted flagger if:

  • It has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content.
  • It is independent from any provider of Online Platforms.
  • It carries out its activities for the purposes of submitting notices diligently, accurately and objectively.

The CBI has been appointed for a three-year term.

Draft Guidelines

On 12 September 2025, the European Data Protection Board (the “EDPB”) published its draft Guidelines on the interplay between the DSA and the GDPR (the “Draft Guidelines”). The Draft Guidelines, available here, seek to clarify how Online Platforms should interpret and apply the GDPR when processing personal data in the context of their DSA obligations. They confirm that the DSA does not create an additional lawful basis for processing personal data: any processing must rely on a legal basis under Article 6 of the GDPR. They also clarify that the DSA does not override the GDPR; the two regimes apply in tandem. The Draft Guidelines further emphasise core GDPR principles such as data minimisation and purpose limitation, particularly in the context of recommender systems and profiling.

The consultation phase concluded in October 2025, and a final version of the Draft Guidelines is expected in due course. As of the date of writing, the EDPB has not yet confirmed a publication date.

The First Fine

On 5 December 2025, X (formerly Twitter) was fined €120 million for breaching transparency requirements, including through the deceptive design of the “blue tick” for verified users, the dissemination of illegal content, a lack of transparency relating to advertising and a failure to provide researchers with the required access to data. You can read more about the European Commission’s (the “Commission”) investigation into X and our commentary here.

Reversal of Almost 50 Million Decisions Affecting Users’ Content

Article 20 of the DSA empowers users to challenge Online Platforms’ decisions to remove or restrict their content or to suspend or delete their accounts. Since the introduction of the DSA over two years ago, Online Platforms have reversed almost 50 million such decisions. The DSA obliges Online Platforms to inform users of the content moderation decisions they take and to explain the reasons behind those decisions. The most commonly reported violations include breaches of providers’ terms and conditions, unsafe or non-compliant products and scams / fraud (further detail is available here).

Transparency Reporting

Article 15 of the DSA requires Online Platforms to publish annual reports (and, in the case of VLOPs, to report twice annually) explaining how they moderate online content. The Commission has provided detailed templates that Online Platforms must use for these reports and has aligned all reporting periods with the calendar year.

Plans for 2026

The Commission has moved on from the implementation and designation phases of the DSA and is now focusing on enforcement through investigations. It has launched investigations into the illegal content reporting tools of TikTok and LinkedIn, X’s artificial intelligence tool Grok, and Shein’s addictive design (see our opinion here), lack of transparency and sale of illegal products, and we expect further investigations to be launched. Given that fines under the DSA can reach up to 6% of global annual turnover, and considering the scale of the fine imposed on X, we expect further significant penalties to follow.

The Commission will continue to focus on the protection of minors online. The DSA prohibits the use of profiling to display targeted advertisements to minors. The Commission recently supported the launch of an open-source, privacy-preserving age verification app, which allows users to verify their age without sharing unnecessary personal details. We expect further measures and tools to be rolled out to better protect children online and create a safer online environment.


For more information, you can contact us at +353 1 662 4747 or by email at law@hayes-solicitors.ie.
