On 3 April 2026, the temporary derogation from the ePrivacy Directive — the legal basis that has for years allowed technology companies to voluntarily detect and report child sexual abuse material (CSAM) on their platforms — is set to expire. Negotiations between the European Parliament and the Council on prolonging the derogation broke down on 26 March 2026, when the Parliament voted against an extension after several attempts to reach agreement. With the EU's permanent regulatory framework for combating child sexual abuse online still mired in political deadlock, providers are left without legal cover for detection and reporting activities in interpersonal communication services that have been standard industry practice for years.
The ePrivacy derogation: legal certainty at risk
The derogation was always intended as a temporary bridge until the EU adopted a permanent framework. The practical consequences of its expiry will be significant. The experience of late 2020 is instructive: when comparable legal uncertainty arose over whether voluntary detection was permitted under EU law, reports of CSAM from EU-based accounts to the US National Center for Missing and Exploited Children (NCMEC) dropped by 58% in just 18 weeks. A repeat of this scenario would severely disrupt established reporting channels and undermine law enforcement's ability to identify and rescue victims. For providers, operating voluntary detection systems in interpersonal communication services without a clear legal basis turns severe compliance risks from theoretical into real.
The imminent expiry is all the more consequential because the EU has been building a comprehensive legal framework to tackle the growing threat of child sexual abuse online as part of its Child Safety Strategy. At its core is the conviction that online platforms, hosting providers, and interpersonal communication services have a critical role to play in creating a safer digital environment for children. To prevent the dissemination of CSAM and the solicitation of children — and to avoid a patchwork of diverging national laws — two legislative instruments have been advancing in parallel: a proposed Regulation on preventing and combating child sexual abuse (the CSA Regulation), which imposes new obligations on providers, and a recast of Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children (the CSA Directive), which updates and harmonises Member States' criminal law.
Both instruments have had a difficult path. The Commission's original 2022 proposal for the CSA Regulation drew intense criticism of its mandatory "chat control" provisions, which would have required providers to scan private communications for CSAM — a measure widely condemned as incompatible with end-to-end encryption and tantamount to mass surveillance. After years of deadlock, EU institutions appeared to be converging on a workable compromise. With the most contentious element — chat control — removed from the negotiating text, the prospects for agreement had improved significantly. The breakdown of negotiations on the ePrivacy derogation extension, however, has cast fresh doubt over the broader legislative trajectory and now threatens to leave providers in an immediate predicament.
The CSA Regulation: from “chat control” to risk-based obligations
The CSA Regulation has been in negotiation since May 2022, with progress stalling as Member States sought to balance child safety objectives against fundamental‑rights concerns. The breakthrough came when the Council presidency secured a common position at the end of 2025, removing the mandatory detection orders that had blocked consensus; the removal of obligatory chat control initially appeared to unlock momentum and bring the Member States closer to an agreement. The Council's position materially departs from the Commission's original text. The key elements include:
- No mandatory detection orders — for now. Mandatory scanning provisions are removed, with a review clause requiring the Commission to assess within three years “the necessity and feasibility of including detection obligations,” based on an evidence‑based evaluation of available technologies.
- Permanent voluntary scanning. The temporary ePrivacy Directive derogation allowing providers to voluntarily scan their services for CSAM would become permanent.
- Mandatory risk assessments and mitigation. Providers must analyse how their services could be misused to share CSAM or solicit children and implement proportionate mitigation measures, such as user reporting tools, content-sharing controls, and default privacy settings for children. National authorities could compel content removal or search delisting.
- Risk categorisation. Services will be classified as high-, medium-, or low-risk. High-risk providers can be required to contribute to the development of risk-mitigating technologies.
- EU Centre on Child Sexual Abuse. A new EU Centre will serve as a hub for expertise, reporting, and coordination between providers, Member States, and law enforcement.
- Victim support. Providers must assist victims seeking the removal of CSAM depicting them. The EU Centre will verify whether removal requests have been actioned.
- Reporting obligations. Mirroring US reporting practice to NCMEC, providers must promptly report information indicating possible CSA to the EU Centre, inform affected users, and offer straightforward, age‑appropriate reporting features.
The potential penalties are substantial. For infringements of substantive obligations — including risk assessment, mitigation, reporting, and removal duties — Member States, which are responsible for enforcement, can impose fines of up to 6% of a provider's annual worldwide turnover.
Beyond the penalties, providers face a further structural challenge: the proposed CSA Regulation and the DSA impose risk assessment and mitigation obligations that, in their current form, threaten to create overlapping compliance regimes — particularly for very large online platforms already subject to the DSA's systemic risk rules. How the two frameworks are to be aligned — including whether CSAM-specific obligations should supplement or sit alongside existing DSA requirements — remains an open question for EU lawmakers to resolve in the trilogue negotiations.
The recast CSA Directive: updating the criminal law framework
This forward momentum extends to the criminal law dimension. Running in parallel, the Commission proposed a recast of the CSA Directive in February 2024. While the CSA Regulation addresses provider obligations, the Directive focuses on harmonising Member States' substantive criminal law, setting minimum standards for criminal offences across the Member States.
An ex‑post evaluation found the 2011 Directive no longer fit for purpose, particularly given technological developments. The revised rules would:
- Expand offence definitions to cover the livestreaming of child sexual abuse, the possession and exchange of “paedophile manuals” and, now explicitly, the use of AI systems to produce CSAM. On AI-generated material, the Directive aims to proactively address the misuse of AI to create synthetically produced, lifelike CSAM (“deepfakes”), ensuring the definition of CSAM covers technological developments in a future-proof way. On paedophile manuals, the proposal targets material that provides guidance on how to find, groom, and abuse children; how to avoid identification, investigation, and prosecution; and how to conceal abuse material.
- Increase penalties and introduce more specific requirements for prevention and victim support.
- Extend limitation periods: The European Parliament’s LIBE Committee has proposed removing limitation periods entirely for offences covered by the Directive and allowing victims to claim compensation indefinitely.
Outlook and key considerations for providers
The imminent expiry of the ePrivacy derogation is the most pressing concern for providers — as a joint industry statement put it, allowing the derogation to lapse “is irresponsible” and would leave “children across Europe and around the world with fewer protections than they had before.” Companies that rely on the derogation as the legal basis for voluntary CSAM detection in interpersonal communication services must prepare for the possibility that, from 3 April, they will operate under significant legal uncertainty as to whether their detection activities remain compatible with EU law. Some may curtail or suspend detection activities to avoid potential conflict with the ePrivacy Directive — a development that would directly reduce the volume of abuse material reported to authorities and the capacity to identify victims. That outcome cannot be in anyone's interest.
The political dynamics around the CSA Regulation remain in flux. While chat control is off the table and trilogue negotiations continue — with further sessions scheduled for May and June 2026 — the breakdown of talks on the derogation has injected renewed uncertainty into the broader legislative process. The European Parliament is expected to vote on its negotiating mandate for the recast CSA Directive in plenary in June 2026.
Regardless of the remaining uncertainty around the timeline, providers would be well advised to start preparing now. Mandatory risk assessments and reporting obligations to the new EU Centre on Child Sexual Abuse are very likely to form part of the final framework, making it sensible for providers to begin conducting and documenting comprehensive risk assessments across their services — with particular attention to AI-powered features. Where providers have already established risk assessment processes under the DSA or other regulatory frameworks, these can serve as a valuable foundation to build on. Ensuring that AI systems cannot be used to produce or distribute CSAM is becoming an increasingly important element of responsible compliance planning under the emerging child safety framework. At the same time, EU lawmakers must pick up the pace. Continued legal uncertainty on the protection of children against sexual abuse serves no one — not providers, not law enforcement, and least of all the children these instruments are designed to protect.
