Are you tired of generalist newsletters that skim over your real challenges? Dastra offers you Dastra Insights, a legal and regulatory watch specifically designed for DPOs, lawyers and Privacy and AI professionals.
🎯 A targeted, useful watch grounded in the day-to-day realities of data protection and AI.
Here is our selection for March 2026:
Scraping of professional data: closure of the CNIL procedure against Kaspr
The CNIL announced the closure of the procedure initiated against Kaspr, a company specialising in contact data extraction.
As a reminder, Kaspr offered a tool for retrieving professional information (emails, phone numbers) from LinkedIn profiles, often without the knowledge of the individuals concerned.
In 2023, the CNIL had issued a sanction against the company, notably for:
- collection of data without a valid legal basis,
- failure to inform individuals,
- inadequate retention periods.
It also emphasised that the individuals concerned could not reasonably have expected such reuse of their LinkedIn data, which further undermined any reliance on legitimate interest.
This decision confirms the CNIL's position on scraping and data enrichment tools: the fact that data is accessible online is not sufficient to justify its reuse.
The closure of the procedure means that Kaspr has implemented the corrective measures expected by the supervisory authority.
Digital health services: a financial sanctions regime clarified by the decree of 3 March 2026
Decree No. 2026-153 of 3 March 2026 supplements the framework applicable to digital health services by introducing a specific financial administrative sanctions regime.
This new regime does not replace the sanctions provided for under the GDPR, but adds to them. Actors may therefore be exposed to sanctions issued by the CNIL under the GDPR, as well as specific administrative sanctions under health law.
A strengthened administrative sanctioning power
The decree specifies the conditions under which the competent administrative authority may impose financial sanctions on actors failing to comply with the obligations applicable to digital health services.
Those concerned include in particular publishers of digital health services, as well as all actors involved in the processing of health data or in the provision of such services.
Clearly identified breaches
The decree frames the types of breaches liable to give rise to sanctions, in particular:
- failure to comply with data security and confidentiality requirements,
- use of solutions that do not comply with health data hosting (HDS) requirements,
- failure to comply with the reference frameworks applicable to digital health services.
This clarification gives controls a firmer legal footing while reducing the room for interpretation left to actors.
This decree calls for increased vigilance regarding the compliance of the solutions used, particularly with regard to HDS hosting, as well as alignment with applicable sectoral reference frameworks.
It provides that administrative sanctions may reach €300,000, an amount that may be raised to €1 million in the event of a repeat offence.
Targeted advertising: the Conseil d'État confirms the €40M sanction against Criteo
By a decision of 4 March 2026, the Conseil d'État rejected Criteo's appeal against the sanction issued by the CNIL in 2023 in the area of personalised advertising. This decision definitively validates the €40 million fine imposed on the French adtech company.
As a reminder, the CNIL had identified five GDPR breaches:
- the inability of Criteo to demonstrate valid consent,
- shortcomings in informing individuals,
- failure to comply with the right of access,
- breaches related to the withdrawal of consent and erasure,
- insufficiency of the agreement governing joint controllership with its partners.
The Conseil d'État's decision goes beyond a simple rejection:
It first confirms that an actor such as Criteo may be qualified as a joint controller for the placement of cookies on partner sites, and then as a data controller for the subsequent use of the collected data for targeted advertising purposes. It also recalls that pseudonymised data remains personal data as long as individuals remain identifiable without disproportionate effort.
Crucially, an actor cannot hide behind its partners to escape its obligation to prove the validity of consent.
The Conseil d'État recalls that in the presence of multi-actor processing, the contractual distribution of roles is not sufficient: this distribution must be complete, consistent with operational reality, and must effectively enable compliance with GDPR obligations.
Cybersecurity: joint opinion of the EDPB and EDPS on the revision of the Cybersecurity Act
The European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS) published a joint opinion on the proposals to revise the Cybersecurity Act.
This opinion forms part of the strengthening of the European cybersecurity certification framework, in particular for digital services and critical infrastructures.
A strengthened articulation between cybersecurity and data protection
Both authorities stress the need to ensure coherence between cybersecurity objectives and GDPR requirements.
They recall that certification schemes must not be limited to technical considerations, but must fully integrate data protection principles, in particular minimisation, security of processing and privacy by design.
⚠️ Certification alone cannot constitute proof of GDPR compliance.
They call for a clarification of roles between the various authorities involved, in order to avoid overlapping competences, and in particular stress the need to involve data protection authorities in the definition and monitoring of certification schemes, especially when these involve the processing of personal data.
A focus on transfers and systemic risks
The opinion highlights the issues related to data transfers, in particular in the context of critical digital services. The authorities recommend integrating these risks into certification schemes, notably by taking into account potential access by entities located outside the European Union.
More broadly, they call for a risk-based approach, enabling the identification of potential impacts on the rights and freedoms of individuals. Concretely, the opinion stresses that certification cannot be limited to purely technical criteria (robustness, availability, resilience), but must also take into account the nature of the data processed, the purposes pursued and the contexts of use of the systems.
This notably implies identifying, upstream, the risks for the individuals concerned.
Luxembourg: annulment of the €746 million fine against Amazon by the administrative court
By a ruling handed down in March 2026, the Luxembourg Administrative Court confirmed the annulment of the €746 million fine issued in 2021 against Amazon by the National Commission for Data Protection (CNPD), in the context of data processing for targeted advertising purposes.
This sanction, the highest ever imposed in Europe on the basis of the GDPR, was primarily based on breaches relating to the legal basis of processing and transparency obligations, in connection with the use of data for advertising personalisation purposes.
A strict assessment of the burden of proof
The dispute concerned the legal qualification of the processing operations carried out by Amazon in the context of behavioural advertising. The CNPD had considered that these processing operations did not rest on a valid legal basis, in particular in the absence of consent compliant with GDPR requirements.
However, under the case law of the Court of Justice of the European Union (CJEU) established in two judgments of 5 December 2023 ("DEUTSCHE WOHNEN", C-807/21, and "NACIONALINIS", C-683/21), it falls to the supervisory authority to establish the existence of a fault, which may consist either of a deliberate act or of negligence on the part of the data controller with respect to the identified GDPR violations.
The Administrative Court confirms the analysis of the Administrative Tribunal in finding that the violations alleged by the CNPD were not sufficiently established.
The ruling highlights the high standard imposed on the supervisory authority, which must demonstrate, in a precise and well-supported manner, that the alleged breaches are established, and must meet the requirements of Article 83 of the GDPR, which presuppose rigorous reasoning when imposing administrative sanctions.
In this case, the CNPD had not properly carried out the second stage of the analysis required by the principles established by the CJEU, namely the choice of the most appropriate measure from the wide range of those provided for under the applicable regulation.
The Court therefore called the CNPD's analysis into question, highlighting the shortcomings in its demonstration of the alleged violations of the applicable regulation.
Germany: the Government is preparing a text to address gaps in the repression of pornographic deepfakes
Germany announced the preparation of a draft law on digital violence, targeting in particular pornographic deepfakes. At this stage, no consolidated text has yet been published, but the federal government confirmed its intention to address the criminal law gaps concerning pornographic deepfakes and, more broadly, sexualised image-based abuse. The announced framework is also intended to facilitate the identification of perpetrators, access to information on account holders and, in some cases, enable the blocking of accounts by judicial order.
This announcement comes at a time when the European framework remains imperfect. The AI Act does provide for transparency obligations for artificially generated or manipulated content, including deepfakes, but it is not sufficient, on its own, to address the specific case of non-consensual sexual content. This is precisely why the European Parliament supported, in March 2026, an amendment to the text in order to prohibit so-called "nudification" systems, capable of producing sexually explicit or intimate images of a real, identifiable person without their consent.
This development is interesting on two levels.
- First, it highlights a persistent gap between technological development and the existing legal framework. While victims of digital violence and deepfakes are not without protection, criminal law still appears insufficiently adapted, in particular when it comes to addressing these new uses and effectively sanctioning their perpetrators.
- Second, it sheds light on broader gaps in the effectiveness of legal remedies. Beyond the criminal dimension, civil law mechanisms already exist, notably regarding infringement of personality rights, with the possibility of obtaining cessation of the harm and compensation. In practice, however, their implementation remains limited, notably due to the difficulties in identifying perpetrators.
The German draft law aims precisely to remove these obstacles, by facilitating access to the information needed to take action and, more broadly, the exercise of rights by the individuals concerned.
