Dastra Insights: What happened in Privacy & AI in November?
Leïla Sayssa
2 December 2025 · 13 minutes read time

Tired of general newsletters that skim over your real concerns? Dastra Insights offers legal and regulatory monitoring designed specifically for DPOs, lawyers, and privacy professionals.

Each month, we go beyond a simple recap: we select around ten decisions, news items, or positions that have a concrete impact on your work and your organization.

🎯 Targeted, useful monitoring grounded in the real-world realities of data protection and AI.

Here is our selection for November 2025:

Omnibus Project: the Commission paves the way for major simplification of the European Digital Framework

On November 19, 2025, the European Commission unveiled its initiative known as the "Digital Omnibus," a comprehensive set of measures aimed at simplifying and modernizing the regulatory framework governing digital activities in Europe, from the GDPR to the AI Act, including the ePrivacy Directive and cybersecurity rules.

The goal: to enhance competitiveness, reduce administrative complexity, and provide the clarifications awaited by professionals.

Among the proposed adjustments:

  • Postponement of obligations under the AI Act for high-risk systems (now scheduled for December 2027);

  • Targeted modifications of the GDPR;

  • Simplification of cookie consent;

  • Creation of a one-stop shop to centralize notifications of cybersecurity incidents.

This set of measures still needs to be reviewed by the European Parliament and the Council, but it already marks a turning point in the evolution of European digital law.

For more information, click here.

DMA and GDPR: Publication of Joint Guidelines by the Commission and EDPB

The EDPB and the European Commission have published their first joint guidelines to clarify the interplay between the Digital Markets Act (DMA) and the GDPR.

Objective: to simplify compliance, enhance consistency of application, and provide greater legal certainty to gatekeepers, business users, beneficiaries, and citizens.

The text recalls that the GDPR and the DMA pursue complementary objectives:

  • the GDPR protects the rights and freedoms of individuals,

  • the DMA aims for fairness and contestability in digital markets.

The guidelines specifically explain:

  • how gatekeepers must apply the requirements for specific choice and valid consent (Art. 5(2) DMA) when they wish to combine or reuse personal data;

  • how to implement, in compliance with the GDPR, the obligations related to data portability, access requests, distribution of third-party applications, and the interoperability of messaging services.

📅 A public consultation is open until December 4, 2025. Contributions will be published, and the final text will be jointly adopted by the EDPB and the Commission.

Access the draft guidelines here.


EDPB: Stakeholder Consultation on Anonymization and Pseudonymization

The EDPB is organizing a multi-stakeholder event dedicated to anonymization and pseudonymization methods, as several recent decisions (SRB v EDPS, Meta) have called some common practices into question.

The event aims to:

  • clarify the expectations of authorities,

  • identify genuinely reliable techniques,

  • prepare the update of the EDPB guidelines.

Authorities recognize the current tension between innovation, AI, and GDPR compliance, and seek to harmonize national approaches.

Access the EDPB press release here.

Brazil Adequacy Decision: EDPB Adopts Its Opinion

On November 4, 2025, the EDPB unanimously adopted its opinion on the draft adequacy decision presented by the European Commission for Brazil, in accordance with Article 45 of the GDPR.

If adopted, this decision will allow European organizations to transfer personal data to Brazil without additional guarantees, as is the case with "adequate" countries.

The EDPB highlights:

  • the strong convergence between the Brazilian LGPD and the GDPR,

  • the consistency with the jurisprudence of the CJEU,

  • and the overall effectiveness of the guarantees provided under Brazilian law.

However, the Board invites the Commission to clarify and monitor certain points: the obligation to conduct Data Protection Impact Assessments (DPIAs); possible limits on transparency related to commercial and industrial secrecy; and the rules governing onward transfers.

The EDPB also notes that Brazilian law does not fully apply to processing carried out by public authorities for national security or criminal prosecution purposes, but welcomes the fact that it partially applies to criminal investigations and law enforcement, in line with Brazilian case law.

The draft is now to be examined by the Committee of Member States before final adoption.

For more information, click here.

United Kingdom: ICO Consults on Its New Investigation and Enforcement Approach

The ICO has opened a public consultation on its future guidance governing the procedures it applies when it suspects a violation of the UK GDPR or the Data Protection Act 2018.

The text also specifies how the ICO will use its new powers granted by the Data (Use and Access) Act 2025, allowing it to demand responses and reports from organizations.

The draft notably addresses:

  • Criteria for initiating an investigation;

  • The course of an investigation;

  • Possible decisions (warning, reprimand, fine, injunction);

  • Conditions for reaching a settlement with reduced penalties.

📅 Consultation open until January 23, 2026, accessible via this link.

CJEU: The Inteligo Media Ruling Clarifies the Legal Basis for Email Marketing

In its Inteligo Media SA ruling of November 13, 2025, the CJEU provides a significant clarification regarding direct marketing via email.

Until now, organizations systematically associated these practices with a GDPR legal basis (Article 6), generally either consent or legitimate interest.

The Court now rules that Article 13(2) of the ePrivacy Directive (2002/58), transposed in France as Article L.34-5 of the CPCE, constitutes in itself a sufficient legal basis for regulating electronic marketing, without the need to refer to Article 6 of the GDPR.

In practice:

  • Consent remains valid and usable;

  • However, data controllers will need to explicitly explain that the legal basis derives from Article 13(2) ePrivacy (and its national transposition);

  • An update of registers, notices, and marketing policies may be required to reflect this clarification.

A decision likely to lead to practical adjustments in marketing compliance efforts.

Access the ruling here.


Capita: Data Breach Affecting 6.6 Million People

The ICO has imposed a fine of £14 million on Capita, following a cyberattack in 2023 that resulted in the theft of data from 6.6 million people (pension records, HR data, customer information, and sometimes sensitive or financial data).

The investigation identified several serious shortcomings:

  • Lack of sufficient technical and organizational measures, leaving systems vulnerable.

  • Failure to manage security alerts: a critical report was ignored for 58 hours, allowing the attacker to gain administrator permissions and exfiltrate nearly one terabyte of data.

  • Lack of penetration testing: some systems had only been tested upon commissioning.

  • Failure to prevent lateral movement within the network, despite several prior internal alerts.

The ICO recalls that cybersecurity is an essential element of digital trust and that no organization, however large, is exempt from its obligations.

The authority had initially considered a fine of £45 million, reduced in light of the corrective measures taken, Capita's cooperation, and a voluntary settlement. Capita acknowledged its responsibility and accepted the sanction.

Click here for more information.


Condé Nast Fined €750,000 by the CNIL over Cookies

The CNIL has fined Condé Nast Publications (publisher of Vanity Fair) €750,000 for several ongoing violations of Article 82 of the French Data Protection Act regarding cookies and trackers.

After an initial complaint filed in 2019 by the association NOYB, a formal notice in 2021, and the closure of that procedure in 2022, the CNIL carried out new inspections in 2023 and 2025 showing that the website vanityfair.fr was still non-compliant.

The main violations identified:

  • Placement of cookies subject to consent before any choice: cookies were installed upon arrival on the site, without prior consent.

  • Insufficient information: some cookies were presented as "strictly necessary" without clear explanation of their actual purposes.

  • No effective way to refuse or withdraw consent: even after clicking "Reject All" or withdrawing consent, cookies subject to consent continued to be placed or read.

The sanction takes into account: the existence of a previous formal notice, the number of affected users, and the repetition of the violations despite exchanges with the CNIL.

A strong reminder to publishers: cookie compliance requires effective implementation, continuous monitoring, and genuinely functional refusal mechanisms.

For more information, click here.
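As an illustration of what "effective implementation" can mean in practice, here is a minimal, hypothetical sketch (not taken from the CNIL decision) of how a site might gate non-essential cookies behind a recorded consent choice. The storage key, cookie names, and script URL are assumptions made for the example.

```typescript
// Minimal sketch: non-essential cookies are only set after an explicit,
// recorded consent choice, and are cleared when consent is withdrawn.
// All identifiers below (storage key, cookie names, URL) are hypothetical.

type ConsentChoice = "accepted" | "refused";

function readConsent(): ConsentChoice | null {
  const value = localStorage.getItem("cookie_consent"); // hypothetical key
  return value === "accepted" || value === "refused" ? value : null;
}

function loadAnalytics(): void {
  // Inject the measurement/advertising script only once consent is given.
  const script = document.createElement("script");
  script.src = "https://example.com/analytics.js"; // placeholder URL
  document.head.appendChild(script);
}

function clearNonEssentialCookies(): void {
  // Expire known non-essential cookies so "Reject All" actually takes effect.
  for (const name of ["_tracking_id", "_ads_session"]) { // hypothetical names
    document.cookie = `${name}=; Max-Age=0; path=/`;
  }
}

export function applyConsent(choice: ConsentChoice): void {
  localStorage.setItem("cookie_consent", choice);
  if (choice === "accepted") {
    loadAnalytics();
  } else {
    clearNonEssentialCookies();
  }
}

// On page load: do nothing non-essential until the user has made a choice.
if (readConsent() === "accepted") {
  loadAnalytics();
}
```

Real deployments generally rely on a consent management platform rather than hand-rolled code, but the principle enforced by the CNIL is the same: no non-essential cookies before consent, and refusal or withdrawal must be honored in practice.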

CNIL: Launch of the 2025 National Survey "DPO and AI"

The CNIL is launching a large-scale survey aimed at better understanding the role of DPOs in the context of AI usage in French organizations.

The rise of AI is disrupting data processing practices. In this context, the CNIL seeks to answer the following questions:

  • What role does the DPO play in this new reality? How far does their scope of intervention extend?
  • What challenges do they face daily?
  • What tools and training do they need?

To answer these questions, the CNIL is opening a national survey targeting DPOs, whether they operate in the public or private sector.

Its position is as follows: while the DPO must be involved in any use of personal data, they are not necessarily the lead on compliance with the AI Act, which calls for other, specific competencies.

However, the AI Act presents a major opportunity: its principles (risk-based approach, accountability, transparency, and protection of fundamental rights) build upon those of the GDPR. DPOs thus have a strong foundation to guide their organizations toward compliant and responsible AI.

🔗 DPOs, your opinion matters: participate in the survey before December 15, 2025 here


Hungary: Adoption of a National AI Law

Hungary has become one of the first Member States to adopt a national law on artificial intelligence, complementing the AI Act.

The law introduces:

  • sector-specific obligations,

  • strengthened transparency requirements,

  • new administrative penalties.

The law illustrates a growing trend of national rules complementing the AI Act across Europe.

Access the law here.

AI and the CJEU: A New Preliminary Reference

A new case (C-245/25) has been referred to the CJEU concerning the use, by an expert witness, of AI-assisted accident simulation software. The central question: does this type of tool fall under the high-risk AI systems defined by the AI Act?

The case arises from a road accident. The designated expert used the Virtual Crash 4.0 software, designed for the American market. One party contests this, arguing that the tool was unsuitable and that an expert report must detail all the calculations performed, which the expert could not provide, merely claiming to have "verified" the results.

The Bulgarian court therefore decided to refer the matter to the CJEU, invoking in particular:

  • the principle of explainability,

  • the requirement for understanding decision-making processes,

  • the potential impact on the right to a fair trial (Art. 19 TEU).

The questions posed concern:

  • the qualification of the software as high-risk AI (Annex III, point 8, AI Act),

  • the scope of Article 86 AI Act regarding the right to an explanation,

  • the possibility for a court to rely on a report based solely on an algorithmic result simply "validated" by the expert.

It remains to be seen whether the CJEU will address all these questions… More details here!

The European Commission Launches a Code of Practice on Marking AI-Generated Content

The European Commission has announced the creation of a working group tasked with developing a code of practice for the marking and labeling of AI-generated content.
Objective: to ensure clear identification of synthetic content, including deepfakes, and to enhance the transparency mandated by the AI Act.

The code will cover:

  • marking formats (watermarks, metadata, visual warnings),

  • obligations for generative AI providers,

  • technical measures ensuring the reliability of marking.

This future code, to be developed over a period of seven months by independent experts under the coordination of the EU AI Office, will be a voluntary tool for providers and users of generative AI systems.

A first version is expected in early 2026. The Commission's press release is accessible here.


Data Act: Draft Model Contract Clauses

The European Commission has published a recommendation on new non-binding model contractual clauses intended to facilitate the implementation of the Data Act, particularly for SMEs.

These models, freely usable and adaptable, aim to assist organizations in structuring their data-sharing and cloud service contracts.

Model Contractual Terms (MCT) – Mandatory Data Sharing

Three sets of MCT have been developed to cover relationships where data sharing is mandatory (chapters II and III of the Data Act):

  1. MCT Holder ↔ User: access, use, and sharing of data generated by a connected product or associated service.

  2. MCT User ↔ Recipient: conditions of use of data by the recipient chosen by the user.

  3. MCT Holder ↔ Recipient: terms of sharing and possible financial compensation.

A fourth model addresses voluntary data sharing between businesses, in compliance with the Chapter IV rules on unfair contractual terms.

Standard Contractual Clauses (SCC) – Cloud Services (Chapter VI)

Three sets of SCC implement the obligations related to switching between providers:

  • Switching & Exit: conditions for changing service provider.

  • Termination: applicable provisions at the end of the contract.

  • Security & Business Continuity: obligations in case of incidents and guarantees during migration.

The next step is to translate and publish the MCT and SCC in all EU languages, which should take three to four months.

Access the Commission's press release here.

