
DastraNews: what happened in Privacy & AI in September?

Leïla Sayssa
6 October 2025 · 13-minute read

Tired of general newsletters that skim over your real concerns? DastraNews offers legal and regulatory monitoring designed specifically for DPOs, lawyers, and privacy professionals.

Each month, we go beyond a simple recap: we select around ten decisions, news items, or regulatory positions that have a concrete impact on your work and your organization.

🎯 Targeted, useful monitoring grounded in the day-to-day realities of data protection and AI.

Here is our selection for September 2025


EU consultation on AI Transparency Guidelines

The European Commission launched a public consultation (deadline: 9 October 2025) to develop guidelines and a code of practice for transparent AI systems, drawing from the transparency provisions in the AI Act.

Under the AI Act, deployers and providers of generative AI, emotion recognition, biometric categorization, and manipulated content systems must disclose to users when they interact with AI, or when content is AI-generated or manipulated.

The consultation invites AI developers, public bodies, research groups, civil society, and citizens to contribute views. The transparency obligations are set to apply starting 2 August 2026.

Implications for practitioners:

  • Organizations must prepare to embed transparency disclosures (AI labels, metadata, explanations) in their systems; a minimal sketch follows this list.

  • Participating in the consultation is a chance to influence how the obligations will be defined and enforced.

  • The upcoming code of practice may become a de facto standard or a benchmark in audits, litigation, and enforcement.
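
By way of illustration, here is a minimal Python sketch of what a machine-readable transparency disclosure could look like for a deployer of generative AI. The field names are purely illustrative assumptions: the AI Act prescribes the duty to disclose, not a schema, and the upcoming code of practice may define its own.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical disclosure record attached to each piece of generated content.
@dataclass
class AIDisclosure:
    ai_generated: bool            # user-facing "this is AI-generated" flag
    model_id: str                 # which system produced the content
    generated_at: str             # timestamp, useful for audit trails
    human_reviewed: bool = False  # whether a human edited or approved the output

def label_output(content: str, model_id: str) -> dict:
    """Wrap generated content with a machine-readable transparency disclosure."""
    disclosure = AIDisclosure(
        ai_generated=True,
        model_id=model_id,
        generated_at=datetime.now(timezone.utc).isoformat(),
    )
    return {"content": content, "disclosure": disclosure.__dict__}

print(label_output("Draft reply to the customer...", model_id="example-llm-v2"))
```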


EU–U.S. data flows: DPF validated by the General Court

On 3 September 2025, the General Court of the European Union dismissed the annulment action brought by French MP Philippe Latombe against the EU–U.S. Data Privacy Framework (DPF).
Key findings of the Court:

  • The Court chose to rule on the merits, bypassing debates about Latombe’s standing.

  • The decision validates that the DPF provides a level of protection “essentially equivalent” to EU standards, aligning with the principle in GDPR Article 45.

  • The Data Protection Review Court (DPRC), as established under the DPF, was judged sufficiently independent to satisfy redress requirements.

  • Regarding bulk collection by U.S. intelligence, the Court recognized that ex post review and procedural safeguards under Executive Order 14086 are consistent with EU jurisprudence on surveillance (Schrems II).

What this means for data controllers and processors:

  • The DPF remains a valid mechanism for EU → U.S. transfers under GDPR’s Chapter V.

  • Nevertheless, organizations should maintain fallback strategies, such as SCCs/BCRs supported by detailed Transfer Impact Assessments (TIAs), especially for sensitive flows.

  • Monitor U.S. legal and oversight changes: the Court emphasized that the Commission must continuously evaluate whether U.S. practices remain aligned with the framework.

  • An appeal to the CJEU is possible.

👉 For more information, read our article here.


CJEU ruling: pseudonymized data is not always personal

On 4 September 2025, in case C-413/23 (EDPS v SRB), the Court of Justice of the European Union clarified that pseudonymized data may, in certain circumstances, be considered non-personal in the hands of a recipient.

Key clarifications from the judgment:

  • The Court emphasized that the recipient’s perspective matters: if the recipient cannot reasonably re-identify data subjects, pseudonymized data may fall outside the scope of personal data.

  • Identifiability must be assessed based on realistic technical and organizational means, not theoretical possibility.

  • The controller's obligations (transparency, information to data subjects at time of collection) still apply regardless of how data may later appear to recipients.

  • This marks a shift from the “absolute” view (supported by EDPS/EDPB) toward a contextual, risk-based approach to pseudonymization.

Practical takeaways for DPOs & legal teams:

  • Separate the identifying information (keys) and limit access to reinforce non-identifiability.

  • When sharing or transferring data, perform a recipient-level assessment: what means does the recipient realistically have to re-identify the data? (A sketch of both measures follows.)
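
As an illustration of both takeaways, a minimal Python sketch assuming a keyed-hash (HMAC) approach: the controller keeps the secret key, so the recipient holds only pseudonyms it cannot realistically reverse. The key handling shown is simplified for readability.

```python
import hmac
import hashlib

# The secret key stays with the controller (e.g., in a vault or HSM) and is
# never shared with the recipient. Without it, reversing the pseudonyms is
# not feasible by realistic means -- the criterion the Court applied.
SECRET_KEY = b"illustrative-key-kept-out-of-the-shared-dataset"

def pseudonymize(identifier: str) -> str:
    """Derive a stable pseudonym via keyed hashing (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# What the recipient receives: pseudonyms only, no key, no direct identifiers.
shared_record = {"subject": pseudonymize("jane.doe@example.com"), "category": "creditor"}
print(shared_record)
```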

👉 For more information, read our article here.


CNIL fines Google €325 million and SHEIN €150 million over cookies

On 3 September 2025, the CNIL announced significant sanctions: a €325 million fine for Google and €150 million for SHEIN for breaches of ePrivacy rules on cookies and trackers. These fines are part of the CNIL’s long-term action plan, launched in 2019, to enforce stricter compliance.

Key violations include:

  • Using trackers without valid prior consent (violating consent and information rules)

  • Imposing “cookie walls” (requiring acceptance of trackers for access), which are deemed acceptable only if the user has a real choice and the alternatives are balanced and equally accessible.

  • Google was also held in breach for sending advertising emails based on user data within Gmail without proper consent (violating the CPCE, art. L. 34-5)

Takeaways for operators & marketers:

  • Cookie compliance: regulatory attention remains high.

  • Avoid dark patterns, forced consent, or opaque cookie walls.

  • Review your consent flows, logging, audit trails, and cookie walls carefully. Documentation is key if challenged; a minimal logging sketch follows.
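
A minimal Python sketch of the kind of consent logging and gating this implies; the record fields are illustrative assumptions, not a CNIL-prescribed format.

```python
from datetime import datetime, timezone

CONSENT_LOG = []  # in production: an append-only store retained as proof

def record_consent(user_id: str, purposes: dict) -> None:
    """Log the exact choices made -- the documentation regulators ask for."""
    CONSENT_LOG.append({
        "user": user_id,
        "purposes": purposes,            # e.g. {"ads": False, "analytics": True}
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "banner_version": "2025-09-v3",  # ties the record to the UI actually shown
    })

def may_set_tracker(user_id: str, purpose: str) -> bool:
    """Fire a tracker only if a prior, purpose-specific opt-in is on record."""
    for entry in reversed(CONSENT_LOG):  # most recent choice wins
        if entry["user"] == user_id:
            return entry["purposes"].get(purpose, False)
    return False  # no record of consent -> no tracker

record_consent("u42", {"ads": False, "analytics": True})
print(may_set_tracker("u42", "ads"))        # False: advertising trackers stay off
print(may_set_tracker("u42", "analytics"))  # True
```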


Brazil: adequacy decision in the works

The European Commission has released a preliminary adequacy decision recognizing that Brazil’s data protection framework (LGPD) ensures a level of protection equivalent to EU standards. Once finalized, this will allow free and secure data flows between the EU and Brazil without extra safeguards.

On the Brazilian side, the ANPD is finalizing its own adequacy process to recognize EU law as equivalent. The mutual recognition will strengthen citizens’ rights, increase legal certainty, simplify international business operations, and boost trade competitiveness.

The process now moves to the European Data Protection Board for an opinion, followed by approval from EU member states. If adopted, Brazil will join 16 other jurisdictions (including the UK, Canada, Japan, and South Korea) already deemed adequate.

EDPB issues first guidelines on the interplay between the DSA & GDPR

The European Data Protection Board (EDPB) has published its first guidelines clarifying the relationship between the Digital Services Act (DSA) and the GDPR. The DSA, which governs online platforms and search engines, aims to create a safer digital environment and safeguard fundamental rights. Many of its obligations involve processing personal data, raising overlaps with the GDPR.

Key Takeaways from the Guidelines

  • No hierarchy of laws: The DSA imposes obligations on platforms (e.g., content moderation, diligence duties, algorithmic transparency), but these do not override or replace GDPR obligations. Data protection rules remain fully applicable.

  • Legal basis & purposes: Any data processing carried out under the DSA must still rely on a GDPR-compliant legal basis (such as consent or legitimate interest). The DSA does not create a new automatic ground for processing personal data.

  • Shared responsibilities: Roles between platforms, hosting providers, intermediaries, and other actors must be clearly defined to determine who is the data controller or processor in different scenarios (moderation, recommender systems, profiling, etc.).

  • Transparency & information duties: The DSA’s transparency requirements (explaining algorithms, moderation criteria, reporting obligations) must be coordinated with GDPR information rights (purpose, access, retention).

  • The EDPB highlights the need for closer cooperation between Digital Services Coordinators, the European Commission, and Data Protection Authorities to ensure legal certainty for companies and stronger protection of users’ rights.

Why it matters

Companies cannot assume compliance with one regime ensures compliance with the other. Instead, both must be reconciled through consistent governance, risk management, and user communication. This dual compliance challenge raises the stakes for platforms, which face oversight not only from data protection authorities but also from digital services regulators.

CNIL and Inria strengthen partnership on data protection and algorithm evaluation

The CNIL (French Data Protection Authority) and Inria (French national institute for research in digital science) have signed a renewed cooperation agreement to deepen joint efforts in data protection, privacy and algorithmic evaluation.

The partnership builds upon over ten years of collaboration, but now aligns more closely with the evolving regulatory and technological challenges in Europe, notably those posed by artificial intelligence.

Together, CNIL and Inria will coordinate research, produce shared tools and guidance, organize training and public outreach, and co-supervise doctoral and postdoctoral work.

One focus will be the new Institut national pour l’évaluation et la sécurité de l’intelligence artificielle (INESIA), with partners such as ANSSI and others.

Commission proposes guidance and template for serious AI incidents and launches consultation

The European Commission has published a draft guidance document and a reporting template for serious incidents involving AI systems under the AI Act, and is seeking input through a public consultation.

The draft guidance clarifies when an event should be reported as a “serious incident,” what information must be included, and how the reporting process should operate. The template aims to harmonize the format and level of detail across Member States to ensure consistent incident handling and oversight.
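
To make this concrete, here is a hypothetical Python sketch of an incident record covering the kinds of information such a report could include. The fields are assumptions based on the description above; the Commission's final template will define the authoritative format.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative structure only: mirrors the kinds of information the draft
# guidance says a serious-incident report should convey.
@dataclass
class SeriousIncidentReport:
    provider: str                 # who provides or deploys the AI system
    ai_system: str                # identification of the system involved
    occurred_at: str              # when the incident happened
    detected_at: str              # when it was detected (reporting clocks start)
    description: str              # what happened and who was affected
    harm_category: str            # e.g. "health", "fundamental rights", "property"
    serious: bool                 # does it meet the "serious incident" threshold?
    corrective_measures: Optional[str] = None  # mitigation taken or planned

report = SeriousIncidentReport(
    provider="ExampleCorp",
    ai_system="triage-assistant v1.4",
    occurred_at="2025-09-20T10:00:00Z",
    detected_at="2025-09-21T08:30:00Z",
    description="Erroneous high-risk classification restricting service access.",
    harm_category="fundamental rights",
    serious=True,
)
print(report)
```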

The initiative underscores the Commission’s effort to give operational clarity to the AI Act’s obligations on incident reporting and monitoring. The consultation is open for a limited period.

Austrian court clarifies Article 22 GDPR in AMAS algorithm case

The Austrian Federal Administrative Court has overturned a ban on the AMAS algorithm, used by Austria’s public employment service (AMS) to assess jobseekers’ labour market prospects.

The court found a valid legal basis in the Labour Market Service Act, which specifies what data can be processed and for what purposes, thereby meeting GDPR Articles 6 and 9 requirements.

Crucially, the judges held that AMAS did not amount to prohibited automated decision-making under Article 22 GDPR:

  • The court drew a clear distinction from the CJEU’s SCHUFA judgment (C-634/21). It acknowledged that AMAS carries out profiling and that its categorisation amounts to a form of “decision.”

  • However, Article 22 applies only where decisions are made solely by automated means and have legal or similarly significant effects.

  • In this case, AMS counsellors played a substantive role rather than a purely formal one. They were required to review and discuss the algorithmic output with jobseekers, take into account additional personal circumstances, correct the algorithm’s outcome where necessary, and ultimately make the final classification themselves.

The Data Act goes live

On 12 September 2025, the EU’s Data Act (Regulation 2023/2854) moved from future framework to reality: although it entered into force in January 2024, its regulatory obligations are now operational and enforceable.

Under the Data Act, any data generated by a connected product or a related service must be accessible to the user. Access must be timely, free of charge, in structured, machine-readable formats, and in real time when technically feasible.
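
By way of illustration, a minimal Python sketch of a data export that meets the "structured, machine-readable" bar, assuming a hypothetical connected thermostat; the function and field names are invented for the example.

```python
import json
from datetime import datetime, timezone

# Hypothetical store of data generated by a connected thermostat; in practice
# this would query the product's backend. The point is the output: structured,
# machine-readable, and returned to the user free of charge on request.
DEVICE_DATA = [
    {"sensor": "temperature", "value": 21.4, "unit": "C",
     "recorded_at": "2025-09-12T09:00:00Z"},
]

def export_user_data(device_id: str) -> str:
    """Return the data a connected product generated, as structured JSON."""
    payload = {
        "device_id": device_id,
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "records": DEVICE_DATA,  # complete records in a machine-readable format
    }
    return json.dumps(payload, indent=2)

print(export_user_data("thermostat-001"))
```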

The new regulation also embeds obligations on contractual transparency: sellers or lessors of connected products must inform users — prior to contract — of what data will be generated, how it will be stored, and when and how it can be accessed. In B2B and B2G contexts, data sharing must occur on fair, reasonable, and non-discriminatory terms.

Moreover, the Data Act includes rules to promote competition via cloud switching: providers must remove technical and contractual barriers. From 12 January 2027, migration fees will no longer be allowed, and before then any fees must be limited to the provider’s internal costs.

👉 For more information, read our article here.

CNIL fines La Samaritaine for hidden cameras in employee areas

On 18 September 2025, the CNIL imposed a €100,000 fine on Samaritaine SAS for having hidden surveillance cameras in two storage rooms of its store. The cameras had the appearance of smoke detectors and recorded audio, effectively spying on staff. The device was installed in August 2023, discovered by employees, then removed in September 2023.

Key findings and violations

  • Samaritaine failed to carry out any prior analysis of GDPR compatibility or document the exceptional nature of hidden cameras.

  • The cameras were not declared in the processing register or impact assessments, and the DPO was informed only after installation.

  • The audio recording was deemed excessive, breaching the principle of data minimization (Article 5(1)(c) GDPR).

  • The CNIL stressed that hidden camera use in the workplace is only permissible under strict conditions: temporary use, transparency when possible, prior justification, documented safeguards, and respect for employee privacy.

Lessons for businesses

This sanction underscores that even in difficult contexts (e.g., combating theft), surveillance must be tightly justified, documented, and proportionate. Hidden monitoring without due process or employee notice risks serious regulatory consequences.

👉 For more information, read the deliberation here.

Disney to pay $10M to settle FTC allegations over children’s data

Disney has agreed to a $10 million settlement with the U.S. Federal Trade Commission (FTC) over claims that it enabled unlawful collection of children’s personal data.

The FTC alleged violations of the Children’s Online Privacy Protection Act (COPPA) regarding data collected in Disney’s Kids Mode mobile apps.

According to the complaint, Disney collected persistent identifiers and other personal data from children without valid consent and used them for behavioral advertising. The settlement requires Disney to delete the improperly collected data, maintain a compliance program, and submit to third-party privacy audits for the next 20 years.

