[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"article_59682":3},{"tableOfContents":4,"markDownContent":5,"htmlContent":6,"metaTitle":7,"metaDescription":8,"wordCount":9,"readTime":10,"title":7,"nbDownloads":11,"excerpt":12,"lang":13,"url":14,"intro":8,"featured":15,"state":16,"author":17,"authorId":18,"datePublication":22,"dateCreation":23,"dateUpdate":24,"mainCategory":25,"categories":41,"metaDatas":47,"imageUrl":48,"imageThumbUrls":49,"id":57},true,"Tired of general newsletters that skim over your real concerns? **DastraNews,** offers legal and regulatory monitoring **specifically designed for DPOs, lawyers, and privacy professionals**.\r\n\r\nEach month, we go beyond a simple recap: we select about ten decisions, news, or positions **that have a concrete impact on your missions and organizations**.\r\n\r\n🎯 **Targeted, useful monitoring grounded in the real-world realities of data protection and AI.**\r\n\r\nHere is our selection for **October 2025:**\r\n\r\n## EDPB’s 2026 coordinated enforcement topic: transparency obligations\r\n\r\nOn **14 October 2025**, the **European Data Protection Board (EDPB)** announced that its next Coordinated Enforcement Framework (the fifth) will focus on **transparency and information obligations** under **Articles 12, 13, and 14 of the GDPR**.\r\n\r\nIn each coordinated action, the **EDPB selects a common priority topic** for national **Data Protection Authorities (DPAs)** to investigate.\r\n\r\nIn 2026, supervisory authorities across Europe will jointly assess how controllers and processors comply with their **duty to inform individuals** when their data is processed.\r\n\r\n**Why it matters:**\r\n\r\n- Transparency is a **core principle** of the GDPR: without clear information, individuals cannot effectively exercise their rights.\r\n\r\n- The outcomes of these national investigations are then **consolidated and analysed** to provide a deeper, EU-wide understanding of the issue, enabling **targeted follow-up and 
enforcement** at both national and European levels.\r\n\r\n- The initiative is expected to start in **2026**, giving organizations some time (but not much) to prepare.\r\n\r\n> 🔗 For more information, [click here.](https://www.edpb.europa.eu/news/news/2025/coordinated-enforcement-framework-edpb-selects-topic-2026_en)\r\n\r\n## EDPB adopts opinions recommending UK adequacy extension\r\n\r\nDuring its latest plenary, the European Data Protection Board (EDPB) adopted two opinions on the European Commission’s **draft decisions to extend the validity of the UK adequacy decisions**, under both the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED)**, until December 2031.**\r\n\r\nThis would allow EU organisations and authorities to **continue transferring personal data to the UK without additional safeguards.**\r\n\r\nOverall, the EDPB notes that most UK legal updates aim to clarify or facilitate compliance, but it flags areas requiring closer monitoring by the European Commission:\r\n\r\n- Onward transfers: The UK’s new adequacy test (Data Use and Access Act 2025) lacks references to crucial safeguards like government access, individual redress, and independent supervision.\r\n- Encryption concerns: Technical Capability Notices (TCNs) allowing circumvention of encryption could create systemic vulnerabilities.\r\n- ICO restructuring: The new Information Commission model should be monitored for independence and enforcement capacity, though its transparency policy is welcomed.\r\n\r\n🎯 **The UK remains an adequate destination for EU data transfers until 2031, but only under strict, ongoing EU monitoring.** Good news for organizations transferring personal data between the EU and the UK, but vigilance remains key.\r\n\r\n> 🔗 For more information, [click here.](https://www.edpb.europa.eu/news/news/2025/draft-uk-adequacy-decisions-edpb-adopts-opinions_en)\r\n\r\n## Public consultation on the draft joint guidelines covering the interplay between the Digital Markets Act
(DMA) & the GDPR\r\n\r\nAs of 9 October 2025, the European Commission and the European Data Protection Board (EDPB) have launched a public consultation inviting comments on draft **joint guidelines** clarifying how the DMA and GDPR interact.\r\n\r\nThe DMA targets large digital platforms (gatekeepers) and imposes obligations that often trigger GDPR processing. These guidelines aim to align both regulatory regimes.\r\n\r\nThese guidelines are designed to help “gatekeepers” under the DMA understand and meet their GDPR-compliance obligations, especially where the DMA mandates data processing operations, such as combining user data, portability, or distribution of third-party apps.\r\n\r\nThe consultation closes on **4 December 2025**, with the final guidelines expected to be adopted in 2026.\r\n\r\n> 🔗 Read the [draft guidelines here.](https://digital-markets-act.ec.europa.eu/document/download/8ba0913f-2778-4a6d-9c58-10f8c7ead009_en?filename=Joint_COM-EDPB_GLS_interplay_DMA_GDPR_for_public_consultation.pdf)\r\n>\r\n> 🔗 For more information, [click here.](https://digital-markets-act.ec.europa.eu/consultation-joint-guidelines-interplay-between-dma-and-gdpr_en?utm_campaign=DPIA&ut)\r\n\r\n## Experian (US credit and data trader) fined €2.7 million for GDPR breaches\r\n\r\nThe Dutch data protection authority (AP) has imposed a fine of **€2.7 million** on Experian Nederland B.V., a US credit and data trader, for violations of the GDPR.\r\n\r\nKey findings:\r\n\r\n- Experian collected and processed extensive personal and sensitive data (including from energy, telecoms and public registers) to build credit reports, often without sufficient transparency to the individuals concerned. 
The data included negative payment behavior, outstanding debts, and bankruptcy information used to generate credit assessments supplied to service providers and sellers.\r\n\r\n- Complaints from consumers who faced higher deposits or were denied services prompted the AP investigation.\r\n\r\n- The company failed to properly inform data subjects of its processing operations (violations of Articles 12 & 14) and relied on the legal basis of “legitimate interests” without demonstrating necessity or balance in favour of individual rights.\r\n\r\n- Experian has ceased its consumer credit rating operations in the Netherlands as of January 2025 and committed to deleting the related database by the end of this year.\r\n\r\n> 🔗 For more information, [click here.](https://www.autoriteitpersoonsgegevens.nl/actueel/experian-krijgt-boete-van-27-miljoen-euro-voor-privacyovertredingen)\r\n\r\n## The EU AI Act enters its operational phase: European Commission launches official information platform\r\n\r\nOn **8 October 2025**, the **European Commission** launched the **AI Act Single Information Platform**, marking the start of the EU AI Act’s operational phase. This **one-stop platform** is designed to help both public and private organisations **understand and implement** the regulation effectively.\r\n\r\n#### 🔍 Why it matters\r\n\r\nSince its **partial entry into force on 1 August 2024**, the AI Act has been reshaping how AI is developed, deployed, and governed across the EU. 
Yet many organisations still struggle with one key question: *where to start?*\r\n\r\nThe new platform aims to provide clarity and practical tools:\r\n\r\n- **AI Act Explorer**: an interactive interface to browse the regulation and annexes;\r\n\r\n- **Compliance Checker**: a self-assessment tool to identify applicable obligations;\r\n\r\n- **National Resources Hub**: to access local initiatives and competent authorities;\r\n\r\n- **Service Desk**: direct expert support from the **European AI Office**.\r\n\r\nIt also centralises the **official FAQs** and guidance managed by the **AI Act Service Desk**, offering the first unified reference to distinguish between immediate and future obligations and to encourage experimentation in compliant environments.\r\n\r\n> 🔗 **Access the platform:** [AI Act Single Information Platform](https://ai-act-service-desk.ec.europa.eu/en)\r\n>\r\n> 🔗 **Access the FAQ:** [List of FAQs](https://ai-act-service-desk.ec.europa.eu/en/faq)\r\n\r\n## Italy’s new AI law: first of its kind in the EU\r\n\r\nOn **23 September 2025**, Italy’s Parliament approved Law No. 132/2025 (initially Bill 1146-B) regulating artificial intelligence, which will enter into force on **10 October 2025**.\r\n\r\nItaly thereby becomes the **first EU Member State** to adopt a comprehensive national AI law. A central aspect is its explicit alignment with the EU AI Act. 
The government is expected to adopt decrees aimed at harmonising national law with the EU regulation within twelve months.\r\n\r\n#### 🔍 What it introduces\r\n\r\n- Reinforces a **human-centred approach**: AI systems must respect fundamental rights, transparency, security, data protection, non-discrimination, gender equality and sustainability.\r\n\r\n- The law introduces **criminal sanctions**: anyone who **disseminates AI-generated or manipulated content (e.g., deepfakes)** that causes *unjust harm* can face **1 to 5 years in prison**.\r\n\r\n- Incorporates data protection provisions: for example, children under 14 require parental consent for AI system use; minors 14+ may give their own consent (under conditions).\r\n\r\n- The legislation also strengthens rules on **copyright and data-training practices**: only works generated with “genuine human intellectual effort” are eligible for protection; mass data scraping or text & data mining (TDM) is limited to non-copyrighted content or authorised scientific uses.\r\n\r\n- New governance & supervisory authorities: Agency for Digital Italy (AgID) and National Cybersecurity Authority (ACN) **are designated key authorities under the EU AI Act.**\r\n\r\n> 🔗 Read [the law here.](https://cdn.prod.website-files.com/601987a724bdae251872ed2c/68cd494c5f19ae0b85e96cff_AI_bill_1758267938.pdf)\r\n\r\n## OECD policy paper: *Mapping relevant data collection mechanisms for AI training*\r\n\r\nPublished 3 October 2025, the OECD’s latest policy paper examines the **various mechanisms used to collect data for training AI systems**, and proposes a taxonomy to support policy discussions on privacy, data governance and responsible AI development.\r\n\r\n*“When developing AI systems, practitioners often focus on model building, while sometimes underestimating the importance of analysing the diverse data collection mechanisms.
However, the diversity of mechanisms used for data collection deserves closer attention.”*\r\n\r\n### 🔍 Key take-aways\r\n\r\n- AI model quality depends not just on model architecture but on the *origin, diversity and governance* of training data.\r\n\r\n- Data-collection mechanisms are categorised into two broad sources:\r\n\r\n  1. **Direct collection from individuals and organisations**: e.g., data provided by users, observed during interaction with digital services, or voluntary data donations.\r\n\r\n  2. **Collection from third parties**: e.g., commercial data licensing, open-data initiatives, and large-scale web scraping.\r\n\r\n- Each mechanism has distinct implications for privacy, IP rights, transparency, traceability of datasets and the ability of individuals to exercise rights.\r\n\r\n- The paper highlights emerging roles for **privacy-enhancing technologies (PETs)** and **synthetic data** as tools to mitigate data governance and privacy risks.\r\n\r\n> 🔗 Read [the paper here.](https://www.oecd.org/content/dam/oecd/en/publications/reports/2025/10/mapping-relevant-data-collection-mechanisms-for-ai-training_62921889/3264cd4c-en.pdf)\r\n\r\n## California regulates “AI Chatbots” after a series of teen suicides\r\n\r\nOn **October 13, 2025**, California became the first U.S. state to adopt legislation regulating **AI chatbots**, following several tragic cases of **teen suicides** involving emotional attachments formed with these programs.
With this move, Governor **Gavin Newsom** is directly challenging the White House, which has so far resisted imposing national AI regulations.\r\n\r\nThe new law requires:\r\n\r\n- **Age verification** for chatbot users,\r\n\r\n- **Regular warning messages** reminding users they are interacting with a machine (every three hours for minors);\r\n\r\n- **Suicide-prevention protocols** integrated into conversational AI systems.\r\n\r\nOne of the key texts, **Bill SB243**, specifically targets chatbots designed to act as **companions or confidants**. It follows lawsuits filed in 2024 against the platform **Character.AI**, after the suicide of a 14-year-old who had developed a virtual relationship with a chatbot allegedly reinforcing his suicidal thoughts.\r\n\r\n## The CNIL explains how to oppose the reuse of your personal data for training conversational AI agents\r\n\r\nThe CNIL has published guidance showing how individuals can object to the reuse of their personal data in the training of AI chatbots and conversational agents.\r\n\r\n**Key points:**\r\n\r\n- The guidance covers major platforms (e.g., Meta AI, Google Gemini) and explains how users can adjust account settings or submit a formal opposition request.\r\n\r\n- Disabling “activity” settings or submitting a right-to-object form may lead to loss of conversation history or other side-effects; the CNIL emphasises users should be aware of these trade-offs.\r\n\r\n- The CNIL explicitly notes it **is not taking a position** yet on whether the relevant processing fully complies with the General Data Protection Regulation (GDPR); rather, it provides practical steps for users.\r\n\r\n> 🔗 For more information, [click here.](https://www.cnil.fr/fr/ia-comment-sopposer-la-reutilisation-de-ses-donnees-personnelles-entrainement-agent-conversationnel)\r\n\r\n## Reddit files second lawsuit over large-scale scraping of its content\r\n\r\nOn **October 22, 2025**, **Reddit** filed a lawsuit before the **U.S.
Federal Court in New York** against **Perplexity** and three data-scraping companies (**Oxylabs**, **AWMProxy**, and **SerpApi**).\r\n\r\nThe case concerns the **automated extraction (“scraping”) of massive volumes of Reddit data** using specialized software.\r\n\r\nAccording to the complaint, these companies **bypassed access restrictions and technical safeguards** to harvest Reddit content, including through **Google search result pages**, in order to **train artificial intelligence models**.\r\n\r\nReddit alleges:\r\n\r\n- **Violation of its Terms of Service**,\r\n\r\n- **Infringement of copyright** on user-generated content,\r\n\r\n- **Unlawful circumvention of technical protection measures**.\r\n\r\nThis marks **Reddit’s second lawsuit** of this kind, signaling a broader **legal strategy to assert the commercial value of its data** against AI companies relying on large-scale web scraping.\r\n\r\n> 🔗 For more information, [click here.](https://www.cnbc.com/2025/10/23/reddit-user-data-battle-ai-industry-sues-perplexity-scraping-posts-openai-chatgpt-google-gemini-lawsuit.html?msockid=2d90361560356b8a10d7209e61226a5a)","\u003Cp>Tired of general newsletters that skim over your real concerns?
\u003Cstrong>DastraNews\u003C/strong> offers legal and regulatory monitoring \u003Cstrong>specifically designed for DPOs, lawyers, and privacy professionals\u003C/strong>.\u003C/p>\r\n\u003Cp>Each month, we go beyond a simple recap: we select about ten decisions, news, or positions \u003Cstrong>that have a concrete impact on your missions and organizations\u003C/strong>.\u003C/p>\r\n\u003Cp>🎯 \u003Cstrong>Targeted, useful monitoring grounded in the realities of data protection and AI.\u003C/strong>\u003C/p>\r\n\u003Cp>Here is our selection for \u003Cstrong>October 2025:\u003C/strong>\u003C/p>\r\n\u003Ch2 id=\"edpbs-2026-coordinated-enforcement-topic-transparency-obligations\">EDPB’s 2026 coordinated enforcement topic: transparency obligations\u003C/h2>\r\n\u003Cp>On \u003Cstrong>14 October 2025\u003C/strong>, the \u003Cstrong>European Data Protection Board (EDPB)\u003C/strong> announced that its next Coordinated Enforcement Framework (the fifth) will focus on \u003Cstrong>transparency and information obligations\u003C/strong> under \u003Cstrong>Articles 12, 13, and 14 of the GDPR\u003C/strong>.\u003C/p>\r\n\u003Cp>In each coordinated action, the \u003Cstrong>EDPB selects a common priority topic\u003C/strong> for national \u003Cstrong>Data Protection Authorities (DPAs)\u003C/strong> to investigate.\u003C/p>\r\n\u003Cp>In 2026, supervisory authorities across Europe will jointly assess how controllers and processors comply with their \u003Cstrong>duty to inform individuals\u003C/strong> when their data is processed.\u003C/p>\r\n\u003Cp>\u003Cstrong>Why it matters:\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Transparency is a \u003Cstrong>core principle\u003C/strong> of the GDPR: without clear information, individuals cannot effectively exercise their rights.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The outcomes of these national investigations are then \u003Cstrong>consolidated and analysed\u003C/strong> to provide a deeper, EU-wide
understanding of the issue, enabling \u003Cstrong>targeted follow-up and enforcement\u003C/strong> at both national and European levels.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The initiative is expected to start in \u003Cstrong>2026\u003C/strong>, giving organizations some time (but not much) to prepare.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 For more information, \u003Ca href=\"https://www.edpb.europa.eu/news/news/2025/coordinated-enforcement-framework-edpb-selects-topic-2026_en\" rel=\"nofollow\">click here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"edpb-adopts-opinions-recommending-uk-adequacy-extension\">EDPB adopts opinions recommending UK adequacy extension\u003C/h2>\r\n\u003Cp>During its latest plenary, the European Data Protection Board (EDPB) adopted two opinions on the European Commission’s \u003Cstrong>draft decisions to extend the validity of the UK adequacy decisions\u003C/strong>, under both the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED)\u003Cstrong>, until December 2031.\u003C/strong>\u003Cbr />\r\n\u003Cbr />\r\nThis would allow EU organisations and authorities to \u003Cstrong>continue transferring personal data to the UK without additional safeguards.\u003C/strong>\u003Cbr />\r\n\u003Cbr />\r\nOverall, the EDPB notes that most UK legal updates aim to clarify or facilitate compliance, but it flags areas requiring closer monitoring by the European Commission:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>Onward transfers: The UK’s new adequacy test (Data Use and Access Act 2025) lacks references to crucial safeguards like government access, individual redress, and independent supervision.\u003C/li>\r\n\u003Cli>Encryption concerns: Technical Capability Notices (TCNs) allowing circumvention of encryption could create systemic vulnerabilities.\u003C/li>\r\n\u003Cli>ICO restructuring: The new Information Commission model should be monitored for independence and enforcement capacity,
though its transparency policy is welcomed.\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>🎯\u003Cstrong>The UK remains an adequate destination for EU data transfers until 2031, but only under strict, ongoing EU monitoring.\u003C/strong> Good news for organizations transferring personal data between the EU and the UK, but vigilance remains key.\u003C/p>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 For more information, \u003Ca href=\"https://www.edpb.europa.eu/news/news/2025/draft-uk-adequacy-decisions-edpb-adopts-opinions_en\" rel=\"nofollow\">click here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"public-consultation-on-the-draft-joint-guidelines-covering-the-interplay-between-the-digital-markets-act-dma-the-gdpr\">Public consultation on the draft joint guidelines covering the interplay between the Digital Markets Act (DMA) &amp; the GDPR\u003C/h2>\r\n\u003Cp>As of 9 October 2025, the European Commission and the European Data Protection Board (EDPB) have launched a public consultation inviting comments on draft \u003Cstrong>joint guidelines\u003C/strong> clarifying how the DMA and GDPR interact.\u003C/p>\r\n\u003Cp>The DMA targets large digital platforms (gatekeepers) and imposes obligations that often trigger GDPR processing. 
These guidelines aim to align both regulatory regimes.\u003C/p>\r\n\u003Cp>These guidelines are designed to help “gatekeepers” under the DMA understand and meet their GDPR-compliance obligations, especially where the DMA mandates data processing operations, such as combining user data, portability, or distribution of third-party apps.\u003C/p>\r\n\u003Cp>The consultation closes on \u003Cstrong>4 December 2025\u003C/strong>, with the final guidelines expected to be adopted in 2026.\u003C/p>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 Read the \u003Ca href=\"https://digital-markets-act.ec.europa.eu/document/download/8ba0913f-2778-4a6d-9c58-10f8c7ead009_en?filename=Joint_COM-EDPB_GLS_interplay_DMA_GDPR_for_public_consultation.pdf\" rel=\"nofollow\">draft guidelines here.\u003C/a>\u003C/p>\r\n\u003Cp>🔗 For more information, \u003Ca href=\"https://digital-markets-act.ec.europa.eu/consultation-joint-guidelines-interplay-between-dma-and-gdpr_en?utm_campaign=DPIA&amp;ut\" rel=\"nofollow\">click here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"experian-us-credit-and-data-trader-fined-2.7-million-for-gdpr-breaches\">Experian (US credit and data trader) fined €2.7 million for GDPR breaches\u003C/h2>\r\n\u003Cp>The Dutch data protection authority (AP) has imposed a fine of \u003Cstrong>€2.7 million\u003C/strong> on Experian Nederland B.V., a US credit and data trader, for violations of the GDPR.\u003C/p>\r\n\u003Cp>Key findings:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Experian collected and processed extensive personal and sensitive data (including from energy, telecoms and public registers) to build credit reports, often without sufficient transparency to the individuals concerned. 
The data included negative payment behavior, outstanding debts, and bankruptcy information used to generate credit assessments supplied to service providers and sellers.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Complaints from consumers who faced higher deposits or were denied services prompted the AP investigation.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The company failed to properly inform data subjects of its processing operations (violations of Articles 12 &amp; 14) and relied on the legal basis of “legitimate interests” without demonstrating necessity or balance in favour of individual rights.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Experian has ceased its consumer credit rating operations in the Netherlands as of January 2025 and committed to deleting the related database by the end of this year.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 For more information, \u003Ca href=\"https://www.autoriteitpersoonsgegevens.nl/actueel/experian-krijgt-boete-van-27-miljoen-euro-voor-privacyovertredingen\" rel=\"nofollow\">click here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"the-eu-ai-act-enters-its-operational-phase-european-commission-launches-official-information-platform\">The EU AI Act enters its operational phase: European Commission launches official information platform\u003C/h2>\r\n\u003Cp>On \u003Cstrong>8 October 2025\u003C/strong>, the \u003Cstrong>European Commission\u003C/strong> launched the \u003Cstrong>AI Act Single Information Platform\u003C/strong>, marking the start of the EU AI Act’s operational phase. 
This \u003Cstrong>one-stop platform\u003C/strong> is designed to help both public and private organisations \u003Cstrong>understand and implement\u003C/strong> the regulation effectively.\u003C/p>\r\n\u003Ch4 id=\"why-it-matters\">🔍 Why it matters\u003C/h4>\r\n\u003Cp>Since its \u003Cstrong>partial entry into force on 1 August 2024\u003C/strong>, the AI Act has been reshaping how AI is developed, deployed, and governed across the EU. Yet many organisations still struggle with one key question: \u003Cem>where to start?\u003C/em>\u003C/p>\r\n\u003Cp>The new platform aims to provide clarity and practical tools:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>\u003Cstrong>AI Act Explorer\u003C/strong>: an interactive interface to browse the regulation and annexes;\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Compliance Checker\u003C/strong>: a self-assessment tool to identify applicable obligations;\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>National Resources Hub\u003C/strong>: to access local initiatives and competent authorities;\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Service Desk\u003C/strong>: direct expert support from the \u003Cstrong>European AI Office\u003C/strong>.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>It also centralises the \u003Cstrong>official FAQs\u003C/strong> and guidance managed by the \u003Cstrong>AI Act Service Desk\u003C/strong>, offering the first unified reference to distinguish between immediate and future obligations and to encourage experimentation in compliant environments.\u003C/p>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 \u003Cstrong>Access the platform:\u003C/strong> \u003Ca href=\"https://ai-act-service-desk.ec.europa.eu/en\" rel=\"nofollow\">AI Act Single Information Platform\u003C/a>\u003C/p>\r\n\u003Cp>🔗 \u003Cstrong>Access the FAQ:\u003C/strong> \u003Ca href=\"https://ai-act-service-desk.ec.europa.eu/en/faq\" rel=\"nofollow\">List of FAQs\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 
id=\"italys-new-ai-law-first-of-its-kind-in-the-eu\">Italy’s new AI law: first of its kind in the EU\u003C/h2>\r\n\u003Cp>On \u003Cstrong>23 September 2025\u003C/strong>, Italy’s Parliament approved Law No. 132/2025 (initially Bill 1146-B) regulating artificial intelligence, which will enter into force on \u003Cstrong>10 October 2025\u003C/strong>.\u003C/p>\r\n\u003Cp>Italy thereby becomes the \u003Cstrong>first EU Member State\u003C/strong> to adopt a comprehensive national AI law. A central aspect is its explicit alignment with the EU AI Act. The government is expected to adopt decrees aimed at harmonising national law with the EU regulation within twelve months.\u003C/p>\r\n\u003Ch4 id=\"what-it-introduces\">🔍 What it introduces\u003C/h4>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>Reinforces a \u003Cstrong>human-centred approach\u003C/strong>: AI systems must respect fundamental rights, transparency, security, data protection, non-discrimination, gender equality and sustainability.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The law introduces \u003Cstrong>criminal sanctions\u003C/strong>: anyone who \u003Cstrong>disseminates AI-generated or manipulated content (e.g., deepfakes)\u003C/strong> that causes \u003Cem>unjust harm\u003C/em> can face \u003Cstrong>1 to 5 years in prison\u003C/strong>.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Incorporates data protection provisions: for example, children under 14 require parental consent for AI system use; minors 14+ may give their own consent (under conditions).\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The legislation also strengthens rules on \u003Cstrong>copyright and data-training practices\u003C/strong>: only works generated with “genuine human intellectual effort” are eligible for protection; mass data scraping or text &amp; data mining (TDM) is limited to non-copyrighted content or authorised scientific uses.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>New governance &amp; supervisory authorities: Agency for Digital Italy 
(AgID) and National Cybersecurity Authority (ACN) \u003Cstrong>are designated key authorities under the EU AI Act.\u003C/strong>\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 Read\u003Ca href=\"https://cdn.prod.website-files.com/601987a724bdae251872ed2c/68cd494c5f19ae0b85e96cff_AI_bill_1758267938.pdf\" rel=\"nofollow\"> the law here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"oecd-policy-paper-mapping-relevant-data-collection-mechanisms-for-ai-training\">OECD policy paper: \u003Cem>Mapping relevant data collection mechanisms for AI training\u003C/em>\u003C/h2>\r\n\u003Cp>Published 3 October 2025, the OECD’s latest policy paper examines the \u003Cstrong>various mechanisms used to collect data for training AI systems\u003C/strong>, and proposes a taxonomy to support policy discussions on privacy, data governance and responsible AI development.\u003C/p>\r\n\u003Cp>\u003Cem>“When developing AI systems, practitioners often focus on model building, while sometimes underestimating the importance of analysing the diverse data collection mechanisms. 
However, the diversity of mechanisms used for data collection deserves closer attention.”\u003C/em>\u003C/p>\r\n\u003Ch3 id=\"key-take-aways\">🔍 Key take-aways\u003C/h3>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>AI model quality depends not just on model architecture but on the \u003Cem>origin, diversity and governance\u003C/em> of training data.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Data-collection mechanisms are categorised into two broad sources:\u003C/p>\r\n\u003Col>\r\n\u003Cli>\u003Cp>\u003Cstrong>Direct collection from individuals and organisations\u003C/strong>: e.g., data provided by users, observed during interaction with digital services, or voluntary data donations.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Collection from third parties\u003C/strong>: e.g., commercial data licensing, open-data initiatives, and large-scale web scraping.\u003C/p>\r\n\u003C/li>\r\n\u003C/ol>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Each mechanism has distinct implications for privacy, IP rights, transparency, traceability of datasets and the ability of individuals to exercise rights.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The paper highlights emerging roles for \u003Cstrong>privacy-enhancing technologies (PETs)\u003C/strong> and \u003Cstrong>synthetic data\u003C/strong> as tools to mitigate data governance and privacy risks.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 Read\u003Ca href=\"https://www.oecd.org/content/dam/oecd/en/publications/reports/2025/10/mapping-relevant-data-collection-mechanisms-for-ai-training_62921889/3264cd4c-en.pdf\" rel=\"nofollow\"> the paper here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"california-regulates-ai-chatbots-after-a-series-of-teen-suicides\">California regulates “AI Chatbots” after a series of teen suicides\u003C/h2>\r\n\u003Cp>On \u003Cstrong>October 13, 2025\u003C/strong>, California became the first U.S.
state to adopt legislation regulating \u003Cstrong>AI chatbots\u003C/strong>, following several tragic cases of \u003Cstrong>teen suicides\u003C/strong> involving emotional attachments formed with these programs. With this move, Governor \u003Cstrong>Gavin Newsom\u003C/strong> is directly challenging the White House, which has so far resisted imposing national AI regulations.\u003C/p>\r\n\u003Cp>The new law requires:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>\u003Cstrong>Age verification\u003C/strong> for chatbot users,\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Regular warning messages\u003C/strong> reminding users they are interacting with a machine (every three hours for minors);\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Suicide-prevention protocols\u003C/strong> integrated into conversational AI systems.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>One of the key texts, \u003Cstrong>Bill SB243\u003C/strong>, specifically targets chatbots designed to act as \u003Cstrong>companions or confidants\u003C/strong>. 
It follows lawsuits filed in 2024 against the platform \u003Cstrong>Character.AI\u003C/strong>, after the suicide of a 14-year-old who had developed a virtual relationship with a chatbot allegedly reinforcing his suicidal thoughts.\u003C/p>\r\n\u003Ch2 id=\"the-cnil-explains-how-to-oppose-the-reuse-of-your-personal-data-for-training-conversational-ai-agents\">The CNIL explains how to oppose the reuse of your personal data for training conversational AI agents\u003C/h2>\r\n\u003Cp>The CNIL has published guidance showing how individuals can object to the reuse of their personal data in the training of AI chatbots and conversational agents.\u003C/p>\r\n\u003Cp>\u003Cstrong>Key points:\u003C/strong>\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>The guidance covers major platforms (e.g., Meta AI, Google Gemini) and explains how users can adjust account settings or submit a formal opposition request.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>Disabling “activity” settings or submitting a right-to-object form may lead to loss of conversation history or other side-effects; the CNIL emphasises users should be aware of these trade-offs.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>The CNIL explicitly notes it \u003Cstrong>is not taking a position\u003C/strong> yet on whether the relevant processing fully complies with the General Data Protection Regulation (GDPR); rather, it provides practical steps for users.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 For more information, \u003Ca href=\"https://www.cnil.fr/fr/ia-comment-sopposer-la-reutilisation-de-ses-donnees-personnelles-entrainement-agent-conversationnel\" rel=\"nofollow\">click here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n\u003Ch2 id=\"reddit-files-second-lawsuit-over-large-scale-scraping-of-its-content\">Reddit files second lawsuit over large-scale scraping of its content\u003C/h2>\r\n\u003Cp>On \u003Cstrong>October 22, 2025\u003C/strong>, \u003Cstrong>Reddit\u003C/strong> filed a lawsuit
before the \u003Cstrong>U.S. Federal Court in New York\u003C/strong> against \u003Cstrong>Perplexity\u003C/strong> and three data-scraping companies (\u003Cstrong>Oxylabs\u003C/strong>, \u003Cstrong>AWMProxy\u003C/strong>, and \u003Cstrong>SerpApi\u003C/strong>).\u003Cbr />\r\nThe case concerns the \u003Cstrong>automated extraction (“scraping”) of massive volumes of Reddit data\u003C/strong> using specialized software.\u003C/p>\r\n\u003Cp>According to the complaint, these companies \u003Cstrong>bypassed access restrictions and technical safeguards\u003C/strong> to harvest Reddit content, including through \u003Cstrong>Google search result pages\u003C/strong>, in order to \u003Cstrong>train artificial intelligence models\u003C/strong>.\u003C/p>\r\n\u003Cp>Reddit alleges:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>\u003Cstrong>Violation of its Terms of Service\u003C/strong>,\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Infringement of copyright\u003C/strong> on user-generated content,\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Unlawful circumvention of technical protection measures\u003C/strong>.\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>This marks \u003Cstrong>Reddit’s second lawsuit\u003C/strong> of this kind, signaling a broader \u003Cstrong>legal strategy to assert the commercial value of its data\u003C/strong> against AI companies relying on large-scale web scraping.\u003C/p>\r\n\u003Cblockquote>\r\n\u003Cp>🔗 For more information, \u003Ca href=\"https://www.cnbc.com/2025/10/23/reddit-user-data-battle-ai-industry-sues-perplexity-scraping-posts-openai-chatgpt-google-gemini-lawsuit.html?msockid=2d90361560356b8a10d7209e61226a5a\" rel=\"nofollow\">click here.\u003C/a>\u003C/p>\r\n\u003C/blockquote>\r\n","DastraNews: what happened in Privacy & AI in October? 
","Privacy & AI insights from the Dastra hub: actionable updates for pros who work daily in the field.",1905,11,0,null,"en","dastranews-what-happened-in-october",false,"Published",{"id":18,"displayName":19,"avatarUrl":20,"bio":12,"blogUrl":12,"color":12,"userId":18,"creationDate":21},20352,"Leïla Sayssa","https://static.dastra.eu/tenant-3/avatar/20352/TDYeY3C8Rz1lLE/dpo-avatar-h01-150.png","2025-03-03T11:08:22","2025-10-29T10:30:00","2025-10-28T15:52:16.1182188","2025-12-09T09:10:17.6348044",{"id":26,"name":27,"description":28,"url":29,"color":30,"parentId":12,"count":12,"imageUrl":12,"parent":12,"order":11,"translations":31},2,"Blog","A list of curated articles provided by the community","article","#28449a",[32,35,38],{"lang":33,"name":27,"description":34},"fr","Une liste d'articles rédigés par la communauté",{"lang":36,"name":27,"description":37},"es","Una lista de artículos escritos por la comunidad",{"lang":39,"name":27,"description":40},"de","Eine Liste von Artikeln, die von der Community verfasst 
wurden",[42],{"id":26,"name":27,"description":28,"url":29,"color":30,"parentId":12,"count":12,"imageUrl":12,"parent":12,"order":11,"translations":43},[44,45,46],{"lang":33,"name":27,"description":34},{"lang":36,"name":27,"description":37},{"lang":39,"name":27,"description":40},[],"https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-original.jpg",[50,51,52,53,54,55,56],"https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-1000.webp","https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu.webp","https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-1500.webp","https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-800.webp","https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-600.webp","https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-300.webp","https://static.dastra.eu/content/3d585c25-ee45-4217-9c86-31357064de4d/dastractu-100.webp",59682]