[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"article_59455":3},{"tableOfContents":4,"markDownContent":5,"htmlContent":6,"metaTitle":7,"metaDescription":8,"wordCount":9,"readTime":10,"title":7,"nbDownloads":11,"excerpt":12,"lang":13,"url":14,"intro":15,"featured":4,"state":16,"author":17,"authorId":18,"datePublication":22,"dateCreation":23,"dateUpdate":24,"mainCategory":25,"categories":41,"metaDatas":47,"imageUrl":48,"imageThumbUrls":49,"id":57},false,"It’s been nearly a year since the EU Artificial Intelligence Act officially entered into force. While certain milestones, such as the publication of the Code of Practice for General-Purpose AI (GPAI), have experienced some delays, the next major step is now imminent.\r\n\r\n![](https://static.dastra.eu/richtext/3d4d7d56-0854-4313-b417-f107bb5f2820/image-original.png)On **August 2, 2025**, several key provisions of the AI Act will become **legally binding**. These include:\r\n\r\n- **The governance rules, namely the governance at Union Level & the national competent authorities; and**\r\n\r\n- **Obligations for general-purpose AI models**\r\n\r\nBelow, we explore what these changes mean in practice and what organizations should be doing to prepare. 
This **includes the recently released [AI Office template for the public summary](https://digital-strategy.ec.europa.eu/en/library/explanatory-notice-and-template-public-summary-training-content-general-purpose-ai-models) of training content for GPAI models, as explained below.**\r\n\r\nFor more information on the timeline, refer to [the official timeline.](https://artificialintelligenceact.eu/implementation-timeline/)\r\n\r\n## On the GPAI providers' obligations\r\n\r\nProviders of GPAI models, whether original developers or those who modify the GPAI model substantially before placing it on the market, must comply with [Chapter V of the AI Act](https://artificialintelligenceact.eu/chapter/5/), starting August 2nd.\r\n\r\nCheck out [this article to see whether or not you are considered to be a provider of a GPAI.](https://www.dastra.eu/en/article/building-a-GPAI-you-might-be-the-provider/59448)\r\n\r\n\u003Ctable style=\"min-width: 100px\">\r\n\u003Ccolgroup>\u003Ccol style=\"min-width: 25px\">\u003Ccol style=\"min-width: 25px\">\u003Ccol style=\"min-width: 25px\">\u003Ccol style=\"min-width: 25px\">\u003C/colgroup>\u003Ctbody>\u003Ctr>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Model category\u003C/strong>\u003C/p>\u003C/th>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Scope &amp; definition\u003C/strong>\u003C/p>\u003C/th>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Legal basis\u003C/strong>\u003C/p>\u003C/th>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Obligations of the provider\u003C/strong>\u003C/p>\u003C/th>\u003C/tr>\u003Ctr>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Provider of a \u003C/strong>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://www.dastra.eu/en/guide/general-purpose-ai-gpai-model/59460\">\u003Cstrong>GPAI\u003C/strong>\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"nofollow\" 
href=\"https://artificialintelligenceact.eu/article/3/\">Article 3(63) of the AI Act\u003C/a>\u003C/p>\u003Cul class=\"tight\" data-tight=\"true\">\u003Cli>\u003Cp>\u003Cstrong>The training compute of the model is greater than 10^23 FLOP.\u003C/strong>\u003C/p>\u003C/li>\u003Cli>\u003Cp>\u003Cstrong>It is capable of performing a wide range of distinct tasks\u003C/strong> like generate language (text or audio), text-to-image or text-to-video.\u003C/p>\u003C/li>\u003C/ul>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://artificialintelligenceact.eu/article/53/\">Article 53 of the AI Act\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cul class=\"tight\" data-tight=\"true\">\u003Cli>\u003Cp>Develop and maintain up-to-date \u003Cstrong>technical documentation\u003C/strong> for the model, covering its \u003Cstrong>training and testing processes\u003C/strong>, as well as the \u003Cstrong>results of performance evaluations\u003C/strong>. 
This documentation must be made available upon request to the AI Office and the competent authorities.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Prepare comprehensive \u003Cstrong>documentation and information packages\u003C/strong> for downstream providers intending to \u003Cstrong>integrate the model\u003C/strong> into their own AI systems.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Implement and maintain a \u003Cstrong>compliance policy\u003C/strong> aligned with \u003Cstrong>EU copyright law\u003C/strong>, ensuring lawful use of protected content during model development.\u003C/p>\u003C/li>\u003Cli>\u003Cp>\u003Cstrong>Publish a detailed summary\u003C/strong> of the \u003Cstrong>training data content\u003C/strong>, following the format provided by the \u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://digital-strategy.ec.europa.eu/en/library/explanatory-notice-and-template-public-summary-training-content-general-purpose-ai-models\">\u003Cstrong>EU AI Office template\u003C/strong>\u003C/a>, to ensure transparency and accountability.\u003C/p>\u003C/li>\u003C/ul>\u003C/td>\u003C/tr>\u003Ctr>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Provider of a GPAI with \u003C/strong>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://www.dastra.eu/en/guide/systemic-risk/59461\">\u003Cstrong>systemic risk\u003C/strong>\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://artificialintelligenceact.eu/article/3/\">Article 3(65) of the AI Act\u003C/a>\u003C/p>\u003Cp>\u003C/p>\u003Cul>\u003Cli>\u003Cp>has \u003Cstrong>high-impact capabilities,\u003C/strong> namely capabilities that \u003Cem>\"\u003C/em>\u003Cstrong>\u003Cem>match or exceed those recorded in the most advanced models\u003C/em>\u003C/strong>\".\u003Cbr>This is \u003Cstrong>presumed\u003C/strong> when the model’s \u003Cstrong>training compute exceeds 10^25 FLOPs\u003C/strong>.\u003C/p>\u003C/li>\u003Cli>\u003Cp>designated as such 
\u003Cstrong>ex officio\u003C/strong> by the Commission or based on alerts from the scientific panel regarding its high-impact capabilities.\u003C/p>\u003C/li>\u003C/ul>\u003Cp>A list of models presenting systemic risk will be published and regularly updated, ensuring enhanced oversight of their deployment.\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://artificialintelligenceact.eu/article/55/\">Article 55 of the AI Act\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cul class=\"tight\" data-tight=\"true\">\u003Cli>\u003Cp>Conduct \u003Cstrong>state-of-the-art model evaluations\u003C/strong> using standardized protocols and tools, including the implementation and documentation of \u003Cstrong>adversarial testing\u003C/strong> to identify and mitigate potential \u003Cstrong>systemic risks\u003C/strong>.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Continuously \u003Cstrong>assess and mitigate systemic risks\u003C/strong>, including identifying their root causes and implementing appropriate risk-reduction strategies.\u003C/p>\u003C/li>\u003Cli>\u003Cp>\u003Cstrong>Monitor, document, and report\u003C/strong> any \u003Cstrong>serious incidents\u003C/strong> and proposed \u003Cstrong>corrective actions\u003C/strong> to the \u003Cstrong>AI Office\u003C/strong> and relevant \u003Cstrong>national authorities\u003C/strong> \u003Cstrong>without undue delay\u003C/strong>.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Maintain and enforce an appropriate level of \u003Cstrong>cybersecurity\u003C/strong> to protect the integrity, availability, and confidentiality of the model and its components.\u003C/p>\u003C/li>\u003C/ul>\u003C/td>\u003C/tr>\u003C/tbody>\r\n\u003C/table>\r\n\r\n## Outside the EU?\r\n\r\n**Non-EU providers must appoint an authorised representative** based in the EU (Article 54 of the AI Act).\r\n\r\nThis representative acts as the main point of contact for European 
authorities, reviews the technical documentation, and cooperates with the authorities to ensure compliance with regulatory obligations.\r\n\r\n## The Code of Practice: a voluntary pathway toward compliance\r\n\r\nThis should be read in conjunction with the European Commission's **GPAI Code of Practice**, which provides a key reference framework for stakeholders across the AI value chain, from large-scale model developers to start-ups and SMEs, seeking to align with upcoming obligations related to GPAI under the AI Act.\r\n\r\nThe Code is a **voluntary tool** intended to help GPAI model providers operating in or targeting the EU market demonstrate **compliance with Articles 53 and 55** of the AI Act.\r\n\r\n**For a full overview of the Code and its practical implications, see our [detailed breakdown here](https://www.dastra.eu/en/article/general-purpose-ai-code-of-practice-what-you-need-to-know/59438).**\r\n\r\nHowever, the Code of Practice remains **subject to assessment** by **Member States and the European Commission**, which may ultimately approve it through an **adequacy decision** (by the **AI Office** and the **AI Board**).\r\n\r\nCertain GPAI model providers are taking the prudent approach of awaiting the outcome of the **AI Office and AI Board’s assessment**, and preferably the adoption of the **Commission’s implementing act**, before deciding whether to adhere to the Code.\r\n\r\n## Who's in charge? The race to appoint surveillance authorities\r\n\r\n**Member States must designate market surveillance authorities**. These bodies will be empowered to investigate, prohibit, or sanction the use of banned AI practices.\r\n\r\n- However, in many Member States (e.g., Spain), these authorities have **not yet been formally appointed**. This is also the case in France.\r\n\r\n- In the interim, **data protection authorities (DPAs)** retain the power to act if prohibited AI systems **process personal data** in breach of GDPR. 
The [AEPD recently confirmed its position.](https://www.aepd.es/prensa-y-comunicacion/notas-de-prensa/la-aepd-recuerda-que-ya-puede-actuar-ante-sistemas-de-ia?mkt_tok=MTM4LUVaTS0wNDIAAAGbs7_dasuaJGa9KK98JvZuVADVU0cUr_-S8RU0lwYndcjQHEYdfRPz0wXKdlFqpBnMhqClLflFkrX3ZKyn5OqKdTyB_E-dCleMB10XVM7mAJd5wQ)","\u003Cp>It’s been nearly a year since the EU Artificial Intelligence Act officially entered into force. While certain milestones, such as the publication of the Code of Practice for General-Purpose AI (GPAI), have experienced some delays, the next major step is now imminent.\u003C/p>\r\n\u003Cp>\u003Cimg loading=\"lazy\"  src=\"https://static.dastra.eu/richtext/3d4d7d56-0854-4313-b417-f107bb5f2820/image-original.png\" alt=\"\" />On \u003Cstrong>August 2, 2025\u003C/strong>, several key provisions of the AI Act will become \u003Cstrong>legally binding\u003C/strong>. These include:\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>\u003Cstrong>The governance rules, namely governance at Union level &amp; the national competent authorities; and\u003C/strong>\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>\u003Cstrong>Obligations for general-purpose AI models\u003C/strong>\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n\u003Cp>Below, we explore what these changes mean in practice and what organizations should be doing to prepare. 
This \u003Cstrong>includes the recently released \u003Ca href=\"https://digital-strategy.ec.europa.eu/en/library/explanatory-notice-and-template-public-summary-training-content-general-purpose-ai-models\" rel=\"nofollow\">AI Office template for the public summary\u003C/a> of training content for GPAI models, as explained below.\u003C/strong>\u003C/p>\r\n\u003Cp>For more information on the timeline, refer to \u003Ca href=\"https://artificialintelligenceact.eu/implementation-timeline/\" rel=\"nofollow\">the official timeline.\u003C/a>\u003C/p>\r\n\u003Ch2 id=\"on-the-gpai-providers-obligations\">On the GPAI providers' obligations\u003C/h2>\r\n\u003Cp>Providers of GPAI models, whether original developers or those who modify the GPAI model substantially before placing it on the market, must comply with \u003Ca href=\"https://artificialintelligenceact.eu/chapter/5/\" rel=\"nofollow\">Chapter V of the AI Act\u003C/a>, starting August 2nd.\u003C/p>\r\n\u003Cp>Check out \u003Ca href=\"https://www.dastra.eu/en/article/building-a-GPAI-you-might-be-the-provider/59448\">this article to see whether or not you are considered to be a provider of a GPAI.\u003C/a>\u003C/p>\r\n\u003Ctable style=\"min-width: 100px\">\r\n\u003Ccolgroup>\u003Ccol style=\"min-width: 25px\">\u003Ccol style=\"min-width: 25px\">\u003Ccol style=\"min-width: 25px\">\u003Ccol style=\"min-width: 25px\">\u003C/colgroup>\u003Ctbody>\u003Ctr>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Model category\u003C/strong>\u003C/p>\u003C/th>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Scope &amp; definition\u003C/strong>\u003C/p>\u003C/th>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Legal basis\u003C/strong>\u003C/p>\u003C/th>\u003Cth colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Obligations of the provider\u003C/strong>\u003C/p>\u003C/th>\u003C/tr>\u003Ctr>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Provider of a \u003C/strong>\u003Ca rel=\"noopener 
noreferrer nofollow\" href=\"https://www.dastra.eu/en/guide/general-purpose-ai-gpai-model/59460\">\u003Cstrong>GPAI\u003C/strong>\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"nofollow\" href=\"https://artificialintelligenceact.eu/article/3/\">Article 3(63) of the AI Act\u003C/a>\u003C/p>\u003Cul class=\"tight\" data-tight=\"true\">\u003Cli>\u003Cp>\u003Cstrong>The training compute of the model is greater than 10^23 FLOP.\u003C/strong>\u003C/p>\u003C/li>\u003Cli>\u003Cp>\u003Cstrong>It is capable of performing a wide range of distinct tasks\u003C/strong> like generate language (text or audio), text-to-image or text-to-video.\u003C/p>\u003C/li>\u003C/ul>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://artificialintelligenceact.eu/article/53/\" rel=\"nofollow\">Article 53 of the AI Act\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cul class=\"tight\" data-tight=\"true\">\u003Cli>\u003Cp>Develop and maintain up-to-date \u003Cstrong>technical documentation\u003C/strong> for the model, covering its \u003Cstrong>training and testing processes\u003C/strong>, as well as the \u003Cstrong>results of performance evaluations\u003C/strong>. 
This documentation must be made available upon request to the AI Office and the competent authorities.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Prepare comprehensive \u003Cstrong>documentation and information packages\u003C/strong> for downstream providers intending to \u003Cstrong>integrate the model\u003C/strong> into their own AI systems.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Implement and maintain a \u003Cstrong>compliance policy\u003C/strong> aligned with \u003Cstrong>EU copyright law\u003C/strong>, ensuring lawful use of protected content during model development.\u003C/p>\u003C/li>\u003Cli>\u003Cp>\u003Cstrong>Publish a detailed summary\u003C/strong> of the \u003Cstrong>training data content\u003C/strong>, following the format provided by the \u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://digital-strategy.ec.europa.eu/en/library/explanatory-notice-and-template-public-summary-training-content-general-purpose-ai-models\">\u003Cstrong>EU AI Office template\u003C/strong>\u003C/a>, to ensure transparency and accountability.\u003C/p>\u003C/li>\u003C/ul>\u003C/td>\u003C/tr>\u003Ctr>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Cstrong>Provider of a GPAI with \u003C/strong>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://www.dastra.eu/en/guide/systemic-risk/59461\">\u003Cstrong>systemic risk\u003C/strong>\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://artificialintelligenceact.eu/article/3/\">Article 3(65) of the AI Act\u003C/a>\u003C/p>\u003Cp>\u003C/p>\u003Cul>\u003Cli>\u003Cp>has \u003Cstrong>high-impact capabilities,\u003C/strong> namely capabilities that \u003Cem>\"\u003C/em>\u003Cstrong>\u003Cem>match or exceed those recorded in the most advanced models\u003C/em>\u003C/strong>\".\u003Cbr>This is \u003Cstrong>presumed\u003C/strong> when the model’s \u003Cstrong>training compute exceeds 10^25 
FLOPs\u003C/strong>.\u003C/p>\u003C/li>\u003Cli>\u003Cp>designated as such \u003Cstrong>ex officio\u003C/strong> by the Commission or based on alerts from the scientific panel regarding its high-impact capabilities.\u003C/p>\u003C/li>\u003C/ul>\u003Cp>A list of models presenting systemic risk will be published and regularly updated, ensuring enhanced oversight of their deployment.\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cp>\u003Ca rel=\"noopener noreferrer nofollow\" href=\"https://artificialintelligenceact.eu/article/55/\">Article 55 of the AI Act\u003C/a>\u003C/p>\u003C/td>\u003Ctd colspan=\"1\" rowspan=\"1\">\u003Cul class=\"tight\" data-tight=\"true\">\u003Cli>\u003Cp>Conduct \u003Cstrong>state-of-the-art model evaluations\u003C/strong> using standardized protocols and tools, including the implementation and documentation of \u003Cstrong>adversarial testing\u003C/strong> to identify and mitigate potential \u003Cstrong>systemic risks\u003C/strong>.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Continuously \u003Cstrong>assess and mitigate systemic risks\u003C/strong>, including identifying their root causes and implementing appropriate risk-reduction strategies.\u003C/p>\u003C/li>\u003Cli>\u003Cp>\u003Cstrong>Monitor, document, and report\u003C/strong> any \u003Cstrong>serious incidents\u003C/strong> and proposed \u003Cstrong>corrective actions\u003C/strong> to the \u003Cstrong>AI Office\u003C/strong> and relevant \u003Cstrong>national authorities\u003C/strong> \u003Cstrong>without undue delay\u003C/strong>.\u003C/p>\u003C/li>\u003Cli>\u003Cp>Maintain and enforce an appropriate level of \u003Cstrong>cybersecurity\u003C/strong> to protect the integrity, availability, and confidentiality of the model and its components.\u003C/p>\u003C/li>\u003C/ul>\u003C/td>\u003C/tr>\u003C/tbody>\r\n\u003C/table>\r\n\u003Ch2 id=\"outside-the-eu\">Outside the EU?\u003C/h2>\r\n\u003Cp>\u003Cstrong>Non-EU providers must appoint an authorised 
representative\u003C/strong> based in the EU (Article 54 of the AI Act).\u003C/p>\r\n\u003Cp>This representative acts as the main point of contact for European authorities, reviews the technical documentation, and cooperates with the authorities to ensure compliance with regulatory obligations.\u003C/p>\r\n\u003Ch2 id=\"the-code-of-practice-a-voluntary-pathway-toward-compliance\">The Code of Practice: a voluntary pathway toward compliance\u003C/h2>\r\n\u003Cp>This should be read in conjunction with the European Commission's \u003Cstrong>GPAI Code of Practice\u003C/strong>, which provides a key reference framework for stakeholders across the AI value chain, from large-scale model developers to start-ups and SMEs, seeking to align with upcoming obligations related to GPAI under the AI Act.\u003C/p>\r\n\u003Cp>The Code is a \u003Cstrong>voluntary tool\u003C/strong> intended to help GPAI model providers operating in or targeting the EU market demonstrate \u003Cstrong>compliance with Articles 53 and 55\u003C/strong> of the AI Act.\u003C/p>\r\n\u003Cp>\u003Cstrong>For a full overview of the Code and its practical implications, see our \u003Ca href=\"https://www.dastra.eu/en/article/general-purpose-ai-code-of-practice-what-you-need-to-know/59438\">detailed breakdown here\u003C/a>.\u003C/strong>\u003C/p>\r\n\u003Cp>However, the Code of Practice remains \u003Cstrong>subject to assessment\u003C/strong> by \u003Cstrong>Member States and the European Commission\u003C/strong>, which may ultimately approve it through an \u003Cstrong>adequacy decision\u003C/strong> (by the \u003Cstrong>AI Office\u003C/strong> and the \u003Cstrong>AI Board\u003C/strong>).\u003C/p>\r\n\u003Cp>Certain GPAI model providers are taking the prudent approach of awaiting the outcome of the \u003Cstrong>AI Office and AI Board’s assessment\u003C/strong>, and preferably the adoption of the \u003Cstrong>Commission’s implementing act\u003C/strong>, before deciding whether to adhere to the 
Code.\u003C/p>\r\n\u003Ch2 id=\"whos-in-charge-the-race-to-appoint-surveillance-authorities\">Who's in charge? The race to appoint surveillance authorities\u003C/h2>\r\n\u003Cp>\u003Cstrong>Member States must designate market surveillance authorities\u003C/strong>. These bodies will be empowered to investigate, prohibit, or sanction the use of banned AI practices.\u003C/p>\r\n\u003Cul>\r\n\u003Cli>\u003Cp>However, in many Member States (e.g., Spain), these authorities have \u003Cstrong>not yet been formally appointed\u003C/strong>. This is also the case in France.\u003C/p>\r\n\u003C/li>\r\n\u003Cli>\u003Cp>In the interim, \u003Cstrong>data protection authorities (DPAs)\u003C/strong> retain the power to act if prohibited AI systems \u003Cstrong>process personal data\u003C/strong> in breach of GDPR. The \u003Ca href=\"https://www.aepd.es/prensa-y-comunicacion/notas-de-prensa/la-aepd-recuerda-que-ya-puede-actuar-ante-sistemas-de-ia?mkt_tok=MTM4LUVaTS0wNDIAAAGbs7_dasuaJGa9KK98JvZuVADVU0cUr_-S8RU0lwYndcjQHEYdfRPz0wXKdlFqpBnMhqClLflFkrX3ZKyn5OqKdTyB_E-dCleMB10XVM7mAJd5wQ\" rel=\"nofollow\">AEPD recently confirmed its position.\u003C/a>\u003C/p>\r\n\u003C/li>\r\n\u003C/ul>\r\n","AI Act phase two: what changes on August 2nd? 
","AI Act phase two: GPAI providers must prepare for new obligations effective August 2, 2025.\r\n",1251,7,0,null,"en","ai-act-phase-two-what-changes-on-august","As the second enforcement phase of the AI Act approaches, GPAI model providers must prepare to meet a new set of regulatory obligations taking effect on August 2, 2025.","Published",{"id":18,"displayName":19,"avatarUrl":20,"bio":12,"blogUrl":12,"color":12,"userId":18,"creationDate":21},20352,"Leïla Sayssa","https://static.dastra.eu/tenant-3/avatar/20352/TDYeY3C8Rz1lLE/dpo-avatar-h01-150.png","2025-03-03T11:08:22","2025-07-25T07:00:00","2025-07-23T12:40:34.6309775","2025-08-19T13:32:58.1143357",{"id":26,"name":27,"description":28,"url":29,"color":30,"parentId":12,"count":12,"imageUrl":12,"parent":12,"order":11,"translations":31},2,"Blog","A list of curated articles provided by the community","article","#28449a",[32,35,38],{"lang":33,"name":27,"description":34},"fr","Une liste d'articles rédigés par la communauté",{"lang":36,"name":27,"description":37},"es","Una lista de artículos escritos por la comunidad",{"lang":39,"name":27,"description":40},"de","Eine Liste von Artikeln, die von der Community verfasst 
wurden",[42],{"id":26,"name":27,"description":28,"url":29,"color":30,"parentId":12,"count":12,"imageUrl":12,"parent":12,"order":11,"translations":43},[44,45,46],{"lang":33,"name":27,"description":34},{"lang":36,"name":27,"description":37},{"lang":39,"name":27,"description":40},[],"https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-original.jpg",[50,51,52,53,54,55,56],"https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-1000.webp","https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25.webp","https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-1500.webp","https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-800.webp","https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-600.webp","https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-300.webp","https://static.dastra.eu/content/1ee3b9ba-9504-472b-9543-05f03a3da7a1/visuel-article-25-100.webp",59455]