Building a GPAI? You might be the provider

Leïla Sayssa
21 July 2025 · 9 minutes read time

The European Commission released draft guidelines to help general-purpose AI (GPAI) providers comply with the AI Act, particularly their obligations taking effect on August 2, 2025.

Although not binding, they reflect the Commission's interpretation of the AI Act and build on the recently released GPAI Code of Practice.

Is your model a General-purpose AI model? And are you the Provider? Find out here.

When is a model considered a General-purpose AI model?

A general-purpose AI (GPAI) model is defined by Article 3(63) of the AI Act.

However, this definition does not provide a specific set of criteria or conditions that providers can check. This is unsurprising given the inherent nature of these models, which are trained on vast datasets using large-scale self-supervised learning and can perform a wide variety of tasks.

The Commission proposes a concrete approach: the amount of computational resources used to train the model, measured in FLOP (floating-point operations), together with the model's modalities, determines whether a model is a GPAI.

A model is likely to be considered a GPAI if:

  • The training compute of the model is greater than 10^23 FLOP. As explained in the Guidelines, the "amount of compute used to train a model is typically proportional to the number obtained by multiplying the number of its parameters with the number of its training examples".
    • For more details on the training compute of GPAI models, refer to the annex of the Guidelines.
  • It is capable of performing a wide range of distinct tasks, such as generating language (text or audio), text-to-image, or text-to-video. The model's training on a broad range of natural-language data, and its ability to use language to communicate, store knowledge, and reason, are indicators of significant generality.

If the first threshold is met but the model cannot perform a wide range of distinct tasks, it is not a GPAI. For example, a model trained with 10^24 FLOP that can only transcribe speech to text is not a GPAI, because it performs only a narrow set of tasks.

Conversely, a model that is sufficiently general in its capabilities can still be a GPAI even without meeting the compute threshold.
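As a rough illustration, the classification logic above can be sketched in a few lines of Python. The function names are invented for this sketch, and the factor of 6 is a common rule of thumb for transformer training compute (roughly 6 × parameters × training tokens), not a figure from the AI Act or the guidelines:

```python
# Illustrative sketch only: names and the factor of 6 are assumptions,
# not terms from the AI Act or the Commission's guidelines.

GPAI_COMPUTE_THRESHOLD = 1e23  # FLOP, per the draft guidelines


def estimate_training_flop(n_parameters: float, n_training_examples: float) -> float:
    """Training compute is roughly proportional to parameters x examples.

    For transformer models a proportionality factor of ~6 is commonly
    used (forward + backward pass); it is applied here as an assumption.
    """
    return 6 * n_parameters * n_training_examples


def is_likely_gpai(training_flop: float, performs_wide_range_of_tasks: bool) -> bool:
    """A model is likely a GPAI if it crosses the compute threshold AND
    displays significant generality. (A sufficiently general model can
    qualify even below the threshold, which this simple sketch ignores.)
    """
    return training_flop > GPAI_COMPUTE_THRESHOLD and performs_wide_range_of_tasks


# The guidelines' own example: a 1e24 FLOP speech-to-text-only model
# is not a GPAI, because its task range is narrow.
print(is_likely_gpai(1e24, performs_wide_range_of_tasks=False))  # False
```

A 10-billion-parameter model trained on 10^13 examples would, under this rule of thumb, land around 6 × 10^23 FLOP, above the threshold.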

While a single threshold seems like a convenient criterion, it is not set in stone: the European Commission indicates that it continues to investigate other criteria.

If it is a GPAI, when does it present systemic risk?

GPAI models that present systemic risks (Article 3(65)), including potential harm to fundamental rights or loss of model control, are subject to more stringent obligations under Articles 52 and 55 of the AI Act.

A model can be classified as such if it meets one of two conditions:

  • It has high-impact capabilities, namely capabilities that "match or exceed those recorded in the most advanced models" (Article 3(64) AI Act). These high-impact capabilities should have a significant impact on the Union market due to their reach.

    This is presumed when the model's training compute exceeds 10^25 FLOP (Article 51(2) AI Act), which can be estimated even before the pre-training run.

    • This is not set in stone as the Commission can adjust the threshold to account for advancements.
  • It is designated as such ex officio by the Commission, or based on alerts from the scientific panel about its high-impact capabilities.

    • The provider can contest by submitting technical justification, such as model architecture, parameter count, training techniques, and dataset details.
    • The provider is subject to the obligations only if the Commission rejects those arguments and confirms that the model indeed presents systemic risk.
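The compute-based presumption above can be expressed as a simple check. This is an illustrative sketch only: the names are invented, and an ex officio designation by the Commission obviously cannot be reduced to a formula, so it is modelled here as a plain flag:

```python
# Hedged sketch of the systemic-risk presumption (Article 51(2) AI Act).
# Names are illustrative assumptions, not defined terms.

SYSTEMIC_RISK_COMPUTE_THRESHOLD = 1e25  # FLOP, Article 51(2) AI Act


def presumed_systemic_risk(training_flop: float,
                           designated_by_commission: bool = False) -> bool:
    """Presumed systemic risk if the compute threshold is exceeded,
    or if the Commission has designated the model as such."""
    return designated_by_commission or training_flop > SYSTEMIC_RISK_COMPUTE_THRESHOLD
```

Note that the presumption is rebuttable: as described above, a provider can contest it with technical justification, which no threshold check can capture.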

This triggers enhanced compliance duties such as:

  • Mandatory notification to the EU AI Office: providers must notify the Commission within two weeks once their model meets the criteria or it becomes known that it will. Failure to do so can result in penalties of up to €15 million or 3% of global turnover, whichever is higher (Article 101 AI Act).

  • Continuous risk monitoring and mitigation by "taking appropriate measures along the entire model's lifecycle" (Recital 114 AI Act). The Commission considers the lifecycle to start at the pre-training run.

Ways you could be considered a GPAI provider

Under the AI Act, a provider is a person or body that develops a GPAI model, or has it developed, and places it on the market under its own name or trademark, whether for payment or free of charge.

The definition is intentionally broad and reflects the various ways a GPAI model can enter the Union market.

  • “Placing on the market” refers to the first time a GPAI model is made available in the EU, through any means: an API, a downloadable library, physical media, a cloud computing service, or a copy installed on a customer's own infrastructure. It includes both commercial and non-commercial distribution.
  • Importantly, an entity based outside the EU can still qualify as a provider if it introduces a GPAI model into the EU market, either directly or through intermediaries.
  • The rules concerning GPAI models, with or without systemic risk, apply even when the models are integrated into or form part of an AI system (Recital 97 AI Act), in addition to the rules for AI systems.

Here are the ways you could be considered a GPAI provider:

By building or asking someone to build a model for you

If your company develops a model, or commissions another entity to develop it on your behalf, you are the provider.

The rule applies whether development is internal or outsourced.

If you make significant modifications to an existing model

If the compute used for your fine-tuning or modification exceeds one-third of the compute used to train the original model, you are likely to become the new provider.

In contrast, small-scale adaptations, such as domain-specific tuning or retrieval-augmented generation (RAG), likely do not make you a provider—they would still be classified as downstream use.
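The one-third rule can be illustrated with a trivial check (the function name and example figures are hypothetical, chosen only to make the arithmetic concrete):

```python
# Illustrative check for the "new provider" rule described above:
# if the compute spent on fine-tuning or modification exceeds one third
# of the original model's training compute, the modifier likely becomes
# the new provider. Name and structure are assumptions for clarity.

def becomes_new_provider(modification_flop: float,
                         original_training_flop: float) -> bool:
    return modification_flop > original_training_flop / 3


# E.g. fine-tuning with 4e24 FLOP a model originally trained with
# 1e25 FLOP: 4e24 exceeds one third of 1e25 (~3.33e24).
print(becomes_new_provider(4e24, 1e25))  # True
```

Small-scale adaptations such as domain-specific tuning or RAG typically sit far below this threshold, which is why they remain downstream use.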

If you release a model under a free & open-source license under certain conditions

Open-source developers are not automatically exempt from provider obligations. If you release a GPAI model under a free and open-source license, you may still be required to comply with all obligations applicable to GPAI providers, unless all of the following conditions are met:

  1. The model weights and relevant information (such as architecture and usage) are publicly available,

  2. The model is freely accessible, and users are able to freely access, use, modify, and redistribute it with no monetary compensation required,

  3. The model does not pose systemic risk.

If these conditions are met, you are exempt from certain technical requirements, such as providing detailed documentation on training and testing to both the AI Office and downstream providers integrating the GPAI model into their own systems.

Nevertheless, all providers, including open-source developers, must adopt a copyright policy and publish a summary of their training data sources to ensure transparency and compliance.

Important enforcement dates

  • August 2, 2025: GPAI-related obligations under the AI Act take effect.

  • August 2, 2026: Enforcement powers, including fines, begin.

  • August 2, 2027: End of the two-year transitional period given to providers of GPAI models already on the market before the applicability date.

Although penalties cannot be enforced until 2026, the Commission expects early cooperation and voluntary compliance from providers starting in 2025.

The AI Office is committed to helping providers take the necessary steps toward compliance and encourages them to reach out proactively to ensure they are on the right path.

Transitional provisions

If you were already developing a GPAI model before the Act's entry into force and compliance with the AI Act is challenging, the AI Office will provide the necessary support to help you comply by August 2027. Notably, retraining or unlearning is not required if it would impose a disproportionate burden, provided this is transparently disclosed and justified in the copyright policy and training data summary.

If you are releasing your first-ever GPAI model, the Commission may take your particular situation into account and grant certain procedural flexibilities.

Code of practice & guidelines: your strategic advantage

  • Adhering to the general-purpose AI code of practice carries a major benefit: under Article 53(4), it creates a presumption of compliance with the AI Act, streamlining your path to legal conformity.

  • The draft guidelines help clarify where you stand in terms of obligations. While the Code gives you a framework, the guidelines help you understand how to apply it to your specific situation.

Get started with the Code of practice right here.
