The EU AI Act timeline: important dates for providers, developers, and operators

The EU AI Act came into force in August 2024, triggering a gradual implementation timeline for companies developing, using, or importing AI products. Deadlines differ depending on a system’s risk level and a company’s role in the supply chain. Here’s a breakdown of the key dates.



The EU AI Act timeline: Who has what deadline to become compliant? 

The EU AI Act has a staggered implementation timeline. Some companies have until 2027 to adapt to its requirements. Others have until 2026, while new AI developers will feel its effects as early as August 2025.  

To know which deadline applies to you, consider three factors: 

  1. Your company’s role in the AI supply chain: are you a provider, a deployer, or another kind of operator? 
  2. Your AI system’s level of risk 
  3. For providers: is your AI system already available on the market, or will it become available after August 2025?

What role do you play in the AI supply chain? 

Companies have different sets of responsibilities and deadlines depending on whether they develop, use, or import an AI system. The collective term for everyone in the supply chain is “operator.” Broken down into more specific roles, operators can be split into: 

AI providers 

AI providers create the AI system. Think of a company developing AI that can automate large parts of a bank’s transaction monitoring and offer predictive analytics.  

AI deployers  

AI deployers use AI systems. Working with the earlier example, a deployer would be the bank or any financial institution that is buying and using the AI.  

AI importers 

AI importers bring an AI system into the EU under the name or trademark of another company based outside the European Union.  

AI distributors 

AI distributors aren’t providers or importers but still play a role in making AI systems available on the market.

What level of risk does the AI carry with it?

The EU AI Act organizes AI systems into a four-tier risk system: 

  1. Minimal risk: AI systems that carry little to no risk of infringing on an individual’s rights, e.g., spam filters or content recommendations.  
  2. Limited risk: AI systems subject to transparency obligations, such as chatbots that must disclose they are AI. General Purpose AI (GPAI) also sits here under the Act’s framework; providers face lighter obligations if their models don’t have the potential to negatively affect public health, safety, fundamental rights, or society. 
  3. High risk: These AI systems come under the greatest scrutiny before being allowed on the market. If used improperly, they have the potential to bring significant consequences for individuals. 
  4. Unacceptable risk: As of February 2025, the EU AI Act bans AI practices that pose a clear threat to people’s safety, livelihoods, or rights, such as social scoring.  

Is the AI already out on the market? 

If an AI system is already available on the market, operators have at least until 2026 to comply with the EU AI Act. Any GPAI system placed on the market after August 2nd, 2025, must comply with the new regulation from that date.  

The EU AI Act timeline: What are the key dates for your company? 

Considering these different factors, there are three deadlines that might apply to you: 

  • August 2025: For providers whose GPAI system isn’t on the market yet  
  • August 2026: For high-risk AI operators 
  • August 2027: For operators of GPAI already on the market 
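The decision logic above can be sketched as a small lookup. This is purely illustrative: the function name, category labels, and mapping are assumptions drawn from this article’s summary, not from the Act’s legal text, so always verify your actual obligations against the regulation itself.

```python
# Illustrative sketch only: maps the three factors discussed in this article
# (role, risk level, market availability) to the indicative deadlines.
# Labels and logic are simplified assumptions, not legal advice.

def compliance_deadline(role: str, risk: str, on_market_before_aug_2025: bool) -> str:
    """Return the indicative EU AI Act compliance deadline for an operator."""
    if risk == "unacceptable":
        # Prohibited practices have been banned since February 2025.
        return "banned since February 2025"
    if risk == "high":
        # High-risk AI operators must comply by August 2026.
        return "August 2026"
    if role == "provider" and risk == "gpai" and not on_market_before_aug_2025:
        # New GPAI systems must comply from August 2025.
        return "August 2025"
    if risk == "gpai" and on_market_before_aug_2025:
        # GPAI already on the market has until August 2027.
        return "August 2027"
    # General applicability date for most remaining obligations.
    return "August 2026"

print(compliance_deadline("provider", "gpai", on_market_before_aug_2025=False))  # August 2025
print(compliance_deadline("deployer", "high", on_market_before_aug_2025=True))   # August 2026
print(compliance_deadline("deployer", "gpai", on_market_before_aug_2025=True))   # August 2027
```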

Scenario #1: Providers of new GPAI systems 

Key date: August 2025 

General Purpose AI systems released after August 2nd, 2025, must follow the Act’s provisions. 

At the time of writing, the newly established EU AI Office is working on the General-Purpose AI Code of Practice to offer added guidance.  

In addition, as of August 2nd, 2025, Member States should have designated the national competent authorities responsible for oversight and reporting. If your company is an operator of GPAI already on the market, your deadline will be in 2027. 

Scenario #2: Operators of high-risk AI systems 

Key date: August 2026 

As of August 2026, operators of AI systems that have the potential to bring significant consequences for individuals will need to be compliant with the EU AI Act. 

For example, models used in fields like biometrics, critical infrastructure, education, employment, or public services will come under closer scrutiny.  

This category can seem broad, which is why the European Commission has set out to provide more detailed guidelines and examples by February 2026. 

Scenario #3: GPAI operators whose AI system is already on the market  

Key date: August 2027 

As of August 2nd, 2027, the EU AI Act becomes applicable to everyone. The only exception is AI components of large-scale IT systems established by EU law in the area of freedom, security, and justice, which follow a longer transition period. 

This is also the final deadline for operators of General Purpose AI systems that were placed on the market before August 2nd, 2025.  


The EU AI Act responsibilities: Who has to do what by their deadline? 

Obligations for providers 

Just as an AI system’s risk level determines the compliance deadline, it also defines what steps providers need to take. 

Providers of GPAI systems are tasked with technical documentation, transparency, human oversight, post-market monitoring, and mitigation of systemic risks.  

For high-risk systems, the obligations become much more expansive, all of which our compliance experts cover in this on-demand webinar. 

Obligations for deployers 

While providers carry tremendous responsibility in keeping AI systems safe, deployers also play a part in their proper implementation and oversight.  

Ultimately, they are still responsible for how the system is used. Among other things, deployers need to keep an eye on data quality, train their staff on how to use the AI correctly, and monitor whether everything is working as intended. 

Obligations for importers and distributors 

Importers and distributors have the task of verifying whether the system they’re placing on the EU market is in fact compliant with the EU AI Act.  

This means they have to confirm that the provider has taken critical steps like carrying out a conformity assessment, making technical documentation available, affixing the CE (European Conformity) marking, and more.  

These operators also have to make sure that compliance isn’t affected by the storage and transport of the AI system. In addition, they’ll need to provide relevant information to the competent authorities and maintain records for ten years.

Reaching EU AI Act compliance: Next steps 

While the EU Commission works with Member States to build out the AI Act’s reporting and monitoring infrastructure ahead of August 2025, companies have several key areas to focus on: establishing a risk management system, implementing data governance measures, drawing up technical documentation, maintaining records, preparing detailed instructions for use, and more.  

Each roadmap to compliance is highly contextual and will look different from one business to another. Get more familiar with the finer details through our ultimate guide to the EU AI Act. 

 

Discover how you can achieve your security & compliance objectives with DataGuard.