EU AI Act · March 26, 2026

EU AI Act: High-Risk AI Delayed to 2027, But Article 4 Training Obligation Stays at August 2026

The EU Council agreed on March 13 to delay high-risk AI rules. Headlines called it a relief. But if you read the fine print, the deadline that affects you most today has not moved a single day.

Carlos Salgado, CEO & Co-founder · Delbion

On March 13, 2026, the Council of the European Union agreed to extend the deadlines for high-risk artificial intelligence systems to comply with the EU AI Act. The news spread across every tech publication. "Relief for businesses," the headlines said. "EU slows down AI Act."

If you are a compliance officer, CTO, or anyone with responsibility for AI in your organisation, you probably saw those headlines and thought: good, I have more time.

I need you to read what comes next carefully.

Because the delay is real, yes. But it does not affect you the way you think. And while everyone is looking at the high-risk headlines, there is one obligation that nobody has postponed and that expires in five months: the mandatory AI training for every single member of your staff.

What the Council agreed on March 13

The Council adopted a position to modify the EU AI Act's implementation timeline. The agreement still needs formal approval by the European Parliament, which adopted its own position on March 26, 2026 as part of the Digital Omnibus package. But the political direction is clear and the agreement is in place.

The underlying reason is twofold. First, the European Commission missed its February 2026 deadline to publish technical guidance on high-risk AI systems. Without that guidance, companies have no clear criteria to evaluate whether their AI systems fall within the high-risk category. Second, only 8 of the 27 EU member states have designated their national contact points, leaving the supervisory infrastructure largely inoperative.

The Council's conclusion was that applying high-risk obligations in August 2026, without official guidance and without a supervisory infrastructure in most member states, would be applying a rule in a vacuum.

The Council agreement, summarised

The Council agreed on March 13, 2026 to extend the compliance deadline for high-risk AI systems. This agreement is part of the Digital Omnibus package the Commission presented in February 2026.

What is delayed and by how long

There are two categories of high-risk systems, and each has a new deadline:

System type | Original deadline | New agreed deadline
Stand-alone high-risk AI | August 2, 2026 | December 2, 2027
High-risk AI embedded in products (Annex I) | August 2, 2027 | August 2, 2028

Stand-alone systems are those not embedded in a physical product: credit scoring systems, recruitment tools, critical infrastructure management AI, educational assessment systems, and so on. These move from August 2026 to December 2027, a delay of 16 months.

Systems embedded in products regulated by Annex I of the AI Act (machinery, medical devices, aviation, motor vehicles, etc.) move from August 2027 to August 2028.

This is a significant delay. But for most businesses, the practical question is not whether their AI systems are high-risk under Annex III classification. The practical question is whether they have trained their staff. And there, nothing has changed.

Article 4 has not changed

Article 4 of the EU AI Act establishes the obligation of AI literacy for all personnel working with AI systems. This obligation entered into force on February 2, 2025. Its effective compliance deadline is August 2, 2026. And it has not been modified.

Not in the Council's March 13 agreement. Not in the Digital Omnibus package. Not in any proposal that has circulated to date.

What Article 4 says, without paraphrase

"Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf."

Regulation (EU) 2024/1689, Article 4.

The article applies to providers and deployers. Providers are those who develop or commercialise AI systems. Deployers are those who use them in their operations. If your company uses any AI tool, you are a deployer.

The obligation makes no distinction by sector, company size, or type of AI used. It is not only for tech companies. It is not only for those with high-risk systems. It applies to any organisation that uses artificial intelligence.

Aug 2, 2026: Article 4 compliance deadline. Unchanged.

Why this happened

The delay on high-risk systems responds to a real problem: the EU AI Act's implementation has moved slower than expected across several dimensions.

The European Commission was due to publish in February 2026 its official guidance on how to classify high-risk AI systems and demonstrate conformity. That guidance has not arrived. Without it, companies that want to comply do not know exactly how to do so. And national regulators would also lack a uniform basis for supervision.

Furthermore, the EU AI Act requires each member state to designate a competent supervisory authority and a national contact point. To date, only 8 of the 27 member states have done so. In a regulatory system that depends on coordinated national supervision, this is a structural problem.

Faced with this situation, the Council preferred to extend the deadline rather than create a scenario where obligations exist on paper but there is no real capacity to enforce them or to guide those who want to comply.

What has not been postponed is what is already operational: Article 4 can be applied today because it does not require complex technical guidance. It requires companies to train their staff. That is something any organisation can do right now.

Penalties for Article 4 non-compliance

Penalties under the EU AI Act for non-compliance with Article 4 fall under the intermediate level of the sanctioning regime: up to EUR 7.5 million or 1% of global annual turnover, whichever is greater.

Sanctioning regime applicable to Article 4

Up to EUR 7,500,000 or 1% of total worldwide annual turnover for the preceding financial year, whichever is greater. For SMEs and start-ups, the lower of the two amounts applies, but the exposure remains material.

For a company with EUR 10 million in revenue, 1% is EUR 100,000. For one with EUR 50 million, EUR 500,000. This is not a symbolic amount.
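The "whichever is greater" rule above, and the lower cap for SMEs, can be sketched in a few lines of Python. The function name is illustrative and this is arithmetic, not legal advice:

```python
def article4_fine_ceiling(annual_turnover_eur: float, is_sme: bool = False) -> float:
    """Maximum fine under the intermediate tier described above (illustrative only)."""
    FIXED_CAP = 7_500_000                         # EUR 7.5 million
    turnover_share = 0.01 * annual_turnover_eur   # 1% of worldwide annual turnover
    # Standard regime: the greater of the two amounts applies;
    # for SMEs and start-ups, the lower amount applies instead.
    return min(FIXED_CAP, turnover_share) if is_sme else max(FIXED_CAP, turnover_share)

print(article4_fine_ceiling(1_000_000_000))            # large company: 1% dominates, EUR 10,000,000
print(article4_fine_ceiling(10_000_000, is_sme=True))  # SME with EUR 10M turnover: EUR 100,000
```

The point of the sketch is simply that exposure scales with turnover: for any company above EUR 750 million in revenue, the 1% figure, not the fixed cap, sets the ceiling.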

Beyond the financial figure, there is the reputational impact. The AI Act allows employees, rejected job candidates, or consumer associations to file complaints with the national authority. An investigation by your national regulator into AI literacy compliance at your company is not the kind of news you want associated with your brand.

The penalty is not the most important thing. The most important thing is that August 2026 arrives in five months. And training an entire organisation takes time.

What to do now

The headline about the high-risk delay is real, but it does not give you more time on training. What gives you time is starting to train now.

These are the three things a company should do this week:

1. Audit which AI tools your team uses

Before training, you need to know who uses what. ChatGPT, Copilot, AI features in your ERP, email assistants, scoring systems... Build the inventory. Shadow AI (unauthorised use of AI tools) is more common than it looks and also generates obligations.

2. Plan training by role profile

Article 4 requires a level of AI literacy sufficient for each person's role, taking into account their technical knowledge, experience, education and training, and the context in which the AI systems are used. Not everyone in your organisation needs the same course. Technical teams need technical depth. Executives need to understand the regulatory framework and risks. Operations staff need safe and responsible use of AI tools.

3. Document everything

The obligation requires that you can demonstrate you have taken measures. That means certificates per participant, attendance records, and a compliance report that shows what was trained, who attended, and when. Our programmes include all of this.

The EU AI Act is not bureaucracy you can ignore until the last moment. Companies that are trained and documented before August 2026 will have a solid defensive position. Those that are not will face penalties and, more importantly, will be making decisions about AI without the knowledge to make them well.

The high-risk delay gives the Commission time to publish its guidance. It does not give you time to postpone training.

Read the headlines. But read the fine print too. August 2, 2026 is still there.

EU AI Act Training · Article 4

Meet Article 4 before August 2026

Our Secure AI Application for Business and EU AI Act Compliance courses are designed specifically to cover the Article 4 obligation. Available in Spanish and English. Includes certificates per participant and a compliance report.

View Training Programmes →

Next step

Article 4 training does not wait for the delay

August 2026 remains the deadline for mandatory AI literacy. Our programmes cover the Article 4 requirements with role-adapted training, certificates per participant and a compliance report. Start now.