The Ambiguity That Made Delay Look Reasonable Is Fading
For the past few months, the dominant compliance strategy for SMEs has essentially been: wait. Wait for the AI Omnibus. Wait for trilogue to conclude. Wait for clarity before investing in conformity assessments or documentation systems. It felt rational. The August 2026 deadline was under active negotiation. Why spend money preparing for a law that might change?
But something has shifted — and it happened before the final Omnibus text was agreed. A new analysis from Corporate Compliance Insights published this week traces the change: the European Commission's original approach to the AI Act was conditional — compliance would kick in when harmonised standards were published. That flexibility is now gone.[1]
What Replaced 'When Standards Are Ready'
The Commission originally proposed linking high-risk AI compliance obligations to the publication of harmonised CEN-CENELEC standards. In theory, this was sensible: companies couldn't know if their documentation met the bar until the standards defined that bar. So the deadline would trigger when standards arrived.
In practice, it created a dangerous loophole. If standards were delayed, compliance obligations were automatically delayed. The law became contingent on something outside Parliament's direct control.
Both Parliament and Council have now independently arrived at fixed application dates: December 2, 2027 for standalone high-risk AI systems, and August 2, 2028 for AI embedded in regulated products. These dates don't depend on when standards arrive — they are absolute.[1]
That distinction matters enormously. A conditional deadline that triggers when standards land is a reason to watch and wait. A fixed calendar date is not. You now know when the obligations arrive regardless of whether technical guidance is ready. That changes the risk calculation.
The Registration Obligation That Came Back
One more concrete change worth noting: the Commission proposed removing the obligation for AI providers to register their systems in the EU database if they self-assessed those systems as non-high-risk. Both Parliament and Council have reinstated the registration requirement, while simplifying the information needed.[1]
This matters for any SME building AI products. If you are putting an AI system on the EU market — even one you've assessed as low-risk — you will still need to register it. The burden is lower, but the obligation survived.
The Part That Isn't Changing: Your Transparency Obligations
The Omnibus negotiations haven't touched the transparency obligations landing on August 2, 2026. Chatbot disclosures, AI content labelling, and the other Article 50 transparency requirements all remain on their confirmed, unchanged deadline.[3] (Article 50 covers transparency duties for certain AI systems; it is the high-risk obligations, not these, that the Omnibus would push back.)
If you've been using the Omnibus as an excuse to defer your transparency compliance — the chatbot disclosure you haven't added, the AI-generated image label you haven't implemented — that excuse no longer holds. These obligations are not part of the amendment package.
The Bigger Concern: Civil Society Is Watching Closely
Amnesty International published an analysis this week warning that the Omnibus represents an unprecedented rollback of digital rights protections at EU level.[2] The organisation argues that the changes benefit large corporate interests over individuals, and that the framing of 'simplification' obscures genuine weakening of protections — particularly around high-risk AI systems.
Whether or not you agree with that framing, it signals something compliance teams should take seriously: the Omnibus is politically contested. The final text could yet be different from what is currently expected. Businesses that treat December 2027 as a certainty are making an assumption that hasn't been legally confirmed.
What SMEs Should Do Right Now
Three steps that make sense regardless of how trilogue concludes:
- Handle transparency now. Chatbot disclosures and AI content labelling are not part of the Omnibus. August 2, 2026 is the confirmed deadline. Add that disclosure to your chatbot today if you haven't. It's a five-minute task.
- Build your AI inventory. Knowing exactly which AI systems your company uses — and how — is the foundation of everything else. It's useful whether the August 2026 deadline applies or the December 2027 date is confirmed.
- Track the April 28 trilogue target. Both Parliament and Council are aiming for a political agreement by the end of April. If a deal lands on schedule, the legal process to formalise the delay accelerates significantly. You'll want to know quickly.
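The inventory step above can be as simple as a structured list of every AI system in use, with enough fields to flag transparency gaps. A minimal sketch in Python follows; the field names, risk labels, and example systems are illustrative assumptions, not terminology from the AI Act itself.

```python
from dataclasses import dataclass

# Hypothetical inventory record -- field names and risk labels are
# assumptions for illustration, not AI Act terms.
@dataclass
class AISystem:
    name: str
    vendor: str
    purpose: str
    risk_class: str      # e.g. "minimal", "transparency", "high"
    user_facing: bool    # does it interact directly with people?
    discloses_ai: bool   # is an AI disclosure actually shown to users?

def transparency_gaps(inventory: list[AISystem]) -> list[str]:
    """Names of user-facing systems that currently show no AI disclosure."""
    return [s.name for s in inventory if s.user_facing and not s.discloses_ai]

inventory = [
    AISystem("support-bot", "Acme", "customer chat", "transparency", True, False),
    AISystem("invoice-ocr", "InHouse", "document parsing", "minimal", False, False),
]

print(transparency_gaps(inventory))  # → ['support-bot']
```

Even a spreadsheet with the same columns does the job; the point is that once every system is recorded with its risk class and disclosure status, the August 2026 gaps fall out of a one-line query.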
The 'wait and see' window was always a risk management choice, not a compliance strategy. That window is narrowing — not because the obligations themselves changed, but because the deadlines turned from conditional into fixed. Fixed deadlines don't wait for certainty. Neither should your compliance programme.
This article is for informational purposes only and does not constitute legal advice.