The Deadline Most Businesses Are Getting Wrong
Ask most SME owners about EU AI Act compliance and they'll say something like "we have until 2027, right?" Wrong. August 2, 2026 is the date that matters for the vast majority of EU businesses — and it's now less than five months away.[1]
The EU AI Act (Regulation 2024/1689) entered into force on August 1, 2024, but it rolls out in phases. Each phase brings new obligations. Here's a plain-English breakdown of what has already happened, what hits in August 2026, and what you need to do right now.
What's Already In Force (February 2025)
The first phase of the EU AI Act was not optional: prohibited AI practices have been banned since February 2, 2025. This covers the most dangerous uses of AI — things that are outlawed entirely, with no compliance path:[1]
- Social scoring systems — rating individuals based on behaviour, circumstances, or personal characteristics
- Emotion recognition in workplaces and educational settings
- Real-time remote biometric identification in publicly accessible spaces (with narrow law enforcement exceptions)
- Subliminal manipulation — AI that materially distorts people's behaviour through techniques they cannot consciously perceive
If your business is using any of these, the time to stop was February 2025. These aren't grey areas — they are prohibited, full stop.
The August 2025 Phase: GPAI Model Rules
On August 2, 2025, rules for General-Purpose AI (GPAI) models — the foundation models powering tools like ChatGPT, Claude, and Gemini — came into force. These rules primarily target the providers of GPAI models (OpenAI, Google, Anthropic), requiring them to publish technical documentation, comply with EU copyright law, and implement safety testing for the highest-capability models.
What this means for you as a user of GPAI tools: relatively little directly. But it does mean the AI providers you rely on are now accountable for their models' safety, which provides some downstream protection.[2]
August 2, 2026: The Big One
This is the deadline that affects most businesses using AI. From August 2, 2026:
- Transparency obligations apply to customer-facing AI — chatbots must identify themselves as AI, deepfake content must be labelled, AI-generated text in certain contexts must be disclosed.
- High-risk AI system obligations kick in fully — if you use AI in HR decisions, credit scoring, educational assessments, or critical infrastructure, the Act's high-risk requirements apply: providers need conformity assessments, and deployers must use the system as instructed, ensure human oversight, and monitor its operation.
- AI literacy requirements — strictly speaking, these took effect earlier, on February 2, 2025 (Article 4): employers must ensure staff working with AI have sufficient understanding of how it works and its limitations. If you haven't addressed this yet, fold it into your August 2026 preparations.
These are not aspirational guidelines. They are legal requirements backed by penalties of up to €35 million or 7% of global annual turnover for the most serious violations.[3]
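For limited-risk systems like chatbots, the transparency duty is often as simple as disclosing the AI up front. Here is an illustrative sketch of that pattern — the `ChatSession` class and `DISCLOSURE` text are invented for this example, not taken from any specific framework or from the Act's wording:

```python
# Sketch of a chat wrapper that sends an AI disclosure before any
# assistant output, so users are told up front they are talking to AI.
# All names here (ChatSession, DISCLOSURE) are hypothetical.

DISCLOSURE = (
    "This conversation is handled by an AI assistant. "
    "You can ask to speak to a human at any time."
)

class ChatSession:
    def __init__(self):
        self.messages = []
        # The disclosure is the very first message in every session,
        # so it is shown before the AI says anything else.
        self._send("system-notice", DISCLOSURE)

    def _send(self, role, text):
        self.messages.append({"role": role, "text": text})

    def assistant_reply(self, text):
        self._send("assistant", text)

session = ChatSession()
session.assistant_reply("Hi! How can I help with your order?")
```

The design point is that the disclosure lives in the session constructor, not in individual handlers, so no code path can start a conversation without it.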
Common Misconceptions
Two myths come up constantly in conversations with SME owners:
"We're a small company, so this doesn't apply to us." It does. The EU AI Act applies to any business operating in the EU or offering AI-enabled products/services to EU residents. There's no SME exemption — though enforcement is expected to focus on higher-risk, larger deployments first.
"We just use off-the-shelf AI tools — we didn't build anything." This doesn't get you off the hook. The EU AI Act distinguishes between AI providers (who build systems) and deployers (who use them). Deployers have real obligations too, particularly for high-risk systems. If you use an AI-powered CV screening tool and it's high-risk, you're responsible for using it correctly.
What To Do Before August 2, 2026
The good news: for most SMEs, the path to compliance is manageable. Here's where to start:
- Take stock of your AI systems — list every tool that uses AI, from your customer support chatbot to your HR platform.
- Classify the risk level of each system — prohibited, high-risk, limited risk, or minimal risk. Our free audit tool can help you do this in under 10 minutes.
- Add transparency notices to any customer-facing AI — a simple "This conversation is handled by an AI assistant" message in your chatbot is often all it takes for limited-risk systems.
- Document what you do — write down what each AI system does, what data it uses, and who in your organisation is responsible for it.
- Brief your team — Article 4 requires AI literacy. This doesn't mean technical training; it means employees understand what the AI does and when to question it.
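The inventory, classification, and documentation steps above can be sketched as a single record per AI system. This is a hypothetical structure for your own internal register — the field names and risk tiers mirror the Act's four-level framing, but the legal classification of any given system still needs a proper assessment:

```python
# Hypothetical AI-system register: one record per tool, capturing
# what it does, its assumed risk tier, and who owns it internally.
from dataclasses import dataclass, asdict

RISK_LEVELS = ("prohibited", "high", "limited", "minimal")

@dataclass
class AISystemRecord:
    name: str
    vendor: str
    purpose: str
    data_used: str
    risk_level: str            # one of RISK_LEVELS
    owner: str                 # accountable person in your organisation
    transparency_notice: bool  # user-facing AI disclosure in place?

    def __post_init__(self):
        if self.risk_level not in RISK_LEVELS:
            raise ValueError(f"unknown risk level: {self.risk_level}")

inventory = [
    AISystemRecord(
        name="Support chatbot", vendor="ExampleVendor",
        purpose="Answer customer queries", data_used="Chat transcripts",
        risk_level="limited", owner="Head of Support",
        transparency_notice=True,
    ),
    AISystemRecord(
        name="CV screening tool", vendor="ExampleHR",
        purpose="Shortlist job applicants", data_used="CVs, applications",
        risk_level="high", owner="HR Director",
        transparency_notice=False,
    ),
]

# Flag anything that needs attention before August 2, 2026:
# high-risk systems, or customer-facing AI without a disclosure.
todo = [r.name for r in inventory
        if r.risk_level == "high" or not r.transparency_notice]
records = [asdict(r) for r in inventory]
```

Even a spreadsheet with these seven columns gets you most of the way; the point is that every system has a named owner and a recorded risk tier before the deadline.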
Five months sounds like plenty of time. It isn't, especially if you discover you're using a high-risk system that needs a conformity assessment. Start now.
Sources
Know your EU AI Act risk level in 10 minutes
Our free audit classifies every AI system you use and tells you exactly what to do before August 2, 2026.
Start Free Audit →