Where to Start When You Don't Know Where to Start
Most SME owners know the EU AI Act is coming. Fewer know what to actually do about it. The regulation is 144 pages long. It was written by lawyers, for lawyers. This checklist isn't.
These are the five things that matter most before August 2, 2026, prioritised by impact. If you do nothing else, do these.[1]
1. Take Stock of Every AI System You Use
Before you can comply, you need to know what you're dealing with. This means listing every tool in your business that uses AI — not just the obvious ones.
Start with the obvious: customer service chatbots, AI-written marketing copy, recommendation engines. Then go deeper: does your HR software use AI to rank CVs? Does your CRM score leads automatically? Does your accounting platform flag anomalies using machine learning?
Most SMEs are surprised by how many AI systems they're using. A typical company of 20–50 people might discover 8–15 AI-powered tools once they look carefully. You can't manage what you haven't measured.
Our free audit tool guides you through this discovery process with questions tailored to your industry. It takes under 10 minutes.
2. Classify Each AI System by Risk Level
Once you have your list, classify each system using the EU AI Act's four categories: Prohibited, High Risk, Limited Risk, and Minimal Risk.
The practical approach:
- Does it do something the Act bans outright? (social scoring, emotion recognition in the workplace or schools, untargeted scraping of facial images) → Prohibited. Stop using it; the prohibitions already apply (since February 2025).
- Is it used to make or influence decisions about people? (hiring, credit, education, healthcare) → Likely high-risk.
- Does it interact with customers or generate content they'll see? → Likely limited risk. Transparency disclosures required.
- Is it internal productivity AI? (writing, summarisation, scheduling) → Likely minimal risk. No specific obligations, but worth documenting.
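If you're tracking your inventory in a spreadsheet or script, the triage questions above can be sketched as a small helper. This is purely illustrative: the field names (`decides_about_people`, `customer_facing`) are made up for this sketch, and the output is a first-pass guess, not a legal determination.

```python
def triage(ai_system: dict) -> str:
    """First-pass EU AI Act risk category for an AI system record.

    A rough filter mirroring the checklist questions; the actual
    classification must be checked against the Act's definitions.
    """
    if ai_system.get("decides_about_people"):  # hiring, credit, education, healthcare
        return "likely high risk"
    if ai_system.get("customer_facing"):       # chatbots, content customers will see
        return "likely limited risk"
    return "likely minimal risk"               # internal productivity tools

cv_ranker = {"name": "CV ranking module", "decides_about_people": True}
chatbot = {"name": "Website chatbot", "customer_facing": True}
print(triage(cv_ranker))  # likely high risk
print(triage(chatbot))    # likely limited risk
```

Running every tool from your step-1 inventory through a filter like this gives you a draft classification to verify against the Act's actual definitions.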
Don't guess. The EU AI Act has precise definitions, and getting the classification wrong is the most common compliance mistake.[2]
3. Add Transparency Disclosures for Customer-Facing AI
Article 50 of the EU AI Act requires that when customers interact with an AI system, they must be told it's AI. From August 2, 2026, this isn't optional. It applies to:
- Chatbots on your website or app
- AI-generated email or SMS communications that could be mistaken for human-written
- Voice assistants or IVR systems using AI
The fix is usually simple. For most chatbots, adding "I'm an AI assistant — how can I help?" to the first message satisfies the requirement. If you're using AI to generate personalised emails, a short disclosure at the bottom ("This email was drafted with AI assistance") typically covers you.
For deepfake-style content — AI-generated video, synthetic voice recordings — the rules are stricter. These must be clearly and prominently labelled as AI-generated.
4. Document Your AI Systems
The EU AI Act requires that you can demonstrate responsible AI use. The way you demonstrate this is through documentation.[3]
For minimal and limited risk systems, documentation doesn't need to be extensive. A 1–2 page summary per system is enough, covering:
- What the system does and why you use it
- What data it processes
- Who in your organisation is responsible for it
- What human oversight exists
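One way to keep these summaries consistent is to store each one as structured data and export it to a document later. A minimal sketch, assuming Python; the field names are illustrative, not mandated by the Act:

```python
from dataclasses import dataclass, asdict

@dataclass
class AISystemRecord:
    """One entry in an AI system register, mirroring the checklist above."""
    name: str
    purpose: str          # what the system does and why you use it
    data_processed: str   # what data it processes
    owner: str            # who in the organisation is responsible for it
    human_oversight: str  # what human oversight exists

record = AISystemRecord(
    name="Lead-scoring module in CRM",
    purpose="Ranks inbound leads by likelihood to convert",
    data_processed="Company size, industry, website activity",
    owner="Head of Sales",
    human_oversight="Sales team reviews scores before any outreach",
)
print(asdict(record))  # ready to dump to JSON, a spreadsheet, or a template
```

Because every record has the same fields, nothing gets forgotten when a new tool is added to the register.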
For high-risk systems, documentation requirements are much more detailed — conformity assessments, technical specifications, training data descriptions. If you have high-risk AI, start this process now; it's the most time-consuming part of compliance.
5. Train Your Team on AI Literacy
Article 4 of the EU AI Act requires organisations to ensure their staff have sufficient AI literacy to work with AI systems appropriately. This was one of the first obligations to apply (from February 2025), and many SMEs have missed it.
The good news: "AI literacy" doesn't mean your team needs to understand how neural networks work. It means employees who use AI tools should:
- Understand what the AI does and what it's for
- Know the limitations and failure modes of the system
- Know when to question AI outputs and when to override them
- Know who to escalate concerns to
A one-hour workshop per tool, documented in writing, is usually sufficient for minimal and limited risk systems. The documentation proves you took this obligation seriously.
The Priority Order
If you're short on time, do them in this order:
- Steps 1 and 2 (inventory + classification) take a day. Do them first — they unlock everything else.
- Step 3 (transparency) is usually a quick win — often fixed in an afternoon.
- Step 5 (AI literacy) can be done in parallel — schedule a team session.
- Step 4 (documentation) takes longer but doesn't need to be perfect. Start it, keep improving it.
You have until August 2, 2026. That's enough time — if you start now.
Know your EU AI Act risk level in 10 minutes
Our free audit classifies every AI system you use and tells you exactly what to do before August 2, 2026.
Start Free Audit →