Fines & Enforcement
5 min read · 12 March 2026

The €35M Fine Is Real — But That's Not Your Real Risk

The headline fine number is real but misleading. Here's an honest look at what enforcement actually means for SMEs — and why reputational risk is probably a bigger deal than fines.

About Those Numbers

€35 million. That's the headline fine for the most serious EU AI Act violations — using prohibited AI like social scoring or mass biometric surveillance. It's real. It's also the maximum, not the floor. The law explicitly requires fines to be "effective, proportionate and dissuasive." For a 25-person SME, a €35M fine would fail any proportionality test. Regulators know this.[1]

The tiered penalty structure: up to €35M or 7% of global annual turnover for banned AI practices, up to €15M or 3% for high-risk AI violations, and up to €7.5M or 1% for supplying incorrect or misleading information to authorities — in each case, whichever is higher. Again, these are ceilings. The regulation specifically says authorities must account for size and economic resources when setting penalties.

Who Actually Enforces This

National Competent Authorities (NCAs) are the primary enforcers — each EU member state is setting up its own authority to handle AI Act compliance. Many are still being built. As of early 2026, only 8 of 27 EU countries had formally designated their NCA. Enforcement capacity will be uneven in the first year or two.[2]

The European AI Office operates at EU level and handles oversight of the big foundation models — OpenAI, Google, Anthropic, Meta. That's where most of the Office's attention is going right now. For the next 12–18 months, SME enforcement will mainly be a national story, not a Brussels story.

The Honest Enforcement Picture

Regulators build precedent by pursuing the most egregious cases first. That means large organisations using outright banned AI at scale. Or high-risk AI with zero documentation, zero oversight, zero effort at compliance. A 30-person company that added "I'm an AI assistant" to its chatbot and wrote down what each tool does is not going to be enforcement target number one.

But "enforcement hasn't started yet" is not a strategy. When GDPR launched in 2018, the same thinking was widespread. Two years later, fines were flowing — and the early penalty targets were often companies that had done nothing at all. The AI Act trajectory looks similar. The window between "law in force" and "active enforcement" exists, but it closes.

The Risk That Actually Matters for SMEs Right Now

Here's the more interesting story: for most small businesses, reputational risk is bigger than fine risk. Imagine a customer finds out your company is using AI to screen job applications without any disclosure. They post about it. A journalist covers it. You're now "the company that used secret AI hiring tools" — which, as of August 2026, is also illegal. The reputational damage from that story could cost far more than any proportionate regulatory fine.

Transparency disclosures aren't just about compliance. They're trust signals. The businesses that handle this well — openly, clearly, before customers ask — are building something. The ones that get caught out are not.

What This Means in Practice

If you use minimal- or limited-risk AI — chatbots, writing assistants, recommendation engines — your enforcement exposure is genuinely low right now. Add the disclosures, do basic documentation, brief your team. That's it.

If you use high-risk AI in HR, credit decisions, or education, your exposure is material. That's precisely the category the regulation was designed for. Don't count on flying under the radar. Start the compliance work now — the proposed 2027 extension for high-risk rules may help, but it's not final.

This article is for informational purposes only and is not legal advice.

Know Your EU AI Risk Level in 10 Minutes

Our free audit walks you through the exact questions for classifying your AI systems and shows what you need to do before 2 August 2026.

Start the free audit →

⚠️ Not legal advice — for orientation only