Policy Updates
4 min read · 9 April 2026

Your Employees Don't Need a Law Degree — But They Do Need AI Literacy Training. The EU AI Act Agrees.

As the Digital Omnibus simplifies other parts of the AI Act, lawmakers have drawn a firm line on one obligation: AI literacy training for employees who work with AI systems. Here's why it survived the simplification wave — and what it means for your team.

One AI Act Obligation Survived the Simplification Wave Unscathed

When the European Commission published its Digital Omnibus proposal last November, it included a provision that would have quietly shifted one of the AI Act's workforce obligations away from businesses and onto national governments. Instead of requiring companies to ensure their employees had adequate AI literacy, the Commission proposed reframing it as a policy objective for member states to pursue through public programmes and educational reforms.

The reaction from Parliament and Council was swift and unambiguous: both institutions rejected the change. AI literacy — the obligation to ensure that people working with AI systems understand how those systems function, what risks they pose, and how to identify problems — remains a direct compliance requirement on organisations.[1]

For SMEs, this is one of the most practically significant, and least discussed, outcomes of the Omnibus process. While the big-ticket items — deadline delays, fine thresholds, sandbox timelines — dominate the headlines, the AI literacy provision is the one that will affect how you manage your team, regardless of whether you're building high-risk AI or simply deploying off-the-shelf AI tools in your business.

What Article 4 Actually Requires

Article 4 of the AI Act is titled "AI Literacy" and it is deliberately broad. It requires providers and deployers of AI systems to ensure that their staff and other persons dealing with AI systems on their behalf have a sufficient level of AI literacy.[2]

"AI literacy" is not defined as a technical certification. The AI Act describes it as having the necessary skills and knowledge to understand how AI systems work, their capabilities and limitations, the risks they may pose, and how to monitor and intervene appropriately. In plain terms: your employees who use or manage AI tools need to understand them well enough to spot when something goes wrong — or when the system is producing outputs that shouldn't be relied upon.

This applies whether you are a large enterprise running custom-built AI models or a small business using a third-party AI HR tool to screen CVs. The obligation scales with your organisation, but it does not disappear for smaller companies.

Why Lawmakers Refused to Water It Down

The Commission's original rationale for shifting AI literacy to member states was pragmatic: rather than imposing a one-size-fits-all training requirement on every organisation, let each country develop its own public AI education initiatives and let companies rely on those.

Lawmakers from both the Parliament and Council disagreed. Their concern: public programmes move slowly and are not tailored to specific workplace contexts. An online government course about AI basics does not teach a customer support agent how to handle a chatbot that is giving misleading product information. It does not help an HR manager recognise when an AI screening tool is producing biased shortlists.

The Parliament Think Tank's April 2026 report on sandboxes and regulatory infrastructure reinforces this reasoning — noting that national AI literacy initiatives are useful but cannot substitute for the kind of role-specific, tool-specific training that only employers can provide.[3]

In essence, legislators decided that AI literacy is not just a national education policy issue. It is a workplace safety and governance issue — and therefore an organisational responsibility.

What This Means for SMEs in Practice

If you have employees who work with AI systems — and these days, that is almost every business that uses email, customer service tools, HR software, accounting software, or any other digital platform with AI features — you have an AI literacy obligation under Article 4.

The good news: the AI Act does not prescribe a specific training format or minimum hours. You have flexibility in how you deliver AI literacy. It could be formal training, internal briefings, documented guidance on specific tools, or a combination. What regulators will expect is evidence that your people have been given a sufficient grounding to use AI responsibly and to flag issues when they arise.

For SMEs, this is both a compliance obligation and a practical benefit. Employees who understand what their AI tools can and cannot do are better at using them effectively — and less likely to blindly trust outputs that deserve scrutiny. A customer support agent who knows that an AI chatbot can hallucinate is more likely to catch and correct a confidently wrong answer before it reaches a customer.

What Good AI Literacy Looks Like for a Small Business

You do not need to turn your team into AI researchers. What you do need is practical, role-appropriate understanding. For most SMEs, that means covering a few core areas:

  • What the tool is doing. Employees should understand, in plain terms, what the AI system is trying to do and what data it uses to do it. A CRM with AI lead scoring is making predictions about which leads are most promising — based on past data.
  • What can go wrong. AI systems can produce biased or inaccurate outputs, especially when input data is incomplete or unrepresentative. Employees should know what warning signs to look for — a shortlist that excludes certain demographics, a prediction that contradicts obvious real-world signals.
  • What to do when something looks wrong. This is the part most companies skip. If an employee spots a suspicious AI output, do they know who to report it to? Is there a process for reviewing and correcting it? Documenting this process is part of your compliance evidence.
  • Human oversight. AI should assist decision-making, not replace judgement. Employees need to understand that they are still accountable for decisions — and that blindly following an AI recommendation is not a defence if it turns out to be wrong.

Start Now — Don't Wait for the Omnibus

The AI literacy obligation is not part of the Omnibus amendments and was never on the delay list. Article 4 applies from August 2, 2026, alongside the transparency obligations. The Omnibus may or may not push your high-risk compliance deadline to December 2027, but the AI literacy requirement is not moving.

For SMEs, this is a manageable obligation that you can start addressing right now with minimal cost. A one-hour internal briefing on the AI tools your team uses, with a short written summary of what those tools do and how to spot problems, is a reasonable starting point. Document that the briefing happened and keep a record. That documentation is your compliance evidence if a regulator ever asks.

The EU AI Act is building a compliance culture where organisations are expected to understand the AI systems they use. AI literacy training is one of the foundations of that culture. It is also, frankly, just good business practice.

This article is for informational purposes only and does not constitute legal advice.

Know Your AI Risk Level in 10 Minutes

Our free audit walks you through the exact questions to classify your AI systems and identify what you need to do before August 2, 2026.

Start free audit →

⚠️ Not legal advice; for guidance purposes only