Policy Updates
4 min read · 28 April 2026

The AI Omnibus Is Adding a 'Nudifier' Prohibition and SME Relief. Today Is Decision Day.

As trilogue negotiations conclude on April 28, EU lawmakers are set to prohibit AI 'nudifier' applications outright — and extend regulatory relief to mid-sized companies with up to 750 employees. Here is what SMEs need to know about both measures before the final text is locked in.

Two Significant Additions to the AI Act — Agreed Before the Ink Is Dry

While most of the AI Omnibus coverage has focused on the deadline extension for high-risk AI systems — pushing the August 2, 2026 Annex III deadline to December 2, 2027 — two other measures have cleared the Parliament and Council with less fanfare and are set to become law as part of the final trilogue deal being negotiated today, April 28.[1]

One is a new outright prohibition on AI applications that create non-consensual sexualised imagery. The other is a significant expansion of proportionality relief — the easing of certain AI Act obligations — to cover small mid-cap companies with up to 750 employees and €150 million in annual turnover.[1] Both matter for SMEs, and understanding them before the final text is published means you are ahead of the curve.

The 'Nudifier' Prohibition: What It Actually Covers

Both the Council and Parliament have agreed to introduce a new prohibition under the AI Act covering AI-enabled applications that "alter, manipulate or artificially generate realistic images or videos so as to depict sexually explicit activities or the intimate parts of an identifiable natural person, without that person's consent."[1]

This is the AI Omnibus provision that received the least attention but may have the most immediate practical effect for a wide range of businesses. Here is why the wording matters.

The prohibition does not only cover purpose-built tools. It extends to AI systems more broadly where such misuse is "reasonably foreseeable" based on the system's functionality.[1] That means organisations deploying image generation, video synthesis, or avatar creation tools need to assess whether their system's capabilities could enable non-consensual sexual imagery — and implement guardrails accordingly.

The UK's Ofcom regulator moved on this in January 2026, opening a formal investigation into X related to the use of Grok to create and share non-consensual sexualised images.[1] The EU is now following the same enforcement logic: these applications cause serious harm, and the regulatory response is to prohibit them at the source rather than chase them after the fact.

For SMEs: if your product generates images or video — AI avatars, synthetic media tools, image editing, video creation — your system is in scope of this prohibition regardless of your stated use case. You need to demonstrate that you have assessed whether your system could foreseeably be misused for non-consensual intimate imagery, and that you have implemented proportionate technical and organisational measures to prevent that misuse.
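What a "proportionate technical measure" looks like is not prescribed by the text, but one common pattern is a pre-generation check that refuses requests pairing an identifiable person's image with sexualised content. The sketch below is purely illustrative — the function names, keyword list, and policy are our assumptions, not anything mandated by the Act, and a real deployment would rely on trained content classifiers rather than keyword matching:

```python
# Illustrative pre-generation guardrail for an image-synthesis service.
# Everything here (names, terms, policy) is an assumption for the sketch;
# production systems would use trained classifiers, not keyword lists.
EXPLICIT_TERMS = {"nude", "undress", "nudify", "explicit"}

def request_allowed(prompt: str, uses_real_person_photo: bool) -> bool:
    """Refuse prompts that combine an identifiable person with
    sexualised content; allow everything else."""
    sexualised = any(term in prompt.lower() for term in EXPLICIT_TERMS)
    return not (uses_real_person_photo and sexualised)
```

Whatever form the check takes, logging each assessment outcome matters as much as the refusal itself — the documented assessment is what demonstrates compliance.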

SME Proportionality Relief: Who Qualifies and What It Means

The second significant measure is the extension of the EU AI Act's existing SME proportionality framework to cover small mid-cap enterprises, or SMCs.[1]

Under current AI Act rules, SMEs already benefit from reduced documentation requirements, access to regulatory sandboxes free of charge, and lower conformity assessment fees. The Omnibus extends these provisions to companies with up to 750 employees and annual turnover up to €150 million — companies that fall between the classic SME definition and large enterprise.[1]

The practical benefit is not exemption from the underlying obligations. The SMC definition matters because it determines which fee structures, documentation simplification options, and sandbox access terms apply to your company. If you have been planning compliance costs on the assumption of large-enterprise documentation requirements and you qualify as an SMC under the new definition, you may be overestimating those costs.

For SMEs: check whether your headcount and turnover put you close to the SMC thresholds. If you are a growing company that has been scaling past classic SME definitions but still well below large-enterprise size, the Omnibus may bring you into scope for regulatory relief you were previously not eligible for.
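The threshold check itself is simple arithmetic. A minimal sketch, using the SMC ceilings from the agreement (750 employees, €150 million turnover) and the standard EU classic-SME definition (fewer than 250 employees, turnover up to €50 million) — note that the final Omnibus text may adjust these figures:

```python
def classify_enterprise(employees: int, turnover_eur: float) -> str:
    """Rough size classification under the thresholds discussed above.

    The SMC ceilings (750 staff / EUR 150M turnover) come from the
    current agreement language; the classic-SME ceiling is the standard
    EU definition and may not match the final Omnibus text exactly.
    """
    if employees < 250 and turnover_eur <= 50_000_000:
        return "SME"            # already covered by existing proportionality rules
    if employees <= 750 and turnover_eur <= 150_000_000:
        return "SMC"            # newly in scope for relief under the Omnibus
    return "large enterprise"   # full documentation and fee obligations apply

# Example: a scale-up with 400 staff and EUR 90M turnover
print(classify_enterprise(400, 90_000_000))  # SMC
```

If your numbers land near a boundary, remember that the official definitions also consider factors (such as balance-sheet totals and linked enterprises) that a simple check like this ignores.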

The Open Questions Trilogue Has Not Resolved Yet

Despite the convergence on these two measures, the Ropes & Gray analysis identifies three areas where the Parliament and Council remain genuinely divided heading into today's final trilogue:[1]

  • Defining what counts as a safety function. The Parliament has proposed that AI features intended solely for user assistance, performance optimisation, service efficiency, automation, or convenience should not be treated as safety functions unless their failure would create actual safety risks. Under this narrower definition, fewer AI systems would qualify as safety components of regulated products, and fewer would therefore be classified as high-risk. The Council has not proposed an equivalent limitation.
  • AI literacy obligations. The Council wants to replace the existing AI Act requirement with a non-binding encouragement model; the Parliament retains a binding duty on providers and deployers to support AI literacy among their staff. The outcome determines whether your staff training obligation remains mandatory or becomes voluntary.
  • Cyber Resilience Act mutual recognition. The Parliament proposed that AI systems meeting the Cyber Resilience Act's essential cybersecurity requirements should be presumed compliant with the AI Act's Article 15 robustness requirements. The Council did not include this provision. If your product is already CRA-compliant, this mutual recognition would simplify your AI Act documentation significantly.

What Today's Trilogue Outcome Means for Your Compliance

The deadline extension to December 2, 2027 for Annex III stand-alone high-risk systems and August 2, 2028 for Annex II embedded systems has not yet been published in the Official Journal.[3] Until it is, August 2, 2026 remains your legal deadline for Annex III obligations. Prudent compliance planning treats that date as binding regardless of what the trilogue produces today.

Here is what to watch for in the hours after the trilogue concludes:

  • Did the institutions reach political agreement? A confirmed deal means the legislative text is effectively finalised. If today's trilogue concludes without agreement, the process extends — and August 2, 2026 remains fully in force.
  • What does the final text say about the SMC definition? The employee and turnover ceilings in the current agreement language come from Commission Recommendation (EU) 2025/1099, not the AI Act itself. The Omnibus may formalise those ceilings or adjust them. Your compliance cost model may need recalibration.
  • Does the final text include the narrower safety-function definition? If it does, fewer AI features qualify as high-risk safety components. That is a meaningful reduction in compliance scope for a range of AI-enabled products.
  • What happened to AI literacy? A non-binding model means training is effectively optional. A binding model means you need a documented AI literacy programme for your staff by the relevant deadline.

The Bottom Line for SMEs

Two things are clear going into today's trilogue: the deadline extension is politically agreed and is coming, and the "nudifier" prohibition and SMC relief are substantive measures that will land in the final text regardless of what else happens.[1]

For SMEs building AI products or deploying AI in consequential contexts: audit your image and video generation capabilities against the "nudifier" prohibition now. Even if you have no intention of producing intimate imagery, the foreseeability test means you need to document that you assessed the risk and implemented guardrails. And review whether the SMC definition applies to you — if it does, the proportionality relief available to you may be broader than you assumed.

We will have a full analysis of the final trilogue outcome as soon as the agreement is confirmed — likely later today.

This article is for informational purposes only and does not constitute legal advice.

Know your EU AI Act risk level in 10 minutes

Our free audit walks you through the exact questions to classify your AI systems and identify what you need to do before August 2, 2026.

Start Free Audit →
