The EU AI Act's high-risk AI system obligations take effect on 2 August 2026. If your business develops, deploys, or provides AI systems in the EU that fall within a high-risk category, you must be compliant by that date.

Compliance is not something you can achieve overnight. Conformity assessments, technical documentation, and quality management systems take months to prepare. Businesses that start now will be well positioned; those that wait risk disruption to their EU market access.

This guide sets out the practical steps to prepare, in the order you should tackle them.

Which AI systems are high-risk?

Before you can prepare, you need to know which of your AI systems fall within the high-risk classification. The Act defines specific categories — review your entire AI portfolio against this list.

Pay particular attention to AI used in recruitment and employment (CV screening, performance monitoring, interview assessment), creditworthiness (loan decisions, insurance pricing), and access to essential services (healthcare triage, benefits eligibility). These are the categories most likely to affect UK businesses operating in the EU.

If none of your AI systems fall within a high-risk category, you do not need to prepare for these obligations — but you should document your classification analysis in case a regulator asks.

Preparation steps

Work through these steps for each AI system you have classified as high-risk. Allow 6 to 12 months for full preparation, depending on the complexity of your systems and the maturity of your existing governance processes.

  1. Classify your AI systems against EU AI Act risk categories

    Create a complete inventory of every AI system your business develops, deploys, or provides. For each system, determine which risk tier applies (prohibited, high-risk, limited-risk, or minimal-risk). Document the rationale for each classification decision. If a system sits close to the boundary between tiers, take the more cautious classification.

  2. Identify which systems are high-risk

    Cross-reference your AI inventory against the specific high-risk categories: employment and recruitment, creditworthiness, education, essential services, law enforcement, migration, and judicial proceedings. Include AI embedded in products covered by EU product safety legislation (medical devices, machinery, toys, vehicles).

  3. Conduct a conformity assessment for each high-risk system

    Determine which conformity assessment route applies. Some high-risk categories allow self-assessment; others (such as biometric identification and critical infrastructure) require third-party assessment by a notified body. Begin the assessment process early — notified body capacity may be limited as the deadline approaches.

  4. Prepare technical documentation

    For each high-risk system, create and maintain detailed technical documentation covering: system design and architecture, training data and methodology, testing and validation results, performance metrics, known limitations, and instructions for use. This documentation must be kept up to date throughout the system's lifecycle.

  5. Implement a quality management system

    Establish a quality management system that covers: risk management processes, data governance procedures, design and development controls, post-market monitoring, incident reporting, and communication with competent authorities. If you already hold ISO 9001 or similar certifications, map your existing processes against the EU AI Act requirements and fill any gaps.

  6. Set up post-market monitoring

    Design and implement a system to actively monitor how your high-risk AI performs after deployment. Track accuracy, fairness metrics, and adverse outcomes. Establish processes for responding to performance degradation, including pausing or withdrawing the system if necessary. Report serious incidents to the relevant market surveillance authority.

  7. Appoint an authorised representative in the EU if needed

    UK businesses that are not established in the EU must appoint an authorised representative based in an EU member state before placing a high-risk AI system on the EU market. The representative acts as your compliance contact for national authorities. Choose a representative with experience in AI or product regulation.
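The inventory and classification exercise in steps 1 and 2 can be sketched as a simple data structure. This is a minimal illustration, not a legal tool: the tier names follow the Act's four-tier scheme and the category labels mirror the high-risk areas listed above, but the field names and matching logic are assumptions for the sake of the example.

```python
from dataclasses import dataclass
from enum import Enum

class RiskTier(Enum):
    """The four risk tiers defined by the EU AI Act."""
    PROHIBITED = "prohibited"
    HIGH = "high-risk"
    LIMITED = "limited-risk"
    MINIMAL = "minimal-risk"

# Illustrative subset of the high-risk categories named in this guide;
# always check the full list in the legislation itself.
HIGH_RISK_CATEGORIES = {
    "employment",        # CV screening, performance monitoring
    "creditworthiness",  # loan decisions, insurance pricing
    "education",
    "essential-services",
    "law-enforcement",
    "migration",
    "judicial",
}

@dataclass
class AISystem:
    """One row of the AI inventory (step 1)."""
    name: str
    use_case: str   # category label assigned during the portfolio review
    rationale: str  # documented reasoning for the classification decision
    tier: RiskTier = RiskTier.MINIMAL

def classify(system: AISystem) -> AISystem:
    """Assign a risk tier based on the system's use-case category."""
    if system.use_case in HIGH_RISK_CATEGORIES:
        system.tier = RiskTier.HIGH
    return system

# Build the inventory, then filter to the high-risk subset (step 2).
inventory = [
    classify(AISystem("cv-screener", "employment", "Screens job applicants")),
    classify(AISystem("faq-chatbot", "customer-support", "General FAQ assistant")),
]
high_risk = [s for s in inventory if s.tier is RiskTier.HIGH]
```

In practice the rationale field matters as much as the tier itself: it is the documented analysis you would show a regulator, including for systems classified as out of scope.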

Timeline context

Understanding the full implementation timeline helps you prioritise your preparation activities. Some obligations are already in force; others are approaching.

The high-risk deadline of 2 August 2026 is the most significant for most UK businesses. However, if you also provide general-purpose AI models in the EU, note that those obligations have been in force since August 2025.

The final phase in August 2027 covers AI embedded in products regulated under existing EU product safety legislation. If your AI is embedded in machinery, medical devices, toys, or vehicles, you have slightly more time — but the conformity assessment requirements for these products are typically more demanding.
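The staggered deadlines above can be kept in one place as a simple lookup, which helps when prioritising preparation work across systems. The dates below reflect those stated in this guide; the function name and structure are illustrative.

```python
from datetime import date

# Key EU AI Act milestones referenced in this guide.
MILESTONES = {
    date(2025, 8, 2): "General-purpose AI model obligations (already in force)",
    date(2026, 8, 2): "High-risk AI system obligations",
    date(2027, 8, 2): "AI embedded in products under EU product safety legislation",
}

def next_deadline(today: date) -> tuple[date, str]:
    """Return the earliest milestone falling on or after `today`."""
    upcoming = {d: label for d, label in MILESTONES.items() if d >= today}
    earliest = min(upcoming)
    return earliest, upcoming[earliest]
```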

Penalties for non-compliance

The financial consequences of missing the deadline are substantial.

⚠️ Penalties are calculated on global turnover

EU AI Act fines are based on worldwide annual turnover, not just EU revenue. A UK business with significant global turnover but modest EU sales could still face very large fines. For high-risk non-compliance, the maximum is 15 million EUR or 3% of global turnover — whichever is higher. SMEs and startups benefit from reduced caps, but even these can be material. Non-compliance also risks losing EU market access entirely, as authorities can order withdrawal of non-compliant AI systems.
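The "whichever is higher" rule can be illustrated with a short calculation. The turnover figures are hypothetical, and the function ignores the reduced caps available to SMEs and startups.

```python
def max_fine_eur(global_turnover_eur: float) -> float:
    """Maximum fine for high-risk non-compliance: the higher of
    EUR 15 million or 3% of worldwide annual turnover."""
    return max(15_000_000, 0.03 * global_turnover_eur)

# EUR 2bn global turnover: 3% is EUR 60m, above the EUR 15m floor.
print(max_fine_eur(2_000_000_000))  # 60000000.0
# EUR 100m turnover: 3% is only EUR 3m, so the EUR 15m floor applies.
print(max_fine_eur(100_000_000))    # 15000000
```

Note that the percentage is applied to worldwide turnover, so a business with modest EU sales but large global revenue still faces the larger figure.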

What to do now

If you have not started preparing, begin with the AI system inventory and classification exercise. This is the foundation for everything else — you cannot prepare for compliance until you know which of your systems are in scope.

If you have already classified your systems, focus on the conformity assessment route and technical documentation. These are the most time-consuming elements and the most common cause of delays.

Consider engaging specialist legal and technical advisers with EU AI Act experience, particularly for third-party conformity assessments where notified body capacity may become constrained as the August 2026 deadline approaches.