The Financial Conduct Authority (FCA) does not have AI-specific rules. Instead, it regulates AI through its existing supervisory frameworks: Consumer Duty, the Senior Managers and Certification Regime (SM&CR), and operational resilience requirements. For FCA-regulated firms, this means AI governance is not a separate compliance workstream but an extension of obligations you already have.

This approach has important practical consequences. There is no single FCA rulebook chapter to consult. Instead, you must map AI usage across multiple regulatory frameworks and ensure that each deployment meets the standards expected under whichever framework applies. The FCA has made clear through speeches, Dear CEO letters, and its AI Update (published April 2024) that it expects firms to manage AI risk proactively, and that existing rules are sufficient to hold firms accountable for AI failures.

Why financial services AI is different

Financial services firms face a more demanding regulatory environment for AI than most other sectors, for several reasons:

  • Consumer Duty (July 2023): Requires firms to deliver good outcomes for retail customers. If an AI model produces poor outcomes, even unintentionally, the firm risks breaching the Duty
  • SM&CR accountability: A named senior manager must be accountable for AI governance. There is no hiding behind the technology
  • Prudential requirements: AI models used for credit decisions, capital calculations, or risk management engage PRA expectations on model risk
  • FOS complaints: Customers harmed by AI decisions can complain to the Financial Ombudsman Service, which can award compensation

The FCA's position, articulated by its Chief Data, Information and Intelligence Officer, is that firms should be innovating with AI but must do so within robust governance frameworks. The regulator is not anti-AI; it is anti-ungoverned AI.

FCA AI governance frameworks

The FCA expects firms to govern AI through three interconnected frameworks. Each imposes specific requirements on how AI is developed, deployed, monitored, and retired.
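One practical way to operationalise this mapping is an AI model register that records, for each deployment, which frameworks apply and which senior manager is accountable. The sketch below is purely illustrative: the field names, framework labels, and SMF designation are assumptions, not FCA-mandated terminology.

```python
from dataclasses import dataclass, field

@dataclass
class AIModelRecord:
    """One illustrative entry in a firm's AI model register."""
    name: str
    use_case: str
    accountable_smf: str                              # named senior manager (SM&CR)
    frameworks: list = field(default_factory=list)    # e.g. "Consumer Duty"
    customer_facing: bool = False
    last_reviewed: str = ""                           # ISO date of last governance review

    def governance_gaps(self) -> list:
        """Flag basic gaps a periodic governance review would catch."""
        gaps = []
        if not self.accountable_smf:
            gaps.append("no accountable senior manager assigned")
        if self.customer_facing and "Consumer Duty" not in self.frameworks:
            gaps.append("customer-facing model not mapped to Consumer Duty")
        if not self.last_reviewed:
            gaps.append("no recorded governance review")
        return gaps

# Hypothetical example: a retail credit model missing its Consumer Duty mapping
record = AIModelRecord(
    name="retail-credit-score-v3",
    use_case="unsecured lending decisions",
    accountable_smf="SMF24 (Chief Operations)",
    frameworks=["SM&CR", "Operational resilience"],
    customer_facing=True,
    last_reviewed="2025-01-15",
)
print(record.governance_gaps())
# → ['customer-facing model not mapped to Consumer Duty']
```

The point of the structure is not the code itself but the discipline it represents: every AI deployment is tied to a named accountable individual and explicitly mapped to each applicable framework, so gaps surface before a supervisor or the Financial Ombudsman Service finds them.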

Data protection for AI in financial services

Financial services firms process large volumes of personal data through AI systems, including credit histories, transaction data, identity documents, and behavioural patterns. UK GDPR applies to all of this processing, and the ICO has specific expectations for AI in financial services, particularly around automated credit decisions and profiling.

Transparency and explainability

Transparency is a recurring theme across FCA, ICO, and PRA expectations. For financial services firms, the transparency obligation operates at multiple levels: explaining to customers how AI affects them, explaining to regulators how AI models work, and explaining to internal governance bodies how models are performing.

The Mills Review and future regulation

In November 2024, the government commissioned Dame Elizabeth Mills to conduct an independent review of AI regulation in financial services. The Mills Review is expected to report in 2026 and may recommend sector-specific AI rules that go beyond the current framework-based approach.

While the review is ongoing, the FCA has signalled that it will not wait for new legislation before taking enforcement action on AI. Firms should not treat the Mills Review as a reason to delay AI governance. The current frameworks (Consumer Duty, SM&CR, and operational resilience) already provide the FCA with sufficient powers to supervise AI use.

Preparing for the Mills Review

Firms that have robust AI governance in place now will be better positioned to adapt to whatever the Mills Review recommends. The review is likely to focus on:

  • Model risk management standards for AI
  • Explainability requirements for customer-facing AI
  • Third-party AI model oversight (including foundation models)
  • Cross-regulator coordination between FCA, PRA, and ICO

Practical steps for FCA-regulated firms

These actions address specific FCA expectations. They are ordered by priority, starting with governance accountability and moving through to ongoing monitoring.