Edition: February 2026
Note: As of February 2026, the delay to the EU AI Act's Article 6 classification guidance, combined with the FCA's new focus on Agentic AI via the Mills Review, suggests that firms should prioritise functional resilience over rigid classification. UK wealth managers should monitor the FCA's expected Summer 2026 report on Agentic AI to inform their governance approach, whilst separately ensuring EU AI Act compliance for cross-border operations where applicable.
This diagnostic identifies where your firm has regulatory accountability for AI use without the infrastructure to discharge it.
Who this is for: Chief Executive Officers, Chief Operating Officers, Heads of Compliance, Chief Risk Officers, and Chief Technology Officers at UK wealth management firms.
Time required: 5 minutes for the initial assessment
What you will learn:
Gaps between your regulatory obligations under SM&CR, Consumer Duty, the EU AI Act, and DORA and your current capabilities
Specific exposure points where lack of oversight creates regulatory risk
Priority areas requiring immediate action versus longer term infrastructure build
How to use this diagnostic:
Read each question carefully
Score honestly: Yes (1 point), Partial (0.5 points), No (0 points)
Calculate section and total scores (a worked scoring sketch follows these steps)
Use the interpretation guide to determine action priorities
Share results with your Senior Management Function holders under SM&CR
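The scoring arithmetic above can be illustrated with a minimal sketch. The section names, answers, and helper names below are hypothetical examples, and the interpretation bands come from the guide accompanying the full diagnostic, so treat this as an illustration rather than part of the assessment itself.

```python
# Minimal scoring sketch: Yes = 1, Partial = 0.5, No = 0.
# Section names and answers below are hypothetical examples.

ANSWER_POINTS = {"yes": 1.0, "partial": 0.5, "no": 0.0}

def section_score(answers):
    """Sum the points for one section's answers."""
    return sum(ANSWER_POINTS[a.lower()] for a in answers)

responses = {
    "Accountability (SM&CR)": ["yes", "partial", "no"],
    "Consumer Duty outcomes": ["partial", "partial", "yes"],
    "Third-party AI oversight": ["no", "no", "partial"],
}

section_scores = {name: section_score(answers) for name, answers in responses.items()}
total = sum(section_scores.values())
maximum = sum(len(answers) for answers in responses.values())

for name, score in section_scores.items():
    print(f"{name}: {score} / {len(responses[name])}")
print(f"Total: {total} / {maximum}")  # Compare the total against the interpretation guide's bands.
```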
Your firm uses AI daily in platforms like FNZ, Avaloq, portfolio management systems, and client communication tools. Platform providers train staff to use AI features but cannot provide governance frameworks due to conflicts of interest. That responsibility falls to you. This assessment reveals where you're exposed.
Under SM&CR, specific Senior Managers must be accountable for AI governance oversight. The FCA expects demonstrable oversight capability.
Consumer Duty requires firms to act to deliver good outcomes for retail customers. AI systems that influence advice, suitability, or customer service fall within scope.
The FCA expects firms using AI to maintain governance, ensure explainability, manage bias, and maintain human oversight.
The EU AI Act applies extraterritorially to UK wealth managers under Article 2 if any of the following apply (a simple scope-check sketch follows this list):
You are a deployer with operations in the EU
Your firm has establishments in EU member states
You deploy AI systems within the EU
The outputs of AI systems are used in the EU
You serve EU-resident customers
AI-generated advice, suitability assessments, or portfolio recommendations are used by persons located in the EU
Client communications or reports produced by AI are delivered to EU customers
This applies regardless of where the AI provider is located, where the AI system was developed, or where the data was processed. What matters is where the AI output is used, not where the vendor is based.
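As a rough illustration only, the Article 2 triggers above can be treated as an any-of check. The structure and field names below are assumptions introduced for this sketch, not terms from the Act, and a real scoping exercise should be carried out with legal advice.

```python
# Hypothetical, simplified scope check mirroring the extraterritorial triggers listed above.
# Field names are illustrative; this shows the decision logic, not a legal determination.
from dataclasses import dataclass

@dataclass
class EUFootprint:
    operates_in_eu: bool              # operations or establishments in EU member states
    deploys_ai_in_eu: bool            # AI systems deployed within the EU
    outputs_used_in_eu: bool          # AI outputs (advice, reports, recommendations) used in the EU
    serves_eu_resident_clients: bool  # EU-resident customers receive AI-produced communications

def eu_ai_act_may_apply(footprint: EUFootprint) -> bool:
    """Return True if any extraterritorial trigger above is present."""
    return any([
        footprint.operates_in_eu,
        footprint.deploys_ai_in_eu,
        footprint.outputs_used_in_eu,
        footprint.serves_eu_resident_clients,
    ])

# Example: a UK-only firm whose AI-generated reports reach EU-resident clients.
print(eu_ai_act_may_apply(EUFootprint(False, False, True, True)))  # True
```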
Credit scoring, suitability assessment, fraud detection, and customer profiling may be classified as high-risk under the Act. Requirements for high-risk systems include:
Risk assessment and mitigation
Data governance and quality standards
Technical documentation and record keeping
Transparency and provision of information to users
Human oversight
Accuracy, robustness, and cybersecurity standards
If your firm procures and uses AI systems under its own authority, you are a "deployer" with obligations to:
Verify AI systems meet EU AI Act requirements
Monitor AI system performance
Maintain logs of AI system use (a log-record sketch follows below)
Report serious incidents
Conduct fundamental rights impact assessments for high-risk systems
If platform providers fail to meet their EU AI Act obligations, your liability as a deployer is not eliminated.
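The record-keeping obligation above is easiest to discharge with a consistent log format. The Act does not prescribe a schema, so the fields and names below are assumptions offered as a starting point, not a compliant artefact.

```python
# Hypothetical structure for a deployer's AI use-log entry.
# Field names are illustrative; the EU AI Act does not mandate a specific schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIUseLogEntry:
    system_name: str                 # platform or model identifier
    provider: str                    # vendor responsible for the AI system
    use_case: str                    # e.g. "suitability assessment draft"
    human_reviewer: str              # who exercised human oversight
    outcome_summary: str             # how the output was used
    serious_incident: bool = False   # flag entries that may require regulator reporting
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

log: list[AIUseLogEntry] = []
log.append(AIUseLogEntry(
    system_name="portfolio-recommendation-engine",   # hypothetical system
    provider="ExamplePlatform Ltd",                   # hypothetical vendor
    use_case="draft rebalancing recommendation",
    human_reviewer="Senior adviser, ID 1042",
    outcome_summary="Output reviewed and amended before client delivery",
))
```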
Non-compliance can attract fines of up to €35 million or 7% of global annual turnover, whichever is greater. For example, a firm with €1 billion in global annual turnover would face a maximum fine of €70 million.
February 2025: Prohibited AI practices banned
August 2025: General-purpose AI obligations begin
August 2026: High-risk AI system obligations fully applicable
August 2027: Obligations apply to high-risk AI systems embedded in regulated products
DORA applies to UK wealth managers with EU operations or that use EU-based ICT providers. Requirements include:
ICT risk management framework
Incident reporting for major ICT related incidents
Digital operational resilience testing
Third party ICT service provider risk management
Contractual provisions with ICT providers
AI platforms and tools are ICT systems under DORA. Wealth managers must:
Assess ICT risk in AI vendor relationships (a vendor-register sketch follows this section)
Include DORA compliant clauses in AI platform contracts
Test operational resilience of AI systems
Report major AI-related incidents to regulators
If your AI platform providers are designated as critical ICT third-party service providers under DORA, enhanced oversight and contractual terms are mandatory.
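One way to keep the obligations above visible is a simple vendor register. DORA prescribes the outcomes (risk assessment, contractual provisions, resilience testing, incident reporting), not this format, so the record and field names below are assumptions for illustration.

```python
# Hypothetical register entry tracking DORA-relevant controls for an AI platform vendor.
from dataclasses import dataclass

@dataclass
class AIVendorICTRecord:
    vendor: str
    ict_risk_assessed: bool          # ICT risk assessment completed for this relationship
    dora_clauses_in_contract: bool   # required contractual provisions in place
    resilience_tested: bool          # operational resilience testing evidenced
    designated_critical: bool        # provider designated as critical under DORA
    open_major_incidents: int        # major AI/ICT incidents awaiting regulator reporting

def gaps(record: AIVendorICTRecord) -> list[str]:
    """List outstanding controls for a vendor record."""
    issues = []
    if not record.ict_risk_assessed:
        issues.append("ICT risk assessment outstanding")
    if not record.dora_clauses_in_contract:
        issues.append("contract lacks DORA provisions")
    if not record.resilience_tested:
        issues.append("no resilience testing evidence")
    if record.designated_critical and not record.dora_clauses_in_contract:
        issues.append("critical provider without enhanced contractual terms")
    return issues

print(gaps(AIVendorICTRecord("ExamplePlatform Ltd", True, False, True, True, 0)))
```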
UK wealth managers face three overlapping regimes:
UK regulation (SM&CR, Consumer Duty, FCA expectations)
EU AI Act (if outputs used in EU or firm operates in EU)
DORA (if using EU ICT providers or operating in EU)
Your governance infrastructure must address all three. Gaps in any area create regulatory exposure.
Book a free 15-minute consultation with our regulatory compliance experts