    High-Risk • Annex III

    EU AI Act for Fintech & Financial Services

    Credit scoring and insurance AI are high-risk under Annex III. Financial services firms deploying AI for lending, underwriting, or customer decisions face significant compliance obligations.

    GDPR Compliant • SOC 2 Type II • ISO 27001

    Credit & Insurance AI is High-Risk

    The EU AI Act explicitly lists AI used to "evaluate the creditworthiness of natural persons" and AI for "risk assessment and pricing in relation to life and health insurance" as high-risk under Annex III.

    Fintech AI Use Cases

    Credit Scoring & Lending Decisions

    High-Risk

    AI systems that evaluate creditworthiness or determine access to credit fall under the Annex III high-risk category.

    Key Obligations:

    • Article 26 deployer obligations
    • Human oversight with authority to override the AI's recommendation
    • Retention of automatically generated logs for at least six months (illustrated in the sketch below)
    • Transparency to applicants
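
    To make the logging and human-oversight duties above more concrete, here is a minimal, illustrative Python sketch of the kind of decision record a deployer might keep. The CreditDecisionRecord class, its field names, and the 183-day retention figure are assumptions for illustration only; Article 26(6) requires keeping automatically generated logs for at least six months, and an actual schema will depend on your systems and legal advice.

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime, timedelta, timezone

    @dataclass
    class CreditDecisionRecord:
        """Illustrative log entry for one automated credit decision (hypothetical schema)."""
        applicant_ref: str      # pseudonymous reference, not raw personal data
        model_version: str      # which AI system/version produced the output
        ai_recommendation: str  # e.g. "decline"
        final_decision: str     # outcome after human review
        reviewed_by: str        # human overseer with authority to override
        overridden: bool        # True if the reviewer changed the AI recommendation
        decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

        @property
        def retain_until(self) -> datetime:
            # Article 26(6): keep automatically generated logs for at least six
            # months (longer if other EU or national law requires it).
            return self.decided_at + timedelta(days=183)  # ~6 months, illustrative only

    record = CreditDecisionRecord(
        applicant_ref="app-2026-00042",
        model_version="credit-scoring-v3.1",
        ai_recommendation="decline",
        final_decision="approve",
        reviewed_by="underwriter-17",
        overridden=True,
    )
    print(record.retain_until.isoformat())
    ```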

    Insurance Risk Assessment

    High-Risk

    AI systems used for insurance pricing, claims assessment, or underwriting decisions. Risk assessment and pricing for life and health insurance is explicitly listed in Annex III.

    Key Obligations:

    • Risk assessment documentation
    • Non-discrimination monitoring
    • Explainability requirements
    • Fundamental Rights Impact Assessment (FRIA) where Article 27 applies

    Fraud Detection

    Limited Risk

    AI systems used to detect financial fraud are expressly excluded from the Annex III creditworthiness category and are typically not high-risk unless they significantly affect customers' access to services.

    Key Obligations:

    • Transparency if customer-facing
    • Documentation of purpose
    • Monitoring for bias

    Customer Service Chatbots

    Limited Risk

    AI chatbots for customer support require transparency disclosure under Article 50.

    Key Obligations:

    • Disclose that the customer is interacting with AI (see the sketch below)
    • Mark AI-generated (synthetic) content
    • Make escalation to a human available
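
    As a rough illustration of the disclosure and escalation points above, the sketch below shows one way a chatbot handler might surface them. The handle_message and generate_reply functions and the disclosure wording are hypothetical; Article 50 requires that people are informed they are interacting with an AI system, but it does not prescribe exact phrasing or mechanics.

    ```python
    AI_DISCLOSURE = (
        "You are chatting with an automated assistant. "
        "Type 'agent' at any time to speak with a human."
    )

    def generate_reply(user_message: str) -> str:
        # Placeholder for the actual chatbot model call (hypothetical).
        return "Thanks for your question. A summary of your account options follows."

    def handle_message(user_message: str, first_turn: bool) -> str:
        # Honour the human-escalation path before anything else.
        if user_message.strip().lower() == "agent":
            return "Connecting you to a human support agent."
        reply = generate_reply(user_message)
        # Disclose the AI interaction up front (Article 50 transparency).
        return f"{AI_DISCLOSURE}\n\n{reply}" if first_turn else reply

    print(handle_message("What is my card limit?", first_turn=True))
    ```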

    Frequently Asked Questions

    Is credit scoring AI high-risk under the EU AI Act?

    Yes. AI systems used to evaluate creditworthiness or determine access to credit are explicitly listed in Annex III as high-risk, requiring full compliance with deployer obligations under Article 26.

    What about robo-advisors and wealth management AI?

    Robo-advisors providing investment recommendations are typically limited risk unless they make autonomous decisions affecting access to essential services. Transparency obligations apply for AI interactions.

    How does DORA interact with the EU AI Act for fintech?

    DORA (Digital Operational Resilience Act) and the EU AI Act are complementary. DORA focuses on ICT risk management while the AI Act addresses AI-specific risks. Financial services firms need to comply with both.

    When do these obligations apply to fintech companies?

    Most obligations apply from 2 August 2026, but the bans on prohibited practices and the AI literacy requirements have applied since 2 February 2025. Firms deploying credit scoring or insurance AI should start preparing now.

    Prepare Your Fintech for Compliance

    Klarvo helps financial services companies classify, document, and evidence AI compliance.

    No credit card
    14-day trial
    Cancel anytime