
Supporting DORA with local AI

CXO GeschÀftsprozesse

Learn how secure on-premise artificial intelligence and the CLUE AI process platform can be used to automatically monitor relevant web sources, process findings transparently, and document approvals in compliance with DORA, all while maintaining full GDPR compliance.

What is DORA?
In short: DORA stands for Digital Operational Resilience Act, Regulation (EU) 2022/2554 on digital operational resilience in the financial sector. It obliges financial institutions and supervised ICT service providers to implement robust ICT risk management, incident reporting, resilience testing (including TLPT), third-party/cloud controls, and information exchange. DORA has been applicable EU-wide since January 17, 2025.

In our current guide, we highlight topics and solutions related to DORA-compliant on-premise AI for service providers.

Why “Secure AI” is especially important for service providers

For service companies with DORA-related services (e.g., IT services for financial clients, compliance-related consulting), "secure AI" means on-premise operation in their own data center, clear roles and approvals (four-eyes principle), and technical and organizational measures to comply with the GDPR.
A complete audit trail and strict data minimization support the expansion of ICT resilience through transparent monitoring, logging and traceability.

AI can significantly reduce the resources required for DORA- and GDPR-related tasks:

  • Web source monitoring: industry/regulatory websites, supervisory authorities, supplier security notes, trade media.
  • Documented processing: Automated processing (RAG), structured justification, versioning, history recording, approvals.
  • LLM optimization: Guardrails, prompt governance, domain adapters – all operated locally.
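
To make the web-source monitoring from the first bullet more concrete, here is a minimal sketch of how a set of monitored sources could be polled for changes before new content is handed to the local RAG pipeline. All names and URLs are hypothetical placeholders, not CLUE's actual configuration or API; it assumes the `requests` package is available.

```python
# Minimal sketch (hypothetical, not CLUE's API): poll monitored web sources and
# flag changed pages for downstream local RAG processing. URLs are placeholders.
import hashlib
from dataclasses import dataclass

import requests  # third-party HTTP client, assumed to be installed


@dataclass
class MonitoredSource:
    name: str
    url: str
    last_hash: str = ""  # SHA-256 of the last fetched content


SOURCES = [
    MonitoredSource("Regulator news", "https://example.org/regulator/news"),
    MonitoredSource("CERT advisories", "https://example.org/cert/advisories"),
]


def has_changed(source: MonitoredSource) -> bool:
    """Fetch the page and report whether its content changed since the last poll."""
    response = requests.get(source.url, timeout=30)
    response.raise_for_status()
    digest = hashlib.sha256(response.content).hexdigest()
    changed = digest != source.last_hash
    source.last_hash = digest
    return changed


for src in SOURCES:
    if has_changed(src):
        print(f"{src.name}: new or changed content, queue for local RAG analysis")
```

In a production setup, such polling would also respect the sources' terms of use and record evidence for each fetch, as discussed in the FAQ below.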

What are the legal requirements for companies in the European Union?

Legal framework 2025/26 in brief: DORA, AI Act & GDPR

The legal requirements for the safe use of AI are comprehensively regulated in the EU:

DORA (EU 2022/2554) sets the standard for digital resilience in the financial sector; its requirements apply from January 17, 2025. [1][3] The AI Act has been in force since August 1, 2024; prohibited practices apply from February 2, 2025, and obligations for general-purpose AI follow in stages (including from August 2, 2025). [4] For the GDPR, Articles 6 (legal bases) and 35 (data protection impact assessment, DPIA) remain central, especially regarding the systematic monitoring of publicly accessible sources or sensitive data. [5][6]

A good approach includes:

  • Choose a legal basis: Article 6 GDPR, such as legal obligation/public interest or legitimate interest with balancing of interests. [5]
  • Consider a DPIA (Article 35 GDPR): if scale, sensitivity, or systematic monitoring suggests a high risk, carry one out and document the risk-mitigation measures. [6]
  • Establish AI Act guardrails company-wide: Avoid prohibited practices, such as biometric categorization of sensitive characteristics and indiscriminate web scraping for resale. [7]

The BaFin information page on DORA documentation requirements is also a helpful resource (see source [3] below).

What could AI support look like?

From web monitoring to documented processing – on-premise with CLUE

The basis for secure implementation is the digitalization of the entire process. Humans, as decision-makers, remain crucial for approvals and confirmations.

Here's how it works in practice: The solution is operated in the company's own data center, with EU cloud bursting available for peak loads if needed. CLUE handles the orchestration: the AI platform manages crawlers for defined web sources (e.g., regulatory authorities, CERT reports, supplier advisories, trade publications), performs LLM analyses locally, and combines both using Retrieval-Augmented Generation (RAG).

This process results in auditable dossiers with source references, key findings, and recommendations for action. Every processing step is versioned (who changed what and when), logged (full logging), and secured via a role and rights concept (e.g., creator → reviewer → approver using the four-eyes principle).
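
To illustrate the versioning and the four-eyes principle described above, here is a simplified sketch of a dossier that records every step with actor, role, version, and timestamp. The class and field names are hypothetical and do not reflect CLUE's actual data model.

```python
# Hypothetical sketch of a versioned four-eyes approval trail; not CLUE's data model.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AuditEntry:
    actor: str
    role: str          # "creator", "reviewer", or "approver"
    action: str
    version: int
    timestamp: str


@dataclass
class Dossier:
    title: str
    sources: list[str]
    version: int = 0
    trail: list[AuditEntry] = field(default_factory=list)

    def record(self, actor: str, role: str, action: str) -> None:
        """Version and log every processing step (who changed what and when)."""
        self.version += 1
        self.trail.append(AuditEntry(actor, role, action, self.version,
                                     datetime.now(timezone.utc).isoformat()))

    def approve(self, reviewer: str, approver: str) -> None:
        """Four-eyes step: reviewer and approver must be different people."""
        if reviewer == approver:
            raise ValueError("four-eyes principle violated: reviewer == approver")
        self.record(reviewer, "reviewer", "review completed")
        self.record(approver, "approver", "release approved")


d = Dossier("Regulatory update dossier", ["https://example.org/regulator/notice"])
d.record("a.creator", "creator", "dossier drafted from RAG findings")
d.approve("b.reviewer", "c.approver")
```

The key design point is that approval is impossible without two distinct roles, and every step lands in the same versioned trail that later feeds the audit export.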

The result: significantly less manual research, consistently fast processing, and quality-assured approvals that reliably meet audit requirements.

A good solution takes the following into account:

  1. Monitoring sources: government websites, CERT/CSIRT, regulatory authorities, supplier advisories, trade press.
  2. LLM optimization: Domain prompts, policies, red team checks, hallucination filter (RAG), citation requirements.
  3. Processing & Approval: Evidence collection, evaluation (impact, relevance), approval workflow with four-eyes step.
  4. Audit trail & export: RFC-compliant logs, SIEM/ITSM handover, audit-proof storage (retention periods, e.g., 24 months).
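
Regarding point 4, the export format is not prescribed; as one possible shape (an assumption for illustration, with hypothetical field names), audit entries could be handed to a SIEM as JSON log lines:

```python
# Hypothetical sketch: export audit-trail entries as JSON lines for SIEM/ITSM handover.
# Field names are illustrative assumptions, not a mandated export schema.
import json
from datetime import datetime, timezone

audit_entries = [
    {"dossier": "Regulatory update dossier", "version": 3,
     "actor": "c.approver", "role": "approver", "action": "release approved"},
]


def to_siem_line(entry: dict) -> str:
    """Serialize one audit entry as a single JSON log line with a UTC timestamp."""
    record = {"timestamp": datetime.now(timezone.utc).isoformat(), **entry}
    return json.dumps(record, ensure_ascii=False)


with open("audit_export.jsonl", "a", encoding="utf-8") as fh:
    for entry in audit_entries:
        fh.write(to_siem_line(entry) + "\n")
```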

What time-saving advantages does the described solution offer in the process flow?

Optimized processes: how to make efficiency gains sustainable

Expected improvements based on lead times:

  • Up to 60% time savings between monitoring and completed review,
  • up to 50% time savings to audit readiness through documentation that accompanies the process,
  • up to 40% fewer monitoring errors through LLM pre-filtering and RAG citations,
  • and 100% available, traceable documentation (versions, approvals).

What other topics related to DORA should be considered?

FAQ on key DORA topics

Keep the following questions in mind to competently meet the requirements:

Cloud vs. On-Prem for DORA?
For DORA-related services, on-premises solutions offer clear advantages (control, traceability, data sovereignty). EU cloud services can serve as an option for non-sensitive peak loads – governance and logging remain mandatory. [1][3]

Web scraping & source citation?
Use only legitimate sources, observe terms of use, save evidence (URL, date, excerpt) and cite in the dossier. For personal data: check the legal basis, consider a data protection impact assessment. [5][6]
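
A minimal sketch of such an evidence record follows, assuming a simple JSON-lines store; the function and field names are illustrative, not a required format:

```python
# Hypothetical sketch: store scraping evidence (URL, retrieval date, short excerpt)
# so every statement in a dossier can cite its source. Names are illustrative.
import json
from datetime import datetime, timezone


def save_evidence(url: str, excerpt: str, path: str = "evidence.jsonl") -> dict:
    """Append an evidence record with URL, UTC retrieval timestamp, and excerpt."""
    record = {
        "url": url,
        "retrieved_at": datetime.now(timezone.utc).isoformat(),
        "excerpt": excerpt[:500],  # keep only a short excerpt (data minimization)
    }
    with open(path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(record, ensure_ascii=False) + "\n")
    return record


save_evidence("https://example.org/regulator/notice",
              "Excerpt of the passage cited in the dossier ...")
```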

DPIA for AI processes: when is it required?
If the nature, scope or purposes are likely to result in a high risk (e.g., systematic monitoring), a DPIA is required. [6]

Roles & approvals (four-eyes principle)?
Separate creation, review, and approval. Document every decision with justification and version – this facilitates DORA and GDPR compliance. [1][3][5]

Note: In Austria, Section 120 of the Criminal Code (StGB) applies to audio recordings (misuse of sound recording or listening devices). Secret recordings and unauthorized distribution can be punishable offenses; therefore, always obtain consent before recording. [8]

Conclusion

Secure on-premises AI is the most efficient way to practically meet DORA requirements while complying with GDPR and the AI Act. Intelligent AI-powered solutions like CLUE consolidate web monitoring, documented processing, and approvals into an auditable process. This shifts work away from repetitive research and correction toward value-creating decisions, resulting in measurable efficiency gains.

Questions? We're happy to help! Call us at +43 1 997 28 34 or use our contact form below.
Together we will find the solution that has the greatest impact on your digital process.


Sources

We work transparently: these are some of the sources used for this blog post; our intelligent CLUE content assistant helped structure it:

  1. EUR-Lex: Regulation (EU) 2022/2554 – DORA
  2. WKO: Digital Operational Resilience Act (DORA) – dates/information
  3. BaFin: DORA – Overview & Level 2/3
  4. RTR: Timeline of the AI Act (entry into force & stages)
  5. EUR-Lex: GDPR (Regulation (EU) 2016/679) – Art. 6, Art. 35
  6. Article 35 GDPR (German version)
  7. WKO: AI Act overview (prohibited practices)
  8. RIS: Section 120 of the Criminal Code – Misuse of sound recording/listening devices
