In This Guide
- Introduction
- Scope and Objectives
- Gap Assessment Methodology
- Pillar 1: ICT Risk Management
- Pillar 2: Incident Reporting
- Pillar 3: Resilience Testing
- Pillar 4: Third-Party Risk
- Pillar 5: Information Sharing
- Scoring and Rating Model
- Building the Remediation Plan
- Compiling the Evidence Pack
- Pillar-by-Pillar Evidence Matrix
- Common Gaps and Pitfalls
- Supervisory Engagement Tips
- FAQ
Key Takeaways
- A structured DORA readiness assessment must cover all five pillars: ICT risk management, incident reporting, resilience testing, third-party risk, and information sharing.
- Use a maturity-based scoring model (1-5) to objectively quantify gaps and benchmark progress over time.
- The evidence pack is the primary artefact for demonstrating compliance to competent authorities during supervisory reviews.
- Remediation should be risk-prioritised, with critical gaps in incident reporting and third-party contracts addressed first given January 2025 enforcement.
- Organisations with existing ISO 27001 or NIS2 frameworks can accelerate readiness by mapping existing controls to DORA requirements.
Introduction
The Digital Operational Resilience Act (DORA) — Regulation (EU) 2022/2554 — establishes a comprehensive framework for ICT risk management across the European Union's financial sector. With enforcement from 17 January 2025, financial entities and their critical ICT third-party service providers must demonstrate compliance across five distinct pillars of operational resilience.
A DORA readiness assessment is the essential first step in that compliance journey. Unlike a simple checklist exercise, a meaningful readiness assessment provides an honest evaluation of your organisation's current posture, identifies gaps that need closing, and produces the evidence artefacts that competent authorities expect to see during supervisory engagement.
This guide walks you through the end-to-end DORA readiness process — from scoping and gap analysis methodology through to evidence pack compilation and supervisory preparation. Whether you are a Chief Information Security Officer (CISO), Head of Operational Resilience, or compliance lead within a financial entity, this guide provides a practical, actionable framework.
This assessment methodology is designed for EU-regulated financial entities including credit institutions, investment firms, insurance undertakings, payment institutions, and electronic money institutions. It is equally applicable to critical ICT third-party service providers (CTPPs) seeking to demonstrate DORA readiness to their financial entity clients.
Step 1: Define Scope and Objectives
Before assessing readiness, you must clearly define what is in scope and what the assessment is intended to achieve. DORA applies at the entity level, but the practical assessment must be scoped to cover all critical and important functions supported by ICT systems.
1.1 Identify In-Scope Entities
If your organisation operates as a group, determine which entities fall within DORA's scope. Article 2 of DORA identifies 21 categories of financial entities, from credit institutions and investment firms to crypto-asset service providers and crowdfunding platforms. Each in-scope entity requires its own assessment, though group-level frameworks can be shared.
- Map all legal entities within your group against the 21 DORA entity categories
- Identify entities that qualify for the simplified ICT risk management framework (Article 16) due to micro-enterprise status
- Document third-country branches and subsidiaries that serve EU clients
- Clarify the proportionality principle as it applies to each entity based on size, risk profile, and complexity of ICT services
1.2 Map Critical and Important Functions
DORA's requirements scale based on the criticality of functions. Your readiness assessment must identify which business functions are "critical or important" under Article 3(22), as these attract more stringent requirements across all five pillars.
- Inventory all business functions and classify them as critical, important, or supporting
- Map ICT systems and services that underpin each critical or important function
- Identify ICT third-party service providers supporting critical or important functions
- Document interdependencies between functions and ICT systems
- Align classification with existing business continuity analysis where available
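The mapping exercise above lends itself to a simple, queryable structure. The sketch below is one illustrative way to capture it; every function, system, and provider name is hypothetical, not drawn from any real register:

```python
# Illustrative critical-function map linking business functions to the
# ICT systems and third-party providers that support them.

function_map = [
    {
        "function": "Retail payments processing",
        "classification": "critical",      # critical / important / supporting
        "ict_systems": ["core-banking", "payments-gateway"],
        "third_party_providers": ["CloudHost Ltd"],  # hypothetical provider
        "depends_on": ["Customer authentication"],
    },
    {
        "function": "Customer authentication",
        "classification": "important",
        "ict_systems": ["iam-platform"],
        "third_party_providers": [],
        "depends_on": [],
    },
]

def providers_for_critical_functions(fmap):
    """Return the providers supporting critical or important functions."""
    return {
        provider
        for entry in fmap
        if entry["classification"] in ("critical", "important")
        for provider in entry["third_party_providers"]
    }

print(providers_for_critical_functions(function_map))  # {'CloudHost Ltd'}
```

A structure like this directly feeds Pillar 4 work later: the providers surfaced here are the ones whose contracts need Article 30 clauses and whose entries belong in the Register of Information.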
1.3 Set Assessment Objectives
Define what the assessment should deliver. Typical objectives include:
- Establish a baseline maturity score across all five DORA pillars
- Identify critical gaps that require immediate remediation before enforcement
- Produce a prioritised remediation roadmap with estimated effort and ownership
- Compile an initial evidence pack for supervisory engagement
- Benchmark against industry peers where possible
Step 2: Gap Assessment Methodology
A rigorous gap assessment requires a structured methodology that can be applied consistently across all five pillars. The methodology should produce comparable, quantifiable results that inform remediation prioritisation.
2.1 Assessment Framework Structure
We recommend structuring the assessment into control domains aligned with DORA's five pillars, then decomposing each pillar into specific requirements drawn from the Regulation text, associated Regulatory Technical Standards (RTS), and Implementing Technical Standards (ITS). For each requirement, assess three dimensions:
- Policy and Documentation: Are policies, procedures, and standards in place that address the requirement?
- Implementation and Operation: Are the documented controls actually implemented and functioning in practice?
- Evidence and Monitoring: Can the organisation produce evidence of ongoing compliance, and are monitoring mechanisms in place?
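One way to roll the three dimensions into a single requirement score is to take the weakest dimension, on the logic that an unimplemented policy or unevidenced control caps overall compliance. This is a convention we suggest, not something DORA prescribes; a weighted average is an equally defensible choice:

```python
def requirement_score(policy, implementation, evidence):
    """Rate one requirement across the three assessment dimensions
    (each scored 1-5) and take the weakest as the overall score.
    Using min() is an assumed convention, not a regulatory rule."""
    return min(policy, implementation, evidence)

# Well-documented control that lacks monitoring evidence scores low overall.
print(requirement_score(4, 3, 2))  # 2
```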
2.2 Data Collection Methods
Effective assessment requires multiple data sources to triangulate findings:
- Document Review: Policies, procedures, risk registers, contracts, audit reports, board minutes, incident logs
- Stakeholder Interviews: CISO, CIO, Head of Operational Risk, Head of Procurement, Business Continuity Manager, Legal Counsel
- Technical Validation: Configuration reviews, architecture diagrams, test results, monitoring tool outputs
- Control Testing: Sampling of key controls to verify effective operation (e.g., incident response tabletop exercises, backup restoration tests)
2.3 Assessment Timeline
A typical DORA readiness assessment follows this timeline:
| Phase | Duration | Activities |
|---|---|---|
| Planning & Scoping | 1-2 weeks | Scope definition, stakeholder identification, document request |
| Document Review | 2-3 weeks | Review policies, procedures, contracts, risk registers |
| Interviews & Validation | 2-3 weeks | Stakeholder interviews, technical validation, control testing |
| Analysis & Scoring | 1-2 weeks | Gap analysis, maturity scoring, findings consolidation |
| Reporting & Roadmap | 1-2 weeks | Readiness report, remediation plan, evidence pack structure |
Step 3: Pillar 1 — ICT Risk Management Assessment
Pillar 1 (Articles 5-16) is the most extensive pillar and forms the foundation of DORA compliance. The assessment must cover the ICT risk management framework, governance, identification, protection, detection, response, recovery, and learning processes.
3.1 Governance and Organisation
Assess whether the management body fulfils its responsibilities under Article 5:
- Does the management body define, approve, and oversee the ICT risk management framework?
- Is there a dedicated ICT risk management function with sufficient authority, independence, and resources?
- Are roles and responsibilities clearly defined for ICT risk management at all organisational levels?
- Does the management body receive regular reporting on ICT risk posture and incident trends?
- Is ICT risk integrated into the entity's overall risk management framework?
3.2 ICT Risk Management Framework
Evaluate the documented framework against Article 6 requirements:
- Is there a documented ICT risk management framework that includes strategies, policies, procedures, and tools?
- Does the framework address all ICT assets, including those managed by third parties?
- Is the framework reviewed at least annually and after significant ICT incidents?
- Is there a formal ICT risk appetite statement approved by the management body?
- Does the framework address the simplified requirements under Article 16 for eligible entities?
3.3 Identification
Article 8 requires comprehensive identification of ICT-supported business functions, assets, and dependencies:
- Is there an up-to-date inventory of ICT assets, including hardware, software, network components, and data?
- Are ICT assets mapped to the business functions they support?
- Are dependencies and interconnections between ICT systems documented?
- Is the inventory updated whenever significant changes occur?
- Are ICT assets supporting critical or important functions specifically identified?
3.4 Protection, Prevention, Detection, Response, and Recovery
Assess controls across Articles 9-14 covering the full lifecycle:
- Protection & Prevention (Art. 9): Access controls, encryption, network segmentation, patch management, secure development practices
- Detection (Art. 10): Anomaly detection, monitoring capabilities, logging and alerting, threat intelligence integration
- Response & Recovery (Art. 11): Incident response procedures, business continuity plans, disaster recovery, backup and restoration testing
- Learning & Evolving (Art. 13): Post-incident reviews, lessons learned process, framework updates, threat landscape monitoring
- Communication (Art. 14): Crisis communication plans, stakeholder notification, regulatory reporting channels
Step 4: Pillar 2 — Incident Reporting Assessment
Pillar 2 (Articles 17-23) requires financial entities to establish and implement processes for detecting, managing, classifying, and reporting ICT-related incidents.
4.1 Incident Management Process
- Is there a documented ICT incident management process covering detection, triage, escalation, and resolution?
- Are roles and responsibilities defined for incident handlers, coordinators, and senior management escalation?
- Are incident classification criteria aligned with DORA's severity thresholds (impact on clients, financial loss, duration, geographical spread)?
- Can the organisation distinguish between ICT-related incidents and major ICT-related incidents?
4.2 Reporting Capability
- Can the organisation submit an initial notification within 4 hours of classifying an incident as major, and no later than 24 hours after becoming aware of it?
- Are intermediate report procedures in place (within 72 hours of the initial notification)?
- Is there a final report process (within 1 month of incident resolution)?
- Are reporting templates aligned with the ITS on content, timelines, and format?
- Has the organisation mapped its reporting channels to the relevant competent authority?
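The reporting clocks above can be made concrete for incident-response runbooks. The sketch below encodes one interpretation of the anchor points (4 hours from classification or 24 hours from awareness, whichever is earlier, then 72 hours to the intermediate report); verify the anchors against the final RTS/ITS text before relying on them:

```python
from datetime import datetime, timedelta

def reporting_deadlines(classified_at, aware_at):
    """Compute major-incident reporting deadlines.

    Anchor points are an interpretation of the 4h/24h/72h windows
    described in this guide; confirm against the applicable RTS/ITS.
    """
    initial = min(classified_at + timedelta(hours=4),
                  aware_at + timedelta(hours=24))
    intermediate = initial + timedelta(hours=72)
    return {"initial_notification": initial,
            "intermediate_report": intermediate}

aware = datetime(2025, 3, 1, 9, 0)        # SOC becomes aware
classified = datetime(2025, 3, 1, 13, 30)  # classified as major
deadlines = reporting_deadlines(classified, aware)
print(deadlines["initial_notification"])  # 2025-03-01 17:30:00
```

Embedding deadline computation in incident tooling, rather than relying on handlers to count hours under pressure, is one of the simpler ways to de-risk the 4-hour window.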
4.3 Voluntary Notifications
- Is there a process for voluntary notification of significant cyber threats?
- Are criteria defined for when a cyber threat is considered significant enough to report?
- Is there awareness of NIS2 cross-reporting obligations where applicable?
Step 5: Pillar 3 — Resilience Testing Assessment
Pillar 3 (Articles 24-27) requires all in-scope financial entities to maintain a digital operational resilience testing programme. Certain entities must also conduct advanced threat-led penetration testing (TLPT).
5.1 Basic Testing Programme
- Is there a documented digital operational resilience testing programme?
- Does the programme include: vulnerability assessments and scans, open-source analysis, network security assessments, gap analysis, physical security reviews, questionnaire-based assessments, source code reviews, scenario-based tests, compatibility testing, performance testing, end-to-end testing, and penetration testing?
- Are critical ICT systems tested at least annually?
- Are test results documented and reported to the management body?
- Are identified vulnerabilities remediated within defined timelines?
5.2 Advanced Testing — TLPT
Assess whether TLPT is required and, if so, readiness:
- Does the entity meet the criteria for mandatory TLPT (e.g., systemically important credit institutions, central counterparties)?
- Is there a TLPT programme aligned with TIBER-EU framework requirements?
- Has the entity identified qualified external threat intelligence and red team providers?
- Are scoping and targeting processes defined for TLPT exercises?
- Are purple-team testing and remediation validation included in the programme?
- Is pooled testing considered for entities sharing common ICT infrastructure?
Step 6: Pillar 4 — Third-Party Risk Assessment
Pillar 4 (Articles 28-44) establishes comprehensive requirements for managing ICT third-party risk, including mandatory contract provisions and the new oversight framework for critical ICT third-party service providers (CTPPs).
6.1 Third-Party Risk Management Framework
- Is there a documented ICT third-party risk management strategy approved by the management body?
- Does the strategy cover the full lifecycle: due diligence, contracting, monitoring, and exit?
- Is there a register of all ICT third-party service providers?
- Are providers classified based on the criticality of functions they support?
- Is concentration risk assessed and monitored?
6.2 Contractual Compliance
- Do contracts include all mandatory clauses specified in Article 30?
- Are service level descriptions, data protection provisions, and audit rights included?
- Are termination conditions and exit strategies defined for critical or important functions?
- Are sub-outsourcing provisions and notification requirements included?
- Are contracts updated for new DORA requirements?
6.3 Register of Information
- Is the Register of Information (RoI) populated as required by Article 28(3)?
- Does it include all ICT services provided by third parties?
- Is it maintained at entity level and, where applicable, at sub-consolidated and consolidated level?
- Can it be made available to competent authorities upon request?
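The authoritative RoI format is defined by the ESAs' ITS templates; the sketch below uses simplified, assumed field names purely to illustrate how entries can be validated for completeness before supervisory submission:

```python
# Simplified, illustrative Register of Information entry. Field names
# are assumptions; the real format follows the ESAs' ITS templates.

roi_entry = {
    "provider_name": "CloudHost Ltd",              # hypothetical provider
    "ict_service_type": "cloud hosting",
    "supported_function": "Retail payments processing",
    "function_criticality": "critical",
    "contract_reference": "CH-2023-001",
    "subcontractors": ["DataCentre GmbH"],         # hypothetical
    "exit_plan_in_place": True,
}

def roi_completeness_gaps(entry, required_fields):
    """Return the required fields that are missing or empty."""
    return [f for f in required_fields
            if f not in entry or entry[f] in (None, "", [])]

required = ["provider_name", "ict_service_type",
            "supported_function", "contract_reference"]
print(roi_completeness_gaps(roi_entry, required))  # []
```

Running a completeness check like this across the full register, on every update, addresses the common supervisory finding that RoIs are incomplete or stale.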
Step 7: Pillar 5 — Information Sharing Assessment
Pillar 5 (Article 45) enables financial entities to exchange cyber threat intelligence and information among themselves within trusted communities.
- Has the entity assessed whether to participate in information-sharing arrangements?
- Are data protection and confidentiality safeguards in place for shared intelligence?
- Is there a documented process for receiving, analysing, and acting on shared threat information?
- Has the entity notified its competent authority of its participation in information-sharing arrangements?
- Are information-sharing agreements documented with appropriate legal protections?
Step 8: Scoring and Rating Gaps
A maturity-based scoring model enables objective gap quantification and progress tracking over time. We recommend a five-level maturity model applied to each requirement:
| Score | Maturity Level | Description |
|---|---|---|
| 1 | Initial | No documented approach; ad-hoc processes; significant gaps exist |
| 2 | Developing | Partial documentation; inconsistent implementation; key gaps remain |
| 3 | Defined | Documented policies and procedures; consistent implementation; minor gaps |
| 4 | Managed | Monitored and measured; evidence of effectiveness; continuous improvement |
| 5 | Optimised | Best-in-class; proactive and adaptive; benchmarked against industry |
Interpreting Scores
A target maturity of Level 3 (Defined) should be considered the minimum for DORA compliance. Financial entities with critical or important functions supported by complex ICT environments should aim for Level 4 (Managed) across Pillars 1-4. A score below Level 2 in any pillar represents a critical gap requiring immediate remediation.
Aggregation and Reporting
Scores can be aggregated to provide pillar-level and overall readiness scores. Use weighted averages to reflect the relative importance of requirements — for example, incident reporting timelines (Pillar 2) and mandatory contract clauses (Pillar 4) may carry higher weight given regulatory scrutiny. Present results visually using radar charts or heatmaps for management reporting.
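The weighted aggregation can be sketched in a few lines. The weights below are illustrative, not prescribed; each entity should set them to reflect its own risk profile and supervisory priorities:

```python
# Weighted maturity aggregation. Requirement scores are the 1-5 levels
# from the maturity table; weights are illustrative assumptions.

def pillar_score(requirements):
    """Weighted average maturity for one pillar.

    requirements: list of (score, weight) tuples.
    """
    total_weight = sum(w for _, w in requirements)
    return sum(s * w for s, w in requirements) / total_weight

# Pillar 2 example: reporting timelines weighted double, reflecting
# the heightened regulatory scrutiny noted above.
pillar2 = [
    (3, 2.0),  # reporting timelines
    (4, 1.0),  # incident management process
    (2, 1.0),  # voluntary notifications
]
print(round(pillar_score(pillar2), 2))  # 3.0
```

Recomputing these scores at each reassessment produces the trend data that is useful in supervisory engagement later in this guide.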
Step 9: Building the Remediation Plan
The remediation plan translates assessment findings into a structured programme of work with clear ownership, milestones, and resource allocation.
9.1 Prioritisation Framework
Prioritise remediation activities using a risk-based approach:
- Critical (Immediate): Gaps in mandatory reporting capabilities, missing contractual clauses in critical provider contracts, absence of incident classification criteria
- High (0-3 months): Incomplete ICT asset inventories, gaps in resilience testing programmes, missing TLPT planning
- Medium (3-6 months): Enhancement of detection capabilities, refinement of third-party due diligence processes, concentration risk assessment
- Low (6-12 months): Optimisation activities, information-sharing arrangements, advanced monitoring capabilities
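The bucketing above can be operationalised against the maturity scores from Step 8. The thresholds below are one reasonable mapping, not a prescribed rule:

```python
def remediation_priority(maturity_score, supports_critical_function):
    """Map a gap's maturity score (1-5) and the criticality of the
    function it affects to a remediation window. Thresholds are
    illustrative assumptions, not DORA-mandated values."""
    if maturity_score < 2 and supports_critical_function:
        return "Critical (Immediate)"
    if maturity_score < 2:
        return "High (0-3 months)"
    if maturity_score < 3 and supports_critical_function:
        return "Medium (3-6 months)"
    return "Low (6-12 months)"

print(remediation_priority(1, True))  # Critical (Immediate)
```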
9.2 Remediation Plan Structure
Each remediation item should include:
- DORA requirement reference (Article number, RTS/ITS reference)
- Current maturity score and target maturity score
- Gap description and root cause
- Remediation action(s) with specific deliverables
- Owner and accountable executive
- Estimated effort (person-days) and budget
- Target completion date and dependencies
- Key performance indicator (KPI) for tracking progress
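The fields above map naturally onto a record type that a remediation tracker can consume. A minimal sketch, with illustrative example values:

```python
from dataclasses import dataclass, field

@dataclass
class RemediationItem:
    """One remediation-plan entry mirroring the fields listed above.
    Example values below are illustrative only."""
    dora_reference: str       # e.g. Article and RTS/ITS reference
    current_maturity: int     # 1-5
    target_maturity: int      # 1-5
    gap_description: str
    actions: list = field(default_factory=list)
    owner: str = ""
    effort_person_days: int = 0
    target_date: str = ""     # ISO date
    kpi: str = ""

item = RemediationItem(
    dora_reference="Art. 19",
    current_maturity=2,
    target_maturity=3,
    gap_description="No 4-hour initial notification capability",
    actions=["Stand up on-call rota", "Draft notification template"],
    owner="Head of SOC",
    effort_person_days=20,
    target_date="2025-06-30",
    kpi="Time from classification to notification",
)
print(item.target_maturity - item.current_maturity)  # maturity uplift of 1
```

Keeping remediation items in structured form (rather than a free-text action log) makes it straightforward to roll progress up into the aggregated scores and trend reporting described in Step 8.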
9.3 Quick Wins
Identify and execute quick wins to build momentum and demonstrate progress. Common quick wins include:
- Updating incident classification criteria to align with DORA severity thresholds
- Drafting initial notification templates using ITS formats
- Adding mandatory DORA clauses to new third-party contracts
- Starting the Register of Information population
- Formalising management body reporting on ICT risk
Step 10: Compiling the Evidence Pack
The evidence pack is a structured collection of documentation and artefacts that demonstrates DORA compliance to competent authorities. It should be maintained as a living collection, updated as controls mature and evidence is refreshed.
10.1 Evidence Pack Principles
- Traceability: Each evidence item should map directly to a specific DORA requirement
- Currency: Evidence must be current — outdated policies or expired test results undermine credibility
- Completeness: All five pillars must be represented with proportionate depth
- Quality: Approved documents, signed records, and formal outputs carry more weight than drafts
- Accessibility: Evidence should be organised logically and retrievable on request
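The currency principle in particular is easy to automate. The sketch below flags evidence past a review threshold; the 365-day default is an assumption for illustration, not a DORA-mandated figure:

```python
from datetime import date

def stale_evidence(items, max_age_days=365, today=None):
    """Flag evidence items older than a review threshold.

    items: list of (name, last_reviewed_date) tuples.
    max_age_days defaults to 365 as an illustrative assumption.
    """
    today = today or date.today()
    return [name for name, reviewed in items
            if (today - reviewed).days > max_age_days]

evidence = [
    ("ICT risk framework v2.1", date(2024, 11, 1)),
    ("DR test report", date(2023, 6, 15)),
]
print(stale_evidence(evidence, today=date(2025, 3, 1)))  # ['DR test report']
```

Running a check like this on a schedule turns the evidence pack into the living collection described above, rather than a one-off snapshot that decays between supervisory reviews.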
10.2 Evidence Pack Structure
Organise the evidence pack by pillar with the following structure:
- Section 0 — Governance: Management body mandates, governance charter, ICT risk management framework document, risk appetite statement
- Section 1 — ICT Risk Management: Asset inventory, risk assessments, control implementation evidence, BCP/DRP documents, test results
- Section 2 — Incident Reporting: Incident management procedure, classification criteria, reporting templates, notification channel documentation, incident log
- Section 3 — Resilience Testing: Testing programme, test plans, test results and findings, remediation records, TLPT documentation
- Section 4 — Third-Party Risk: TPRM strategy, provider register, contract extracts (mandatory clauses), due diligence records, exit plans, Register of Information
- Section 5 — Information Sharing: Participation assessment, sharing agreements, data protection measures
Pillar-by-Pillar Evidence Matrix
The following matrix maps specific DORA requirements to the evidence artefacts expected by competent authorities:
| Pillar | Key Requirement | Expected Evidence |
|---|---|---|
| 1 — ICT Risk Mgmt | ICT asset inventory (Art. 8) | Asset register with classification, ownership, and dependency mapping |
| 1 — ICT Risk Mgmt | Risk management framework (Art. 6) | Approved framework document, risk appetite statement, annual review records |
| 1 — ICT Risk Mgmt | BCP/DRP testing (Art. 11) | Business continuity plans, DR test results, restoration time evidence |
| 2 — Incident Reporting | Classification criteria (Art. 18) | Documented criteria aligned with RTS thresholds, classification procedure |
| 2 — Incident Reporting | Reporting timelines (Art. 19) | Notification templates, escalation procedures, authority contact details |
| 3 — Resilience Testing | Testing programme (Art. 24) | Annual testing plan, scope documentation, risk-based testing rationale |
| 3 — Resilience Testing | TLPT (Art. 26-27) | TLPT scoping document, threat intelligence report, red team report, purple team findings |
| 4 — Third-Party Risk | Mandatory contract clauses (Art. 30) | Contract extracts showing mandatory provisions, contract review register |
| 4 — Third-Party Risk | Register of Information (Art. 28) | Populated RoI in required format, update process documentation |
| 5 — Information Sharing | Participation assessment (Art. 45) | Board decision on participation, sharing arrangements, data protection measures |
Common Gaps and Pitfalls
Based on our experience conducting DORA readiness assessments across financial entities, the following gaps are consistently observed:
Pillar 1 — ICT Risk Management
- Incomplete asset inventories: Many organisations have hardware and software inventories but lack mapping to business functions and third-party dependencies
- Risk framework governance: ICT risk management exists operationally but lacks formal management body approval and oversight as required by Article 5
- Learning and evolving: Post-incident review processes exist but are not systematically feeding back into framework updates
Pillar 2 — Incident Reporting
- Classification alignment: Existing incident severity models do not align with DORA's specific criteria (client impact, financial loss, duration, geographical spread)
- Reporting timelines: The 4-hour initial notification requirement is challenging for organisations without 24/7 security operations
- Template readiness: Reporting templates are not aligned with the ITS content requirements
Pillar 3 — Resilience Testing
- Testing coverage: Penetration testing exists but the full range of required testing types (12 types specified in Article 25) is not covered
- TLPT readiness: Entities subject to TLPT have not begun scoping or provider selection
- Critical system identification: Lack of clarity on which ICT systems require annual testing
Pillar 4 — Third-Party Risk
- Contract gaps: Legacy contracts lack DORA mandatory clauses, particularly around audit rights, sub-outsourcing, and exit provisions
- Concentration risk: No formal assessment of concentration risk across ICT third-party providers
- Register of Information: The RoI is incomplete or maintained in formats incompatible with supervisory submission requirements
Pillar 5 — Information Sharing
- No formal assessment: Many entities have not formally assessed whether to participate in information-sharing arrangements
- Legal barriers: Concerns about data protection and competition law hinder participation
Tips for Supervisory Engagement
DORA strengthens the supervisory toolkit available to competent authorities. Financial entities should prepare for increased scrutiny of ICT risk management practices. Here are practical tips for effective supervisory engagement:
Before Engagement
- Know your supervisor: Identify your competent authority and understand their supervisory priorities and approach
- Prepare the evidence pack: Have your evidence pack structured, current, and readily accessible
- Brief the management body: Ensure board members and senior management can articulate the entity's ICT risk posture and DORA compliance approach
- Document proportionality: Where you have applied the proportionality principle, document the rationale clearly
During Engagement
- Be transparent: Acknowledge known gaps and present your remediation plan — supervisors value honesty over perfection
- Show governance: Demonstrate that ICT risk is on the management body's agenda with board minutes and decision records
- Evidence ongoing improvement: Show trend data — maturity scores over time, incident response time improvements, testing coverage expansion
- Have subject matter experts available: Ensure CISO, CIO, and operational risk leads are available for detailed questions
After Engagement
- Document all findings, observations, and recommendations from the supervisor
- Update the remediation plan to address supervisory feedback
- Report supervisory outcomes to the management body
- Track and close findings within agreed timelines
- Maintain the evidence pack as a living document
The most effective supervisory engagements are those where the financial entity demonstrates a proactive, risk-based approach to DORA compliance rather than a reactive, checkbox-driven one. Competent authorities look for genuine resilience capability, not just documentation.
Frequently Asked Questions
How long does a DORA readiness assessment take?
A comprehensive DORA readiness assessment typically takes 6-12 weeks depending on the complexity of ICT infrastructure and the number of in-scope entities. Organisations with existing ISO 27001 or NIS2 frameworks can accelerate this to 4-6 weeks by leveraging existing documentation and control evidence.
What are the five pillars of DORA?
The five pillars of DORA are: (1) ICT Risk Management, (2) ICT-Related Incident Reporting, (3) Digital Operational Resilience Testing, (4) ICT Third-Party Risk Management, and (5) Information Sharing. Each pillar has specific requirements defined in the Regulation and supplemented by Regulatory Technical Standards (RTS) and Implementing Technical Standards (ITS).
What should a DORA evidence pack contain?
A DORA evidence pack should contain ICT risk management framework documentation, asset inventories, risk assessment records, incident reporting procedures and templates, resilience testing plans and results, third-party contract registers with mandatory clauses, due diligence records, the Register of Information, and board-level governance evidence such as meeting minutes and management body approvals.
Can we use our existing ISO 27001 ISMS for DORA compliance?
Yes, organisations with an existing ISO 27001 ISMS have a strong foundation for DORA compliance. Many ICT risk management and governance requirements overlap, particularly around asset management, risk assessment, access control, and incident management. However, DORA introduces specific requirements around incident reporting timelines (4h/24h/72h/1 month), mandatory TLPT testing, Article 30 third-party contract clauses, concentration risk, and the Register of Information that go beyond ISO 27001.
Do we need an external assessor for DORA readiness?
While DORA does not mandate external readiness assessments, engaging an independent assessor provides objectivity, identifies blind spots internal teams may overlook, and produces evidence that demonstrates due diligence to competent authorities. External assessment is particularly valuable for TLPT scoping and execution, third-party contract review, and concentration risk assessment.