The DORA ICT third-party risk register — officially the Register of Information (RoI) — is the single most data-intensive deliverable in the entire DORA framework. It is mandated by Article 28(3) of Regulation (EU) 2022/2554 and structured by Implementing Technical Standard (ITS) 2024/2956. Annual submission to the competent authority is required by 30 April each year, with the first cycle completed in 2025. This guide provides a definitive methodology for building and maintaining the register.

What the Register of Information Is — in 30 Seconds

The Register of Information is a machine-readable, xBRL-CSV submission documenting every contractual arrangement between a financial entity and its ICT third-party service providers. It is not a vendor inventory, not a risk register, and not a procurement database — it is a regulatory dataset that supervisors use to map systemic ICT concentrations across the EU financial sector.

  • Legal basis: DORA Article 28(3) & (9), supplemented by ITS 2024/2956
  • Format: xBRL-CSV package (taxonomy-bound)
  • Submission cadence: annual, deadline 30 April; ad hoc upon supervisor request
  • Number of templates: 9 relational tables (RT.01.01 through RT.06.01)
  • Scope: all in-scope financial entities under DORA Article 2
  • Recipient: national competent authority (NCA), aggregated to the ESAs

The 9 Mandatory Templates

ITS 2024/2956 defines a relational data model where each table has a specific purpose and is joined to others by reference keys. Get one table wrong and the validation rejects the entire submission.

1. RT.01.01 — Entity maintaining the register

Identification of the financial entity submitting the RoI: legal entity identifier (LEI), country, type, branch information for cross-border groups. One row per submitting entity.

2. RT.01.02 — Branches of the financial entity

Branches in scope of the consolidated submission, with their LEI and country of establishment.

3. RT.01.03 — ICT third-party service providers (master list)

Every direct ICT provider, with LEI, country, name, type, and Critical ICT Third-Party Provider (CTPP) designation status under DORA Article 31.

4. RT.02.01 — Contractual arrangements

One record per contract: counterparty, reference date, governing law, notice period, criticality flag (this is where the Critical or Important Function (CIF) classification flows through).

5. RT.02.02 — Annual costs of contractual arrangements

Actual prior-year spend and estimated current-year cost per arrangement, in EUR. This is the dataset supervisors use for concentration analysis.

6. RT.03.01 — ICT services received

Each contract decomposed into individual service lines using the ITS service taxonomy. A single contract typically generates multiple rows here.

7. RT.04.01 — Functions supported by ICT services

The link table mapping each ICT service to the business function(s) it enables. Every row carries a CIF flag. Misclassify your CIFs and this entire table propagates errors through your compliance chain.

8. RT.05.01 — Sub-outsourcing of ICT services

Material sub-contractors of providers delivering services to critical or important functions. Required regardless of whether the financial entity has a direct contract with the sub-contractor.

9. RT.06.01 — ICT assets (significant institutions, optional in 2025/2026)

Servers, applications, databases associated with each service. Optional for non-significant entities in the 2025/2026 cycles, expected to become mandatory for all in-scope entities from 2027.
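Taken together, the templates form a chain of reference keys: a function row points at a service line, the service line points at a contract, and the contract points at a provider. A minimal sketch, with illustrative field names rather than the official ITS column codes, of how a service line resolves back to its provider:

```python
# Toy relational model of the register. Identifiers and field names are
# illustrative; the real ITS templates use their own column codes.
providers = {"P-001": {"lei": "5299000J2N45DDNE4Y28", "name": "CloudCo"}}   # RT.01.03
contracts = {"C-100": {"provider_id": "P-001", "critical": True}}           # RT.02.01
services  = [{"service_id": "S-1", "contract_id": "C-100", "type": "hosting"}]  # RT.03.01

def service_provider_name(service_id: str) -> str:
    """Walk the reference keys from a service line back to the provider."""
    svc = next(s for s in services if s["service_id"] == service_id)
    contract = contracts[svc["contract_id"]]
    return providers[contract["provider_id"]]["name"]
```

Break any link in this chain (a typo in a contract ID, a deleted provider row) and the whole package fails referential-integrity validation.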

The xBRL-CSV Format: What Validators Reject

Submissions must arrive as a complete xBRL-CSV package — a folder containing a JSON metadata file (META-INF/reportPackage.json), one CSV file per template using the ESA taxonomy naming convention, and a reference to the published ESA taxonomy version. The most common causes of automated rejection:

  • Encoding errors. CSVs must be UTF-8 with BOM, comma-separated, with strict quote-escaping. Excel exports often fail validation here.
  • LEI integrity. Every LEI must validate against the GLEIF database and be active at the submission date. Lapsed LEIs cause hard rejections.
  • Cross-table referential integrity. A service line in RT.03.01 referencing a contract that does not exist in RT.02.01 will fail.
  • Currency consistency. All monetary values in EUR. Multi-currency contracts require conversion at the entity's standard FX rate, with the rate documented separately.
  • CIF flag inconsistency. If RT.04.01 flags a service as supporting a CIF, RT.02.01 must mark that contract as critical. The validator catches mismatches.
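Two of these checks — LEI shape and cross-table referential integrity — are cheap to run in-house before anything goes near the NCA portal. A minimal sketch (a full LEI check would also verify the ISO 17442 mod-97 check digits and active GLEIF status, which this deliberately skips):

```python
import re

def lei_format_ok(lei: str) -> bool:
    """Shape check only: 18 uppercase alphanumerics followed by 2 check digits.
    Full validation also verifies the mod-97 checksum and GLEIF registration status."""
    return re.fullmatch(r"[A-Z0-9]{18}[0-9]{2}", lei) is not None

def referential_gaps(service_rows, contract_ids):
    """Service lines (RT.03.01-style) referencing contracts absent from RT.02.01."""
    return [s["contract_id"] for s in service_rows
            if s["contract_id"] not in contract_ids]
```

Running these against every extract before packaging catches the two hard-rejection classes that account for most first-submission failures.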

The 6-Step Methodology to Build the Register

Step 1 — Inventory all ICT third-party arrangements

Start broader than DORA strictly requires. Pull every active vendor contract from procurement, finance and legal systems. Filter for ICT services using the functional definition: any service that processes, stores, transmits or controls digital information. SaaS, cloud, hosting, managed security, integrators, software licences with support — all in scope. Do not rely on accounting categories.
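A crude first pass over the merged extracts can be automated; the keyword list below is illustrative and no substitute for manual review of each candidate:

```python
# Merge vendor extracts from several systems, deduplicate, and flag ICT
# candidates by matching service descriptions. Data and keywords are invented.
procurement = [{"vendor": "CloudCo", "service": "IaaS hosting"},
               {"vendor": "CleanCo", "service": "office cleaning"}]
finance = [{"vendor": "CloudCo", "service": "IaaS hosting"},
           {"vendor": "SecOps Ltd", "service": "managed SOC monitoring"}]

ICT_KEYWORDS = ("saas", "cloud", "hosting", "software", "soc",
                "monitoring", "security", "managed")

def ict_candidates(*extracts):
    seen, out = set(), []
    for row in (r for ex in extracts for r in ex):
        key = (row["vendor"], row["service"])
        if key in seen:
            continue  # same contract surfaced by two source systems
        seen.add(key)
        if any(k in row["service"].lower() for k in ICT_KEYWORDS):
            out.append(row)
    return out
```

The point of the breadth-first approach is that it is far cheaper to strike a vendor off this candidate list than to discover an unregistered ICT arrangement after submission.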

Step 2 — Map each ICT service to the supported business function

For every service identified in Step 1, document which business function(s) it supports. This is where the methodology connects to your CIF identification process: services supporting a CIF inherit the CIF flag. Without this mapping you cannot populate RT.04.01 correctly — and without RT.04.01 your whole submission lacks the criticality data supervisors actually examine first.
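The inheritance rule is simple: a service supporting at least one CIF carries the CIF flag. A sketch under assumed data structures:

```python
# Illustrative CIF inheritance: function names and service IDs are invented.
cif_functions = {"payments": True, "hr_reporting": False}
service_to_functions = {"S-1": ["payments", "hr_reporting"],
                        "S-2": ["hr_reporting"]}

def inherit_cif(service_id: str) -> bool:
    """A service is CIF-flagged if any function it supports is a CIF."""
    return any(cif_functions[f] for f in service_to_functions[service_id])
```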

Step 3 — Collect the mandatory data fields per ITS 2024/2956

For each provider, contract and service, extract the fields required by the templates. Most institutions discover gaps in this step: missing LEIs for smaller providers, undefined notice periods, governing law not explicitly stated in the contract. Each gap requires either contract amendment or documented assumption with sign-off.
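A gap report is the natural output of this step, so that each missing field can be routed either to contract amendment or to a signed-off assumption. A sketch, with an illustrative (not exhaustive) mandatory-field list:

```python
# Scan collected records for missing mandatory fields. The field list here
# is a small illustrative subset of what ITS 2024/2956 actually requires.
MANDATORY = ("lei", "governing_law", "notice_period_days")

def field_gaps(records):
    """Return {provider: [missing fields]} for every incomplete record."""
    gaps = {}
    for rec in records:
        missing = [f for f in MANDATORY if rec.get(f) in (None, "")]
        if missing:
            gaps[rec["provider"]] = missing
    return gaps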

Step 4 — Apply the CIF flag and validate consistency

Run the CIF flag through every record using the three Article 3(22) criteria: financial performance impairment, soundness or continuity, authorisation compliance. The flag must be consistent across RT.02.01 (contract) and RT.04.01 (function-service mapping). Inconsistencies are the single most common cause of supervisory follow-up queries.
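The cross-table consistency check described above can be sketched as follows (column names are illustrative, not the official ITS codes):

```python
# Find contracts that RT.04.01 implies are critical but RT.02.01 does not flag.
def cif_mismatches(contracts, function_rows, services):
    """contracts: {contract_id: critical_flag};
    function_rows: RT.04.01-style rows with a 'cif' flag;
    services: {service_id: contract_id}."""
    bad = set()
    for row in function_rows:
        if row["cif"]:
            cid = services[row["service_id"]]
            if not contracts[cid]:
                bad.add(cid)
    return bad
```

An empty result from this check before packaging removes the single most common trigger for supervisory follow-up.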

Step 5 — Generate the xBRL-CSV package and run pre-validation

Most NCAs publish a pre-validation tool or Excel-to-xBRL converter. Use it. Submit a test package, fix every validator warning, then resubmit. Treat warnings as errors — ESAs have indicated that validator warnings will progressively become hard rejections in future cycles.

Step 6 — Submit, archive, prepare for change tracking

Submit via your NCA's designated portal (eIDAS authentication required in most jurisdictions). Archive the exact submitted package and the taxonomy version used. Then put change-tracking in place: any new contract, terminated contract, or material amendment becomes a delta to track for the next annual submission.
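Change tracking reduces to diffing the current contract set against the archived submission. A minimal sketch:

```python
# Diff two snapshots of the register, keyed by contract ID, to produce
# the delta that feeds the next annual submission.
def register_delta(previous, current):
    """Both arguments map contract_id -> record dict."""
    return {
        "added":   sorted(set(current) - set(previous)),
        "removed": sorted(set(previous) - set(current)),
        "amended": sorted(c for c in set(previous) & set(current)
                          if previous[c] != current[c]),
    }
```

Run against the archived package on a quarterly (or event-driven) cadence, this turns the next 30 April deadline into an assembly exercise rather than a rebuild.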

The 5 Most Common Pitfalls

  1. Treating it as a procurement export. Procurement systems track spend, not regulatory criticality. Building the register from a procurement extract typically produces a 30-40% data quality gap.
  2. Mis-flagging CIFs. Either too few flags (under-classification, supervisors will challenge) or too many (over-classification, triggers strict obligations on services that do not need them).
  3. Missing sub-outsourcing chains. Article 30 requires registering material sub-outsourcing for critical services. Most institutions miss the second-tier sub-contractors of their managed service providers.
  4. Static maintenance. The register is a living dataset. Institutions that update it once per year discover at the next submission that they have lost track of dozens of contractual changes.
  5. No reconciliation with incident reports. When a supervisor cross-references your incident reports against your RoI, the providers cited must match. Mismatches generate automated queries.
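Pitfall 5 in particular is trivial to automate before submission — the same cross-reference the supervisor runs:

```python
# Providers cited in incident reports that are absent from the RoI master list.
def unreconciled_providers(incident_provider_leis, roi_provider_leis):
    return sorted(set(incident_provider_leis) - set(roi_provider_leis))
```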

Penalties for Register Failures

Article 50 allows competent authorities to impose administrative penalties for breaches of Article 28. In practice, the enforcement ladder for register-related failures is:

  • Submission delay (a few days): formal warning + remediation order, no fine in most jurisdictions for a first occurrence.
  • Material data quality issues: supervisory letter requiring corrective action plan, follow-up examination.
  • Persistent non-compliance: periodic penalty payments where national law provides for them, calibrated to compel correction.
  • Wilful misreporting: administrative fines and potential personal liability for senior management under the national regime implementing Articles 50 to 52. (Note that the often-cited penalty of 1% of average daily worldwide turnover in Article 35 applies to critical ICT third-party providers under ESA oversight, not to financial entities.)

Frequently Asked Questions

Do we need to register intra-group ICT services?

Yes. Article 28 makes no exception for intra-group arrangements. Intra-group ICT services that support critical or important functions are registered with the same depth as third-party services, with the intra-group nature flagged separately.

What if we cannot get a LEI for a small provider?

Smaller providers are required to obtain an LEI as a condition of providing services to in-scope financial entities. The institution should ensure the LEI exists before signing or renewing the contract. For pre-existing contracts where the provider refuses, the register must document the gap and the remediation plan.

How granular should the service decomposition be?

Use the ITS taxonomy categories as the granularity baseline. A single SaaS contract typically decomposes into 3-7 service lines (the application, data hosting, backup/recovery, support, monitoring, etc.). Aggregating multiple distinct services into one row will trigger validation queries.
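As a toy illustration of that decomposition (the category strings below stand in for the official ITS taxonomy codes):

```python
# One contract, several service-line rows for RT.03.01. Categories are
# placeholders for the real ITS service taxonomy.
SAAS_BASELINE = ["application", "data hosting", "backup/recovery",
                 "support", "monitoring"]

def decompose(contract_id, categories):
    return [{"contract_id": contract_id, "service_type": c} for c in categories]
```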

Can we delegate the register to our outsourcing provider?

No. Article 28(3) places the obligation on the financial entity. Providers can supply the data inputs, but accountability for accuracy, completeness and submission rests exclusively with the regulated entity.

How Resiplan Automates the Register of Information

Resiplan is the specialised SaaS for DORA, business continuity and GRC. The platform implements the entire register methodology: ICT inventory, CIF mapping (via the dedicated CIF Evaluation Module), automated xBRL-CSV generation aligned with the latest ESA taxonomy, and continuous change tracking between submissions. The 9 ITS templates are populated from a single source of truth, with built-in cross-table validators catching the integrity issues that cause most automated rejections.

The Register of Information is the foundational dataset of DORA supervisory oversight. Treating it as an annual form-filling exercise is a strategic error — the data you submit becomes the lens through which every subsequent supervisory review of your institution is conducted.