At a Glance
Banks need a clear data capability baseline: mapping lineage, defining ownership, standardizing quality metrics, and exposing control gaps enables accountable stewardship, trusted reporting, regulatory alignment, and scalable analytics and AI execution.
Why data baselining is now a prerequisite for realistic ambition
In 2026, bank strategies increasingly assume real-time decisioning, AI-assisted operations, and regulatory-grade reporting on demand. Those ambitions rise or fall on data capability, not on isolated analytics tools. A data capability baseline therefore defines the minimum architectural, governance, and operational standards required to deliver decisions and reports that are timely, explainable, and defensible under supervisory scrutiny.
The baseline is deliberately not a score. It is a set of observable capabilities that can be evidenced: whether risk data is materially accurate and reconciled; whether the bank can aggregate material risk data across the institution; whether reporting frequency can increase during stress; and whether the data platform can adapt quickly to regulatory or organizational change. For systemically important banks, these expectations map directly to BCBS 239 principles for risk data aggregation and risk reporting.
BCBS 239 as the hard minimum for SIB-grade baselines
BCBS 239 remains the most consequential baseline reference because it translates “good data” into explicit risk aggregation and reporting requirements. In 2026, leaders use BCBS 239 to pressure-test strategic plans that expand digital distribution, introduce new products, or increase automation. If the bank cannot produce materially accurate, reconciled risk data quickly, the strategy will carry hidden operational and regulatory exposure.
Baseline test: Can the bank produce daily risk reports in normal conditions and increase frequency during stress, without manual assembly or exception-heavy reconciliations?
- Accuracy and reconciliation: risk data is materially accurate and tied back to authoritative sources.
- Completeness across the enterprise: material risk data can be captured across entities, products, and geographies.
- Timeliness under stress: reporting can accelerate when conditions deteriorate.
- Adaptability: data systems can absorb regulatory change, business reorganizations, and new risk views without fragile rebuilds.
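The accuracy-and-reconciliation test above can be expressed as an automated control rather than a manual exercise. The sketch below is illustrative only: the dataset names, figures, and the 0.5% materiality tolerance are assumptions, not a prescribed implementation.

```python
from dataclasses import dataclass

@dataclass
class ReconciliationResult:
    dataset: str
    reported: float
    authoritative: float
    within_tolerance: bool

def reconcile(reported_totals: dict, authoritative_totals: dict,
              tolerance_pct: float = 0.5) -> list[ReconciliationResult]:
    """Flag datasets whose reported totals drift from the golden source."""
    results = []
    for name, reported in reported_totals.items():
        auth = authoritative_totals.get(name)
        if auth is None:
            # A missing golden-source figure is itself a completeness gap.
            results.append(ReconciliationResult(name, reported, float("nan"), False))
            continue
        drift_pct = abs(reported - auth) / auth * 100 if auth else 0.0
        results.append(ReconciliationResult(name, reported, auth,
                                            drift_pct <= tolerance_pct))
    return results

# Hypothetical figures: credit exposure reconciles; market VaR drifts.
reported = {"credit_exposure": 1_002.0, "market_var": 118.0}
golden = {"credit_exposure": 1_000.0, "market_var": 110.0}
breaks = [r for r in reconcile(reported, golden) if not r.within_tolerance]
```

Run daily, a check like this produces the evidence artifact (a data issue register entry per break) rather than relying on exception-heavy manual reconciliation.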
From system of record to strategic intelligence layer
The “bank of tomorrow” baseline is no longer anchored to a single system of record. It is anchored to a strategic intelligence layer that makes data usable for real-time operations and AI. In 2026, this baseline is defined by traceability, automation, and operational readiness rather than periodic reporting heroics.
A key practical shift is the expectation that data lineage can be tracked from origin to report automatically. Where audit and regulatory reviews still depend on manual data assembly, the bank’s operational model is signalling that data capability is not ready for scaled automation or broader ecosystem participation.
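Automated origin-to-report lineage is commonly represented as a directed graph of dataset dependencies. The sketch below, with hypothetical node names, shows how a report's root sources could be resolved programmatically instead of being assembled by hand for each audit request.

```python
# Illustrative lineage graph: each dataset maps to its upstream sources.
# Node names are hypothetical, not a real bank's systems.
LINEAGE = {
    "daily_risk_report": ["risk_aggregates"],
    "risk_aggregates": ["loan_book", "trading_positions"],
    "loan_book": ["core_banking_ledger"],
    "trading_positions": ["trading_platform_feed"],
}

def trace_origins(node: str, graph: dict) -> set:
    """Return the set of root source systems feeding a report or dataset."""
    upstream = graph.get(node, [])
    if not upstream:          # no parents: this node is an origin system
        return {node}
    origins = set()
    for parent in upstream:
        origins |= trace_origins(parent, graph)
    return origins

origins = trace_origins("daily_risk_report", LINEAGE)
```

When metadata like this is captured automatically at pipeline build time, the same traversal answers both supervisory lineage questions and impact analysis for change.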
Capability baselines without maturity language: operating states banks recognize
Industry benchmarks often describe staged progressions (for example, governance models that range from basic awareness to enterprise-wide optimization). For executive baselining, the more useful framing is operating states that can be evidenced in day-to-day work, without relying on maturity labels.
| Operating state | What it looks like | What breaks at scale |
|---|---|---|
| Reactive | Data quality issues are handled as incidents; ownership is unclear; fixes happen downstream. | AI projects stall; risk reporting depends on manual workarounds; controls become exception-driven. |
| Stewarded | Formal stewardship roles exist; operational metadata is captured; common definitions start to stabilize. | Coverage remains uneven across domains; cross-entity aggregation is slow; lineage is partial. |
| Coordinated | Governance is coordinated across enterprise initiatives; shared services and tooling reduce duplication. | Value delivery can still be constrained by upstream data debt and inconsistent product-to-entity mapping. |
| Productized | Data is treated as products with accountable owners; contracts, quality gates, and usage patterns are managed. | Requires strong control evidence and platform observability; weak identity and access controls become the limiting factor. |
| Strategic intelligence | Real-time data products support decisioning and reporting; lineage and controls are embedded; change is repeatable. | Failure modes concentrate in third-party dependencies, model governance, and real-time resilience disciplines. |
These states help leaders baseline the starting point in operational terms: how work gets done, how evidence is produced, and what constraints will cap delivery speed.
What must be in the 2026 data capability baseline
A decision-grade baseline should cover capabilities that directly constrain real-time operations, AI integration, and regulatory compliance. The table below provides a practical baseline checklist expressed as capabilities and required evidence.
| Baseline domain | Minimum capability | Evidence artifacts executives should require |
|---|---|---|
| Risk data aggregation (BCBS 239) | Enterprise risk data can be aggregated, reconciled, and reported quickly in normal and stressed conditions. | Risk data lineage maps; reconciliation controls; stress reporting playbooks; data issue registers with remediation SLAs. |
| Traceability and lineage | Origin-to-report traceability is automated for material datasets and reports. | Lineage tooling outputs; audit trails; metadata standards; data contracts for critical entities and measures. |
| Data execution quality | Quality gates prevent recurring upstream defects; “golden sources” exist for critical entities. | DQ rule catalog; defect trends; entity resolution outputs; exception volumes by root cause; quality gate enforcement logs. |
| Real-time decision readiness | Data products and platforms support low-latency consumption for material use cases (fraud, credit, payments, liquidity). | Latency SLOs; event schema ownership; real-time reconciliation design; observability dashboards tied to outcomes. |
| AI governance | Accountability exists for algorithmic controls beyond accuracy, including fairness and explainability where applicable. | Model governance RACI; monitoring and drift reports; human-override controls; review minutes and decision logs. |
| Access, privacy, and residency | Access is continuously verified; sensitive data handling aligns to residency and jurisdictional rules. | Identity access telemetry; privileged access reviews; residency maps; data sharing approvals; retention and deletion controls. |
| Open finance access schemes | Permissioned data access is managed with clear consent controls and auditability (including GDPR-compatible permissions). | Consent records; API access logs; third-party onboarding controls; data sharing registers aligned to FIDA-type obligations. |
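Several of the evidence artifacts above depend on quality gates being enforceable rather than advisory. A minimal sketch, with hypothetical rules and field names, of a gate that blocks publication on critical rule failures while logging advisory ones for the issue register:

```python
def non_null(field):
    """Build a rule that checks a field is present and not None."""
    return lambda rec: rec.get(field) is not None

# (rule name, check, critical) -- rules and criticality are illustrative.
RULES = [
    ("counterparty_id_present", non_null("counterparty_id"), True),
    ("exposure_non_negative", lambda r: r.get("exposure", 0) >= 0, True),
    ("sector_code_present", non_null("sector_code"), False),  # advisory
]

def run_gate(records):
    """Return (passed, failures); any critical failure blocks publication."""
    failures = []
    for i, rec in enumerate(records):
        for name, check, critical in RULES:
            if not check(rec):
                failures.append({"record": i, "rule": name, "critical": critical})
    passed = not any(f["critical"] for f in failures)
    return passed, failures

records = [
    {"counterparty_id": "C1", "exposure": 100.0, "sector_code": "FIN"},
    {"counterparty_id": None, "exposure": 50.0, "sector_code": "MFG"},
]
passed, failures = run_gate(records)
```

The enforcement log produced by such a gate is exactly the kind of artifact the "data execution quality" row asks executives to require.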
Platform baselining: what “modern core support” means for data
When banks say the core is “modernized,” the baseline question is whether the core and surrounding platforms support operational intelligence, not just system stability. In 2026, the data baseline typically requires that core and adjacent platforms can produce real-time dashboards for customer insights and near-real-time views of risk exposure, without parallel shadow ledgers or reconciliation-heavy reporting pipelines.
The platform baseline should also make explicit where real-time capability is genuine versus simulated through caching, batch acceleration, or manual overrides. These distinctions matter because they determine control scalability when volumes rise, settlements accelerate, and exception windows shrink.
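One way to make the genuine-versus-simulated distinction testable is to measure data freshness at the point of consumption against a latency SLO: a dashboard fed by a cached batch extract shows event timestamps lagging far behind the SLO even if the page itself refreshes frequently. The SLO value and timestamps below are illustrative assumptions.

```python
from datetime import datetime, timedelta, timezone

FRESHNESS_SLO = timedelta(seconds=5)  # illustrative real-time target

def freshness_breaches(event_timestamps, now, slo=FRESHNESS_SLO):
    """Return the lag of every event older than the freshness SLO."""
    return [now - ts for ts in event_timestamps if now - ts > slo]

now = datetime(2026, 1, 15, 9, 0, 0, tzinfo=timezone.utc)
# A streaming feed: events seconds old. A cached batch extract: minutes old.
stream = [now - timedelta(seconds=2), now - timedelta(seconds=4)]
batch = [now - timedelta(minutes=30), now - timedelta(minutes=31)]

stream_breaches = freshness_breaches(stream, now)
batch_breaches = freshness_breaches(batch, now)
```

Tracking breach counts per consuming dashboard turns the "genuine real-time" claim into an observable metric rather than an architecture-diagram assertion.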
Role baselines: data stewardship and data products
A recurring 2026 baseline pattern is the appointment of accountable owners for datasets as products. Titles vary, but the responsibility is consistent: define data contracts, manage quality gates, maintain metadata, and ensure cross-department usability. This turns “data governance” into an operating model that can be tested through evidence, not a policy statement.
- Data product ownership: accountable owner, backlog, consumers, and SLOs for quality and timeliness.
- Stewardship and controls: definition governance, lineage maintenance, and escalation for recurring defects.
- Operational metadata: searchable definitions, ownership, and usage signals that support day-to-day decisioning.
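The ownership responsibilities above can be made concrete as a machine-readable data product descriptor that carries the contract and SLOs alongside the accountable owner. The fields and values below are a hypothetical sketch, not a standard schema.

```python
from dataclasses import dataclass

@dataclass
class DataProduct:
    """Hypothetical descriptor: owner, consumers, contract, and SLOs."""
    name: str
    owner: str                  # accountable individual or role
    consumers: list
    contract_fields: dict       # field name -> expected type
    quality_slo_pct: float      # % of records passing DQ rules
    freshness_slo_minutes: int

    def validate_record(self, record: dict) -> bool:
        """Check a record against the declared contract fields and types."""
        return all(
            f in record and isinstance(record[f], t)
            for f, t in self.contract_fields.items()
        )

exposures = DataProduct(
    name="counterparty_exposures",
    owner="credit-risk-data-steward",
    consumers=["risk_reporting", "limit_monitoring"],
    contract_fields={"counterparty_id": str, "exposure": float},
    quality_slo_pct=99.5,
    freshness_slo_minutes=15,
)
ok = exposures.validate_record({"counterparty_id": "C1", "exposure": 250.0})
bad = exposures.validate_record({"counterparty_id": "C1"})
```

Because the contract travels with the product, consumers can test conformance themselves, which is what makes governance evidenceable rather than a policy statement.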
Establishing a capability baseline that supports realistic prioritization
Capability baselining becomes decision-useful when it creates a consistent fact base across domains that do not naturally align, such as risk reporting, AI governance, real-time engineering, and privacy. A structured assessment provides that comparability by translating evidence into consistent capability statements that can be used to validate whether strategic ambitions are realistic under current data constraints.
Used in strategy validation and prioritization, the assessment connects data baselines to the trade-offs leaders must make in 2026: faster decisions versus traceability, broader ecosystem access versus privacy and residency obligations, and automation versus control evidence and accountability. Applied in this way, the DUNNIXER Digital Maturity Assessment helps executives identify the binding constraints, sequence remediation, and improve decision confidence.
Reviewed by

Ahmed is the Founder & CEO of DUNNIXER and a former IBM Executive Architect with 26+ years in IT strategy and solution architecture. He has led architecture teams across the Middle East & Africa and globally, and also served as a Strategy Director (contract) at EY-Parthenon. An inventor with multiple US patents and an IBM-published author, he works with CIOs, CDOs, CTOs, and Heads of Digital to replace conflicting transformation narratives with an evidence-based digital maturity baseline, peer benchmark, and prioritized 12–18 month roadmap, delivered consulting-led and platform-powered for repeatability and speed to decision, including an executive/board-ready readout. He writes about digital maturity, benchmarking, application portfolio rationalization, and how leaders prioritize digital and AI investments.