
As-Is Operating Model Assessment for Banks in 2026

Current-state documentation artifacts that baseline how work, controls, data, and technology actually operate before modernization

February 13, 2026

Reviewed by

Ahmed Abbas

At a Glance

An as-is operating model assessment helps banks evaluate governance, roles, processes, and decision rights. By exposing inefficiencies, silos, and risk gaps, it creates a clear baseline for redesign, supports accountability, and enables scalable transformation aligned with strategic goals.

Why the operating model baseline is a governance instrument in 2026

An as-is operating model assessment is the discipline of documenting how the bank truly runs today across processes, technology, data, controls, and people. In 2026, this has become a transformation governance requirement: leaders need an objective baseline that can withstand audit scrutiny, support investment sequencing, and make trade-offs explicit between speed, resilience, and compliance.

The primary change is the move from experimental AI pilots to governed intelligence. Automation that is fast but not explainable, controllable, and evidence-producing creates new model risk, conduct risk, and operational risk. A credible current-state baseline therefore needs to describe not only what is automated, but how automation is governed, where decisions are made, and how controls and evidence are produced at run time.

Core pillars of a 2026 operating model assessment

Modern operating model baselines go beyond “people, process, technology” to include data governance and service delivery realities. The purpose is to capture the constraints that will determine whether modernization benefits can be realized without creating unacceptable risk exposure.

Technology and infrastructure

Current-state artifacts should describe the platform landscape that shapes delivery and run costs: legacy cores, middleware, integration patterns, environment topology, and operational tooling. In 2026, many banks are testing AI-assisted refactoring and modernization approaches, which makes it essential to document where code quality, test coverage, and release controls will limit safe acceleration.

Data foundation

Data readiness is an operating model constraint, not only an architectural issue. The baseline should document accuracy, timeliness, lineage, access controls, and the operational routines used to remediate data quality. “Brittle” data pipelines and unclear ownership frequently become the single largest blocker to scaling AI, because they undermine explainability and create inconsistent decision outcomes across channels.
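As a minimal sketch of what "documenting accuracy and timeliness" can mean in practice, the routine below scores one dataset against a baseline threshold. The field names, freshness SLA, and threshold are illustrative assumptions, not a standard:

```python
# Illustrative sketch only: a minimal data-quality baseline check for one
# dataset. Field names, SLA, and thresholds are hypothetical; a real bank
# would drive these from its data-governance catalogue.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class QualityResult:
    dataset: str
    completeness: float   # share of records with all mandatory fields populated
    timeliness: float     # share of records fresher than the SLA
    passed: bool

def assess_dataset(name, records, mandatory_fields,
                   max_age_hours=24, threshold=0.98):
    """Score completeness and timeliness against a baseline threshold."""
    now = datetime.now(timezone.utc)
    complete = sum(all(r.get(f) not in (None, "") for f in mandatory_fields)
                   for r in records)
    fresh = sum(now - r["as_of"] <= timedelta(hours=max_age_hours)
                for r in records)
    n = len(records) or 1
    c, t = complete / n, fresh / n
    return QualityResult(name, c, t, c >= threshold and t >= threshold)
```

Even a sketch like this makes ownership questions concrete: someone must own the thresholds, the remediation routine when a dataset fails, and the evidence that the check ran.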

Processes and automation

Process mapping should focus on value chains such as onboarding, servicing, and lending, capturing where manual work persists, where control checks occur, and where handoffs cause queueing and rework. The assessment should also identify which tasks are candidates for end-to-end automation and which must remain human-governed due to policy, conduct, or model risk constraints.

Governance and compliance

Operating model baselines must document how risk and compliance are executed in practice: decision rights, escalation paths, policy interpretation points, and evidence generation. For banks with BCBS 239 obligations and broader supervisory expectations for risk data aggregation and reporting, the baseline should specify how data and model locations are evidenced, how changes are approved, and how control effectiveness is tested over time.
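A control-evidence map can start as something very simple: each control listed with the run-time evidence it produces, so controls with no reviewable evidence surface immediately. The control IDs and evidence types below are invented for illustration:

```python
# Hypothetical sketch: mapping controls to the evidence they produce and
# flagging controls that generate no reviewable evidence at run time.
controls = {
    "C-001 sanctions screening": ["screening log", "match disposition record"],
    "C-002 credit policy check": ["decision record"],
    "C-003 manual exception approval": [],  # evidence gap: offline email only
}

def evidence_gaps(control_map):
    """Return controls that produce no reviewable run-time evidence."""
    return sorted(c for c, ev in control_map.items() if not ev)
```

The gap list is the deliverable: every entry is a control whose effectiveness cannot currently be tested over time.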

People and talent

Talent documentation should focus on capability coverage and accountability, not headcount. The baseline needs to capture where critical skills sit (data governance, model operations, cyber defense, product management, platform engineering), how work is organized, and how incentives influence behaviors. “10x bank” aspirations—where employees orchestrate AI teams—depend on role clarity, training pathways, and governance that prevents shadow automation.

2026 macroeconomic and regulatory pressures that shape the baseline

Operating model assessments in 2026 must explicitly incorporate external pressures because they change what is feasible and what is prudent. Baseline language should therefore include assumptions and constraints tied to profitability, regulation, and jurisdictional requirements.

Net interest margin pressure and cost discipline

With rate expectations normalizing, banks are using operating model baselines to identify back-office cost removal opportunities that do not compromise resilience. The baseline should show which cost drivers are structural (platform duplication, manual controls, fragmented data) versus variable (volume-driven capacity), so cost programs do not inadvertently increase operational risk.

Regulatory milestones and new money rails

Readiness for stablecoins, tokenized deposits, and new forms of programmable value transfer is increasingly becoming a governance topic rather than a product topic. A current-state baseline should document how new asset types would be onboarded, how AML and sanctions controls would be applied, and how settlement, reconciliation, and custody responsibilities are assigned.

Sovereignty and jurisdictional control requirements

Data and AI sovereignty expectations require operating model clarity. Baseline artifacts should specify where data resides, where models are trained and executed, who can access them, and which third parties are involved. This is not only a cloud assessment; it is a control and accountability baseline that affects resilience, incident response, and supervisory defensibility.
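The residency questions above can be captured as a flat register and checked mechanically. The jurisdiction codes, asset names, and approved list below are illustrative assumptions:

```python
# Hypothetical sketch: recording where data resides and where models run,
# then flagging assets outside an approved jurisdiction list.
approved = {"US", "EU"}

assets = [
    {"asset": "customer-pii", "kind": "data",
     "jurisdiction": "US", "third_party": None},
    {"asset": "fraud-model", "kind": "model",
     "jurisdiction": "SG", "third_party": "vendor-x"},
]

def sovereignty_exceptions(records, approved_jurisdictions):
    """Assets stored or executed outside the approved jurisdictions."""
    return [r["asset"] for r in records
            if r["jurisdiction"] not in approved_jurisdictions]
```

Each exception then needs an accountable owner and a documented rationale, which is exactly the supervisory-defensibility point the baseline exists to support.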

Practical methodology that produces signable current-state artifacts

An operating model baseline is only useful if it results in artifacts that can be reviewed, challenged, and signed off. A practical approach emphasizes traceability: from process steps, to systems and data, to controls and evidence, to accountability.

1) Stakeholder workshops

Facilitated sessions with domain experts identify all business and system actors involved in a process, including control functions and third parties. Workshops should explicitly surface non-negotiables such as resilience objectives, customer outcome standards, and risk appetite boundaries.

2) Process inventory and rules capture

Document high-level business steps (for example, “customer logs in” or “credit decision is issued”) and the business rules that govern them (for example, identity requirements or affordability checks). This creates a baseline for where policy is applied today and where it is inconsistently applied across channels.
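As a sketch of what a step-and-rules inventory can look like, the schema below records which channels enforce each rule, so inconsistent application across channels becomes visible. All names are hypothetical:

```python
# Illustrative schema for process steps and the business rules applied at
# each step, with channel coverage to expose inconsistent enforcement.
from dataclasses import dataclass, field

@dataclass
class BusinessRule:
    rule_id: str
    description: str
    channels: set = field(default_factory=set)  # where the rule is enforced

@dataclass
class ProcessStep:
    step_id: str
    description: str
    rules: list = field(default_factory=list)

def inconsistent_rules(steps, all_channels):
    """Rules not enforced uniformly across every channel."""
    return [r.rule_id for s in steps for r in s.rules
            if r.channels != all_channels]
```

A rule enforced only on one channel is either a deliberate policy difference that should be documented, or a control gap that belongs in the constraint register.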

3) Dependency mapping

Map system touchpoints, manual inputs (spreadsheets, emails, offline approvals), and automated outputs, including where evidence is produced or missing. The dependency map should identify concentration risk (single systems, single teams, or single vendors) and the potential blast radius of change.
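A dependency map is, at minimum, a set of process-to-system edges; systems that many value chains depend on are concentration-risk candidates with a wide blast radius. The process and system names below are invented for illustration:

```python
# Sketch under assumptions: dependency map as process -> systems edges.
# A system supporting several value chains is a concentration-risk
# candidate whose failure has a wide blast radius.
from collections import defaultdict

dependencies = {
    "onboarding": ["core-banking", "kyc-screening", "doc-store"],
    "servicing":  ["core-banking", "crm"],
    "lending":    ["core-banking", "credit-engine", "doc-store"],
}

def concentration_risks(dep_map, min_processes=2):
    """Systems that multiple value chains depend on, widest blast radius first."""
    usage = defaultdict(list)
    for process, systems in dep_map.items():
        for s in systems:
            usage[s].append(process)
    return sorted(((s, p) for s, p in usage.items()
                   if len(p) >= min_processes),
                  key=lambda kv: -len(kv[1]))
```

The same structure extends naturally to single teams and single vendors by adding those as nodes alongside systems.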

4) Gap identification for the to-be baseline

Compare the documented scope of the business process and control expectations with current limitations: latency, manual work, control gaps, data quality, and unclear ownership. The output should be a constraint register that leadership can fund against, rather than a generic list of “improvements.”
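A constraint register entry that leadership can actually fund against needs, at minimum, an owner, the benefit it blocks, and whether resolution is local or enterprise-level. The fields and values below are illustrative assumptions, not a standard:

```python
# Minimal sketch of a constraint register entry. Field values are
# hypothetical; the point is that each constraint names an owner, the
# outcome it blocks, and the level at which it can be resolved.
from dataclasses import dataclass

@dataclass
class Constraint:
    constraint_id: str
    description: str
    domain: str            # e.g. data, control, platform
    owner: str
    resolution: str        # "local" or "enterprise"
    blocked_outcome: str   # the modernization benefit it blocks

def enterprise_decisions(register):
    """Constraints that cannot be resolved locally and need enterprise funding."""
    return [c.constraint_id for c in register if c.resolution == "enterprise"]
```

Splitting the register this way supports the governance distinction drawn later in this piece: local fixes versus enterprise decisions such as control redesign or data ownership remediation.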

5) Audit trail and formal sign-off

Circulate draft maps and artifacts (BPMN and UML where appropriate) for validation. Formal sign-off should include business owners and relevant control functions, ensuring the baseline becomes the authoritative starting point for change governance and progress tracking.

Minimum documentation artifacts executives should expect

To function as a transformation baseline, operating model documentation should be complete enough to support portfolio decisions and control assurance. Typical artifacts include:

  • Journey and value-chain maps for priority domains (onboarding, servicing, lending), including control points
  • System and integration maps that identify critical dependencies and high-risk coupling
  • Data domain ownership and lineage views for key regulatory and decisioning datasets
  • Control-evidence maps showing where evidence is generated, stored, and reviewed
  • Role and decision-rights definitions for product, technology, and control functions
  • Constraint register and sequencing hypotheses tied to measurable outcomes and risk limits

These artifacts enable consistent re-measurement. Without that consistency, “progress” becomes a narrative rather than a tracked trajectory.

Baselining transformation governance for objective starting points and progress tracking

Operating model documentation becomes materially more decision-useful when it is anchored to a repeatable baselining discipline: consistent definitions, shared evidence standards, and clear accountability for validation. This reduces governance friction when transformation accelerates and helps leaders distinguish constraints that can be resolved locally from constraints that require enterprise decisions, such as control redesign, data ownership remediation, or platform standardization.

Within that governance framing, an assessment approach such as the DUNNIXER Digital Maturity Assessment helps executives evaluate readiness and sequencing confidence using the same current-state artifacts already produced in the operating model review. Dimensions around governance effectiveness, data foundations, service delivery enablement, and control evidence integrity can be mapped to the specific trade-offs highlighted in the baseline: scaling AI from pilots to governed intelligence, accelerating automation without weakening explainability, and meeting sovereignty constraints while sustaining resilience outcomes.


Reviewed by

Ahmed Abbas

The Founder & CEO of DUNNIXER and a former IBM Executive Architect with 26+ years in IT strategy and solution architecture. He has led architecture teams across the Middle East & Africa and globally, and also served as a Strategy Director (contract) at EY-Parthenon. Ahmed is an inventor with multiple US patents and an IBM-published author, and he works with CIOs, CDOs, CTOs, and Heads of Digital to replace conflicting transformation narratives with an evidence-based digital maturity baseline, peer benchmark, and prioritized 12–18 month roadmap—delivered consulting-led and platform-powered for repeatability and speed to decision, including an executive/board-ready readout. He writes about digital maturity, benchmarking, application portfolio rationalization, and how leaders prioritize digital and AI investments.
