Capability dimensions are the fundamental facets or aspects through which enterprise capabilities are assessed, measured, and managed. They represent the multi-dimensional lens through which organizations evaluate capability health, maturity, performance, and strategic value. This framework provides enterprise architects with a canonical, standards-grounded model for defining dimensions, establishing measures, and deriving actionable KPIs across business, technology, application, data, and security capabilities.
Core Insight: Capability dimensions are not the capabilities themselves—they are the measurement and assessment axes that reveal capability state, quality, and value.
Three Universal Dimensions: Strategic Importance, Capability Maturity, and Adaptability form the foundational triad for all capability assessments.
Cross-Framework Alignment: Dimensions map directly to TOGAF, COBIT, CMMI, NIST, and SAFe constructs, enabling integrated governance.
1. What Are Capability Dimensions?
1.1 Foundational Definition
Capability dimensions are the distinct facets or perspectives through which an enterprise capability is evaluated, representing the measurable attributes that collectively determine a capability’s fitness, value, and evolution potential.
While a capability defines what an organization can do (e.g., “Customer Relationship Management,” “Threat Detection”), dimensions describe how well the capability performs along critical evaluation axes such as maturity, strategic alignment, adaptability, performance, and complexity.
Key Distinction:
- Capability: The ability or capacity to execute specific functions (e.g., “Payment Processing”)
- Capability Dimension: The measurement axis for evaluating that capability (e.g., “Process Maturity of Payment Processing,” “Strategic Importance of Payment Processing”)
1.2 Why Capability Dimensions Exist
Capability dimensions serve four critical purposes:
- Multi-Faceted Assessment: Capabilities are complex constructs; single metrics cannot capture their full state. Dimensions provide comprehensive evaluation coverage.
- Prioritization & Investment: Dimensions enable objective comparison and ranking of capabilities for portfolio investment decisions.
- Gap Analysis: By measuring current and target states across dimensions, architects identify specific improvement areas.
- Transformation Roadmapping: Dimensions inform sequencing—low maturity + high strategic importance = priority transformation candidates.
1.3 Dimensions vs. Capability Elements
It’s critical to distinguish dimensions from capability elements (also called components or resources):
| Concept | Definition | Example |
| Capability | The organizational ability | “Data Quality Management” |
| Capability Elements/Components | The building blocks that enable the capability | People, Process, Technology, Information, Governance |
| Capability Dimensions | The measurement axes that assess the capability | Strategic Importance, Maturity, Adaptability, Performance, Risk |
Example: The capability “Cybersecurity Threat Detection” is enabled by people (security analysts), processes (incident response), technology (SIEM), information (threat intelligence), and governance (security policies). It is assessed along dimensions such as maturity (NIST CSF Tier), strategic importance (alignment to risk strategy), and performance (Mean Time to Detect).
2. Canonical Capability Dimensions: Enterprise-Wide Model
Based on TOGAF, COBIT, CMMI, and industry best practices, the following canonical dimensions apply across all capability domains (business, technology, application, data, security):
2.1 Three Foundational Assessment Dimensions
These three dimensions form the core assessment framework recommended by leading enterprise architecture practices:
Strategic Importance
How critical is this capability to business strategy execution and competitive advantage?
Capability Maturity
How well-developed, standardized, and optimized is the capability’s execution?
Adaptability
How easily can the capability be changed to meet evolving requirements?
Dimension 1: Strategic Importance
Definition: The relevance or criticality of a capability to achieving the enterprise’s strategic goals, business model execution, and competitive positioning.
Purpose: Prioritizes capabilities that warrant highest investment and governance attention.
Measurement Focus:
- Contribution to strategic objectives and key results (OKRs)
- Alignment to business model value propositions
- Impact on competitive differentiation
- Future opportunity enablement
- Revenue/value contribution
Assessment Method: Stakeholder consensus scoring (1-5 scale) based on strategic alignment criteria.
Dimension 2: Capability Maturity
Definition: The degree to which a capability is well-designed, consistently executed, measured, and continuously improved across its enabling dimensions.
Purpose: Identifies where capabilities underperform or lack standardization, requiring improvement investment.
Measurement Focus: Maturity assessed across capability-enabling dimensions (detailed in Section 3):
- People & Competencies
- Process Standardization
- Technology Enablement
- Information/Data Quality
- Governance & Accountability
Assessment Method: Structured scoring (typically 1-5 CMMI-aligned scale) based on defined maturity level criteria.
Dimension 3: Adaptability
Definition: The ease or difficulty with which a capability can be modified to respond to changing business requirements, customer needs, or environmental conditions.
Purpose: Predicts transformation cost, duration, and risk—highly strategic but rigid capabilities may require foundational change before improvement.
Measurement Focus:
- Flexibility to environmental changes
- Responsiveness to customer need shifts
- Scalability to demand changes
- Technology modernization readiness
- Organizational change readiness
Assessment Method: Expert evaluation of capability flexibility across environmental, customer, and demand dimensions.
2.2 Extended Enterprise Dimensions
Beyond the foundational three, enterprise architects often assess capabilities across additional dimensions:
| Dimension | Definition | Primary Use Case | Measurement Focus |
| Performance | Current operational effectiveness against targets | Heat mapping, KPI tracking | Outcome KPIs, efficiency metrics, quality measures |
| Complexity | Structural, technical, organizational, and process difficulty | Transformation planning, risk assessment | Component count, dependency depth, stakeholder count, integration points |
| Risk | Vulnerability to disruption (operational, competitive, regulatory) | Resilience planning, continuity management | Single points of failure, compliance gaps, threat exposure |
| Investment Level | Current funding relative to strategic importance | Portfolio rationalization | TCO, operational spend, capital investment |
| Value Delivery | Business outcomes and benefits realized | ROI analysis, value stream optimization | Revenue impact, cost savings, customer satisfaction |
| Coverage | Geographic, organizational, or functional reach | Standardization initiatives, shared services | Organizational unit adoption, geographic footprint |
3. Capability-Enabling Dimensions (Maturity Sub-Dimensions)
When assessing capability maturity, architects evaluate maturity across the enabling dimensions—the fundamental components that constitute and operationalize the capability. These are derived from the People-Process-Technology framework and extended by COBIT and TOGAF standards.
3.1 The Five Enabling Dimensions (COBIT 2019 Aligned)
COBIT 2019 defines seven governance system components that apply to IT capabilities; the enterprise architecture community commonly consolidates these into five enabling dimensions applicable to all capability types:
- People & Competencies: Skills, knowledge, experience, roles, and accountability structures required to execute the capability effectively
- Process & Procedures: Standardized workflows, practices, policies, and governance frameworks that guide capability execution
- Technology & Infrastructure: Applications, platforms, tools, and infrastructure that enable and automate capability delivery
- Information & Data: Data quality, availability, governance, and information flows that support decision-making and operations
- Governance & Culture: Decision rights, accountability models, policies, culture, ethics, and behavioral norms that shape capability performance
3.2 Enabling Dimensions Detailed
People & Competencies Dimension
Definition: The human capital, skills, knowledge, experience, and organizational structures required for capability execution.
COBIT Mapping: “People, Skills, and Competencies” component
What to Measure:
- Skill coverage and proficiency levels
- Role clarity and accountability (RACI)
- Training completion and certification rates
- Workforce capacity vs. demand
- Succession planning maturity
- Employee engagement and retention
Example Assessment Statements (Likert scale scoring):
- “Each role has clearly defined competency requirements.”
- “Training programs are available and employees are certified.”
- “Capability performance is not dependent on key individuals (hero culture eliminated).”
Process & Procedures Dimension
Definition: The standardized, repeatable workflows, practices, policies, and governance frameworks that define how the capability is executed.
COBIT Mapping: “Processes” + “Principles, Policies, and Procedures” components
What to Measure:
- Process documentation completeness
- Standardization across organizational units
- Process ownership and accountability
- Exception rates and process compliance
- Continuous improvement mechanisms
- Automation level
Example Assessment Statements:
- “Each process has a clear owner who is accountable for performance.”
- “Processes are documented and accessible to all stakeholders.”
- “Processes are monitored with success measures that are regularly reported.”
- “Processes for the capability are fully optimized and efficient.”
Technology & Infrastructure Dimension
Definition: The applications, platforms, tools, infrastructure, and technical services that enable and automate capability execution.
COBIT Mapping: “Services, Infrastructure, and Applications” component
What to Measure:
- Technology standardization and consolidation
- Automation coverage and maturity
- Platform scalability and performance
- Technical debt levels
- Integration quality and API maturity
- Technology refresh cycle adherence
Example Assessment Statements:
- “Technology platforms are enterprise-standard and supported.”
- “Automation is in place for routine tasks.”
- “Systems are integrated and data flows seamlessly.”
Information & Data Dimension
Definition: The quality, availability, governance, and effective use of data and information assets that support capability execution and decision-making.
COBIT Mapping: “Information” component
What to Measure:
- Data quality dimensions (accuracy, completeness, timeliness, consistency)
- Master data management maturity
- Metadata and data lineage coverage
- Data governance and stewardship effectiveness
- Information accessibility and usability
- Analytics and insights maturity
Example Assessment Statements:
- “Data is accurate, complete, and available when needed.”
- “Data ownership and stewardship roles are defined and active.”
- “Information flows support real-time decision-making.”
Governance & Culture Dimension
Definition: The decision rights, accountability frameworks, policies, organizational culture, ethics, and behavioral norms that shape capability performance and compliance.
COBIT Mapping: “Organizational Structures” + “Principles, Policies, and Procedures” + “Culture, Ethics, and Behavior” components
What to Measure:
- Decision-making clarity and speed
- Policy compliance and enforcement
- Risk management integration
- Audit findings and remediation
- Cultural alignment (e.g., security-first, customer-first)
- Ethical behavior and values adherence
Example Assessment Statements:
- “Decision rights and accountability are clearly defined (RACI).”
- “Policies are documented, communicated, and enforced.”
- “Culture supports continuous improvement and innovation.”
3.3 Maturity Assessment Using Enabling Dimensions
To assess capability maturity, architects score each enabling dimension (typically 1-5 scale), then aggregate:
Step 1: For each enabling dimension, respondents (subject matter experts) score agreement with 3-5 statements on a 5-point Likert scale (1 = Strongly Disagree, 5 = Strongly Agree).
Step 2: Calculate average score per dimension (e.g., “Process Dimension” average = 3.8).
Step 3: Calculate overall maturity as weighted or simple average across all five dimensions.
Example:
- People: 3.2
- Process: 4.0
- Technology: 3.5
- Information: 2.8
- Governance: 3.7
- Overall Maturity Score: 3.4 (Level 3: Defined)
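The three-step aggregation above can be sketched in Python; the equal-weight default mirrors the simple-average case, and the level labels follow the CMMI-aligned 1-5 scale (the function and variable names are illustrative, not part of any standard):

```python
# Sketch of the Section 3.3 maturity aggregation; names are illustrative.
CMMI_LEVELS = {1: "Initial", 2: "Managed", 3: "Defined",
               4: "Quantitatively Managed", 5: "Optimizing"}

def overall_maturity(dimension_scores, weights=None):
    """Aggregate enabling-dimension scores (1-5) into an overall maturity score.

    weights: optional dict of per-dimension weights; defaults to a simple average.
    """
    if weights is None:
        weights = {d: 1.0 for d in dimension_scores}
    total = sum(weights[d] for d in dimension_scores)
    score = sum(dimension_scores[d] * weights[d] for d in dimension_scores) / total
    level = CMMI_LEVELS[max(1, min(5, int(score)))]  # level attained, not rounded up
    return round(score, 1), level

scores = {"People": 3.2, "Process": 4.0, "Technology": 3.5,
          "Information": 2.8, "Governance": 3.7}
print(overall_maturity(scores))  # (3.4, 'Defined')
```

A weighted variant (e.g., weighting Process and Governance higher for regulated capabilities) only changes the `weights` dict, not the method.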
4. Measures per Capability Dimension
For each capability dimension, organizations must define what to measure and why, distinguishing between qualitative and quantitative measures.
4.1 Strategic Importance Measures
Measurement Type: Primarily qualitative (expert judgment) with quantitative validation where possible.
| Measure | What It Assesses | Data Source | Measurement Method |
| Strategic Alignment Score | Contribution to strategic objectives/OKRs | Strategy documents, executive interviews | Weighted scoring against strategic themes (1-5) |
| Business Model Criticality | Role in delivering core value propositions | Business model canvas, value stream maps | Essential/Important/Supporting classification |
| Competitive Differentiation | Uniqueness and competitive advantage potential | Market analysis, competitor benchmarking | Core/Supporting capability classification |
| Revenue/Value Contribution | Direct or indirect revenue/value generation | Financial analysis, attribution modeling | Percentage of revenue/value attributed |
| Future Opportunity Enablement | Enablement of new products, markets, business models | Innovation pipeline, roadmap analysis | Number of strategic initiatives enabled |
Why Measure: Ensures investment flows to capabilities with highest strategic leverage and competitive impact.
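The weighted scoring method in the first row of the table can be sketched as follows; the strategic theme names and weights are hypothetical examples, not values prescribed by the framework:

```python
# Illustrative weighted strategic-alignment scoring (themes/weights are examples).
def strategic_alignment(theme_scores, theme_weights):
    """theme_scores: 1-5 rating per strategic theme; theme_weights sum to 1.0."""
    assert abs(sum(theme_weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return round(sum(theme_scores[t] * theme_weights[t] for t in theme_scores), 1)

weights = {"growth": 0.4, "efficiency": 0.3, "risk reduction": 0.3}
ratings = {"growth": 5, "efficiency": 3, "risk reduction": 4}
print(strategic_alignment(ratings, weights))  # 4.1
```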
4.2 Capability Maturity Measures (Per Enabling Dimension)
Measurement Type: Qualitative (structured assessments) and Quantitative (metrics-based).
People Dimension Measures
| Measure | Qualitative Indicator | Quantitative Metric |
| Skill Coverage | “All required roles have skilled resources” | % of roles with certified/proficient staff |
| Competency Gaps | “Succession plans exist for critical roles” | Number of unfilled critical roles |
| Training Effectiveness | “Training programs are current and effective” | Training completion rate, certification rate |
| Capacity Utilization | “Team has adequate capacity” | Actual hours / planned hours |
Process Dimension Measures
| Measure | Qualitative Indicator | Quantitative Metric |
| Standardization | “Processes are standardized enterprise-wide” | % of units using standard process |
| Documentation | “Processes are fully documented and accessible” | % of processes documented |
| Ownership Clarity | “Each process has an accountable owner” | % of processes with assigned owners |
| Compliance | “Processes are consistently followed” | Process compliance rate, exception rate |
| Optimization | “Processes are continuously improved” | Cycle time reduction %, defect rate reduction |
Technology Dimension Measures
| Measure | Qualitative Indicator | Quantitative Metric |
| Standardization | “Enterprise-standard platforms are used” | % of applications on standard platforms |
| Automation | “Routine tasks are automated” | % of tasks automated, deployment frequency |
| Integration | “Systems are integrated seamlessly” | API coverage %, integration failure rate |
| Technical Debt | “Technical debt is managed proactively” | Technical debt ratio, code quality score |
| Scalability | “Technology scales to meet demand” | Performance under load, elasticity index |
Information Dimension Measures
| Measure | Qualitative Indicator | Quantitative Metric |
| Data Quality | “Data is accurate and complete” | Data quality score (accuracy, completeness, timeliness) |
| Governance | “Data ownership and stewardship are active” | % of data assets with assigned stewards |
| Accessibility | “Information is available when needed” | Data availability SLA, query response time |
| Metadata Coverage | “Data is cataloged with metadata” | % of data entities with complete metadata |
Governance Dimension Measures
| Measure | Qualitative Indicator | Quantitative Metric |
| Decision Clarity | “Decision rights are clearly defined” | Decision cycle time, escalation rate |
| Policy Compliance | “Policies are enforced consistently” | Policy compliance rate, audit findings |
| Risk Management | “Risks are identified and mitigated” | Risk mitigation coverage %, residual risk score |
| Cultural Alignment | “Culture supports capability objectives” | Employee engagement score, cultural survey results |
4.3 Adaptability Measures
Measurement Type: Primarily qualitative (expert assessment) informed by quantitative indicators.
| Measure | What It Assesses | Assessment Method |
| Environmental Flexibility | Ability to adapt to regulatory, competitive, economic changes | Expert scoring: ease of regulatory compliance updates, market pivot readiness |
| Customer Responsiveness | Ability to adjust to changing customer needs | Time to implement customer-requested changes, customization flexibility |
| Demand Scalability | Ability to scale up/down with demand fluctuations | Elasticity metrics, onboarding time for new capacity |
| Technology Modernization Readiness | Ease of technology refresh and upgrade | Technology debt level, modularity score, vendor lock-in risk |
| Organizational Change Readiness | Workforce and culture adaptability | Change fatigue index, training absorption rate |
Why Measure: High-value, low-adaptability capabilities require foundational transformation before incremental improvements; adaptability predicts transformation cost and duration.
4.4 Performance Measures
Measurement Type: Quantitative (metrics and KPIs).
Performance measures are capability-specific and aligned to capability outcomes. See Section 5 for KPI examples.
4.5 Complexity Measures
Measurement Type: Quantitative (calculated metrics).
| Complexity Type | What to Measure | Example Metrics |
| Structural | Component and relationship count | Number of applications, interfaces, dependencies |
| Technical | Technology heterogeneity | Number of technology stacks, programming languages, platforms |
| Organizational | Stakeholder and governance complexity | Number of stakeholders, decision layers, cross-functional dependencies |
| Data | Information flow complexity | Number of data entities, data sources, lineage hops |
| Process | Workflow and decision complexity | Process step count, decision points, exception paths |
Why Measure: Complexity correlates with transformation cost, risk, and duration—high complexity demands phased, incremental approaches.
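One simple way to combine such counts into a comparable score is to normalize each raw count against the portfolio maximum for that complexity type and average the ratios; this is an illustrative normalization, not a standard formula:

```python
# Illustrative composite complexity score: each raw count is scaled against the
# portfolio maximum for its complexity type, then the ratios are averaged
# (0 = simplest in portfolio, 1 = at or above the portfolio maximum).
def complexity_score(counts, portfolio_max):
    ratios = [min(counts[k] / portfolio_max[k], 1.0) for k in counts]
    return round(sum(ratios) / len(ratios), 2)

# Hypothetical capability: 42 components, 6 technology stacks, 15 stakeholders
payments = {"structural": 42, "technical": 6, "organizational": 15}
maxima   = {"structural": 60, "technical": 10, "organizational": 20}
print(complexity_score(payments, maxima))  # 0.68
```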
4.6 Risk Measures
Measurement Type: Quantitative and qualitative.
| Risk Dimension | What to Measure | Example Metrics |
| Operational Risk | Single points of failure, resilience gaps | RTO/RPO compliance, redundancy coverage |
| Regulatory Risk | Compliance gaps | Number of open audit findings, compliance score |
| Cybersecurity Risk | Threat exposure | Vulnerability count, attack surface area, Mean Time to Detect/Respond |
| Competitive Risk | Capability gap vs. competitors | Maturity delta vs. industry benchmark |
5. KPIs per Capability Dimension
5.1 KPIs vs. Metrics: Critical Distinction
Metrics measure something; KPIs measure progress toward a strategic goal. KPIs are goal-oriented, SMART-structured, and drive decision-making.
| Characteristic | Metric | KPI |
| Purpose | Measure activity or state | Measure progress toward strategic goal |
| Link to Goals | May or may not link to goals | Directly linked to business objectives |
| Decision Impact | Informational | Actionable—triggers decisions |
| Example (Payment Processing) | “Transaction count” (metric) | “Payment processing error rate vs. 0.1% target” (KPI) |
Rule: If it helps achieve a key business goal, it’s a KPI; otherwise, it’s a metric.
5.2 KPIs for Strategic Importance Dimension
These are typically strategic-level KPIs used by executives for portfolio management.
| KPI | Definition | Leading/Lagging | Target Example |
| Capability Strategic Alignment Score | Weighted score of capability contribution to strategic objectives | Leading | ≥ 4.0/5.0 for Tier 1 capabilities |
| Revenue from Strategic Capabilities | % of revenue generated by capabilities scored “High” strategic importance | Lagging | ≥ 70% of revenue |
| Strategic Initiative Enablement Rate | % of strategic initiatives enabled by target capabilities | Leading | 100% of initiatives have required capabilities |
5.3 KPIs for Capability Maturity Dimension
Overall Maturity KPIs (aggregate across enabling dimensions):
| KPI | Definition | Leading/Lagging | Target Example |
| Average Capability Maturity Score | Mean maturity score (1-5) across assessed capabilities | Lagging | ≥ 3.5 (Defined to Quantitatively Managed) |
| Maturity Gap (Current vs. Target) | Number of maturity levels below target | Lagging | 0 gaps for strategic capabilities |
| Maturity Improvement Rate | Rate of maturity level increase per year | Leading | +0.5 levels/year for priority capabilities |
Enabling Dimension KPIs (examples per dimension):
People Dimension KPIs
| KPI | Leading/Lagging | Example |
| Critical Role Vacancy Rate | Leading | < 5% for critical roles |
| Certification Rate for Key Skills | Leading | ≥ 80% certified in required competencies |
| Employee Engagement Score | Leading | ≥ 4.0/5.0 for capability teams |
| Training Completion Rate | Leading | ≥ 90% completion within 90 days |
Process Dimension KPIs
| KPI | Leading/Lagging | Example |
| Process Standardization Rate | Leading | ≥ 90% of units using standard process |
| Process Compliance Rate | Lagging | ≥ 95% compliance to defined process |
| Process Cycle Time | Lagging | ≤ target cycle time (varies by process) |
| Defect/Error Rate | Lagging | ≤ 0.1% error rate |
| Process Exception Rate | Leading | ≤ 5% exceptions requiring manual intervention |
Technology Dimension KPIs
| KPI | Leading/Lagging | Example |
| Application Availability | Lagging | ≥ 99.9% uptime |
| Deployment Frequency | Leading | Daily for Agile/DevOps teams |
| Lead Time for Changes | Leading | ≤ 1 day from commit to production |
| Change Failure Rate | Lagging | ≤ 15% of deployments cause incidents |
| Mean Time to Recovery (MTTR) | Lagging | ≤ 1 hour for critical systems |
| Technical Debt Ratio | Leading | ≤ 5% of development capacity on debt remediation |
Information Dimension KPIs
| KPI | Leading/Lagging | Example |
| Data Quality Score | Lagging | ≥ 95% accuracy, completeness |
| Metadata Coverage | Leading | ≥ 90% of data entities cataloged |
| Data Availability SLA | Lagging | ≥ 99.5% availability during business hours |
| Master Data Accuracy Rate | Lagging | ≥ 98% accuracy for golden records |
Governance Dimension KPIs
| KPI | Leading/Lagging | Example |
| Policy Compliance Rate | Lagging | ≥ 95% compliance in audits |
| Decision Cycle Time | Leading | ≤ 5 days for standard decisions |
| Audit Findings (Open) | Lagging | 0 high-severity findings |
| Risk Mitigation Coverage | Leading | 100% of high risks mitigated |
5.4 KPIs for Adaptability Dimension
| KPI | Definition | Leading/Lagging | Target Example |
| Time to Implement Changes | Average time from change request to deployment | Lagging | ≤ 30 days for standard changes |
| Customization Flexibility Index | % of customer requests accommodated without architecture change | Leading | ≥ 80% of requests |
| Technology Refresh Cycle | Frequency of technology platform updates | Leading | Major refresh every 3-5 years |
5.5 KPIs for Performance Dimension (Domain-Specific Examples)
Business Capability Performance KPIs
| KPI | Leading/Lagging | Example |
| Customer Satisfaction (NPS) | Lagging | ≥ 50 NPS |
| Time-to-Market for New Products | Leading | ≤ 6 months |
| Market Share in Target Segments | Lagging | ≥ 20% market share |
| Revenue per Capability | Lagging | $X million/year |
Technology Capability Performance KPIs
| KPI | Leading/Lagging | Example |
| Infrastructure Utilization Rate | Lagging | 70-85% optimal utilization |
| Platform Scalability Index | Leading | Scales to 2x demand without performance degradation |
| Energy Efficiency per Workload | Lagging | ≤ X kWh per transaction |
Security Capability Performance KPIs
| KPI | Leading/Lagging | Example |
| Mean Time to Detect (MTTD) Threats | Leading | ≤ 15 minutes |
| Mean Time to Respond (MTTR) to Incidents | Lagging | ≤ 1 hour |
| Vulnerability Remediation Rate | Leading | ≥ 95% critical vulnerabilities patched within 7 days |
| Security Control Effectiveness Score | Lagging | ≥ 90% controls effective |
5.6 Leading vs. Lagging Indicators by Dimension
A balanced capability measurement system requires both leading (predictive, proactive) and lagging (historical, confirmatory) indicators:
| Dimension | Leading Indicators (Predictive) | Lagging Indicators (Historical) |
| Strategic Importance | Strategic initiative pipeline, innovation investment | Revenue contribution, market share |
| People | Training completion, certification rate, vacancy rate | Employee retention, engagement scores |
| Process | Process standardization rate, exception rate | Cycle time, defect rate, compliance rate |
| Technology | Deployment frequency, code quality, automation coverage | Availability, MTTR, change failure rate |
| Information | Metadata coverage, data stewardship assignment | Data quality scores, SLA compliance |
| Governance | Policy publication rate, risk mitigation plans | Audit findings, compliance scores |
| Performance | Customer engagement metrics, pipeline velocity | Revenue, NPS, market share |
Recommended Balance: 60-70% leading indicators, 30-40% lagging indicators to enable proactive management while maintaining accountability.
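Checking a KPI set against the recommended 60-70% leading mix is straightforward; the sketch below uses KPI names drawn from the tables above, with the function name as an illustrative choice:

```python
# Check a KPI set against the recommended 60-70% leading-indicator mix.
def leading_share(kpis):
    """kpis: list of (name, kind) tuples, kind in {'leading', 'lagging'}."""
    return sum(1 for _, kind in kpis if kind == "leading") / len(kpis)

kpis = [("Deployment Frequency", "leading"),
        ("Certification Rate for Key Skills", "leading"),
        ("Metadata Coverage", "leading"),
        ("Application Availability", "lagging"),
        ("Change Failure Rate", "lagging")]
share = leading_share(kpis)
print(f"{share:.0%} leading; balanced: {0.60 <= share <= 0.70}")  # 60% leading; balanced: True
```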
6. Cross-Domain Consistency: Universal vs. Domain-Specific Dimensions
6.1 Universal Dimensions (All Capability Domains)
The following dimensions apply consistently across business, technology, application, data, and security capabilities:
✓ Strategic Importance: All capabilities have strategic alignment
✓ Capability Maturity (and its five enabling dimensions):
- People & Competencies
- Process & Procedures
- Technology & Infrastructure
- Information & Data
- Governance & Culture
✓ Adaptability: All capabilities face change requirements
✓ Performance: All capabilities have measurable outcomes
✓ Complexity: All capabilities have structural, technical, organizational complexity
✓ Risk: All capabilities have operational, compliance, security risks
6.2 Domain-Specific Dimension Emphasis
While all dimensions apply universally, emphasis varies by domain:
| Domain | Primary Dimension Focus | Unique Considerations |
| Business | Strategic Importance, Performance (revenue, customer satisfaction) | Market differentiation, competitive advantage |
| Technology | Maturity (automation, standardization), Performance (availability, scalability) | Infrastructure resilience, technology lifecycle |
| Application | Maturity (reliability, maintainability), Adaptability (feature velocity) | Technical debt, integration quality |
| Data | Maturity (data quality, governance), Performance (accessibility, accuracy) | Master data management, data lineage |
| Security | Risk, Maturity (control effectiveness), Performance (MTTD, MTTR) | Threat landscape, compliance posture |
6.3 Reusability Across Use Cases
The same set of dimensions can be reused across multiple EA use cases:
| Use Case | Dimensions Used | Output |
| Capability Heat Maps | Strategic Importance (x-axis) + Maturity (y-axis) + Performance (color) | Prioritization quadrants |
| Maturity Assessments | Maturity (five enabling dimensions) | Maturity baseline, gap analysis |
| Investment Prioritization | Strategic Importance + Maturity Gap + Complexity + Risk | Weighted priority scores |
| Transformation Roadmaps | Maturity Gap + Adaptability + Complexity | Sequencing and phasing |
| Architecture Governance | Maturity + Risk + Complexity | Governance rigor levels |
7. Standards Mapping: Capability Dimensions to Frameworks
7.1 TOGAF Mapping
| TOGAF Concept | Capability Dimension Mapping | Details |
| Capability-Based Planning | Strategic Importance, Maturity, Adaptability | TOGAF recommends assessing capabilities along these three axes |
| Capability Increment | Maturity (phased improvement) | Discrete portions of capability that deliver incremental value |
| Architecture Vision (Phase A) | Strategic Importance | Aligns capabilities to business strategy |
| Business Architecture (Phase B) | All dimensions (comprehensive assessment) | Capability maps, heat maps, gap analysis |
| Gap Analysis | Maturity (current vs. target), Performance (current vs. target) | Identifies capability deficiencies |
| Migration Planning (Phase E-F) | Adaptability, Complexity, Risk | Informs sequencing and transition planning |
7.2 COBIT 2019 Mapping
| COBIT 2019 Component | Capability Dimension Mapping | Details |
| Seven Governance System Components | Five Enabling Dimensions (maturity sub-dimensions) | COBIT’s 7 components map to the 5 enabling dimensions: |
| 1. Processes | Process & Procedures Dimension | Step-by-step governance activities |
| 2. Organizational Structures | Governance & Culture Dimension | Decision-making entities |
| 3. Principles, Policies, Procedures | Process & Procedures + Governance Dimensions | Rules and guidance |
| 4. Information | Information & Data Dimension | Quality, availability, relevance |
| 5. Culture, Ethics, Behavior | Governance & Culture Dimension | Human element of governance |
| 6. People, Skills, Competencies | People & Competencies Dimension | Capability and expertise |
| 7. Services, Infrastructure, Applications | Technology & Infrastructure Dimension | Technological enablers |
| Performance Management (CPM) | Capability Maturity (0-5 levels), Performance (KPIs) | Capability assessment and measurement framework |
7.3 CMMI Mapping
| CMMI Concept | Capability Dimension Mapping | Details |
| Practice Areas | Capability domains (e.g., Process Management, Governance) | CMMI organizes practices into areas aligned to capabilities |
| Maturity Levels (1-5) | Capability Maturity Dimension | Initial, Managed, Defined, Quantitatively Managed, Optimizing |
| Process Performance Measures | Process Dimension KPIs | Cycle time reduction, defect rates |
| Organizational Performance | Performance Dimension KPIs | Project success rate, productivity |
7.4 NIST CSF 2.0 Mapping
| NIST CSF 2.0 Component | Capability Dimension Mapping | Details |
| Implementation Tiers (1-4) | Capability Maturity for Security Capabilities | Partial, Risk Informed, Repeatable, Adaptive |
| Functions (Govern, Identify, Protect, Detect, Respond, Recover) | Security capability domains | Top-level security capability categories |
| Tier Assessment Criteria | Five enabling dimensions (People, Process, Technology, Information, Governance) | Maturity assessed across organizational dimensions |
7.5 SAFe Mapping
| SAFe Concept | Capability Dimension Mapping | Details |
| Lean Portfolio Management | Strategic Importance, Performance, Risk | Capability-based investment and prioritization |
| WSJF (Weighted Shortest Job First) | Strategic Importance (Business Value + Time Criticality + Risk Reduction) / (Complexity) | Prioritization using capability dimensions |
| Business Agility Assessment | Maturity (across SAFe competencies) | Organizational capability maturity |
| Measure and Grow | Performance Dimension KPIs (Flow, Outcomes, Competency) | Capability performance metrics |
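The WSJF row above reduces to a simple ratio. The sketch below uses the table's "Complexity" term as the divisor (SAFe itself calls this Job Size), with hypothetical capability epics and relative scores:

```python
# WSJF as mapped above: cost-of-delay components divided by complexity/job size.
def wsjf(business_value, time_criticality, risk_reduction, complexity):
    return (business_value + time_criticality + risk_reduction) / complexity

# Hypothetical capability epics, scored on a relative (Fibonacci-style) scale:
epics = {"Payment Modernization": wsjf(13, 8, 5, 8),   # 3.25
         "CRM Consolidation":     wsjf(8, 3, 2, 5)}    # 2.60
for name, score in sorted(epics.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.2f}")  # highest WSJF is done first
```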
7.6 IT4IT Mapping
| IT4IT Component | Capability Dimension Mapping | Details |
| Value Streams (4) | IT capability domains | Strategy to Portfolio, Requirement to Deploy, Request to Fulfill, Detect to Correct |
| Functional Components | Capability building blocks | Specific IT capabilities within value streams |
| IT4IT KPIs | Performance Dimension KPIs per value stream | E.g., S2P: Business-IT Alignment, Service Portfolio Rationalization |
| Service Portfolio Management | Strategic Importance, Investment Level | Capability prioritization and rationalization |
8. Practical Application Guidance
8.1 Using Dimensions in Capability Assessments
5-Step Assessment Process:
1. Select Capabilities: Identify 5-15 capabilities to assess (start with strategic priorities)
2. Define Assessment Team: Assemble subject matter experts, capability owners, and architects
3. Score Dimensions: For each capability, score:
   - Strategic Importance (1-5)
   - Maturity across the five enabling dimensions (1-5 per dimension, using Likert statements)
   - Adaptability (1-5)
   - Performance (against KPI targets)
   - Complexity (calculated metrics)
   - Risk (assessment scores)
4. Aggregate & Analyze: Calculate overall scores, create heat maps, and perform gap analysis
5. Prioritize & Roadmap: Use dimension scores to prioritize improvement initiatives
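The aggregation step above can be sketched in code. A minimal example, assuming equal weighting of the five enabling dimensions (the framework does not mandate weights) and hypothetical scores:

```python
# Sketch of the "Aggregate & Analyze" step. Equal weighting of the five
# enabling dimensions and all capability scores are illustrative assumptions.
from statistics import mean

def maturity_score(enabling_scores: dict[str, float]) -> float:
    """Roll the five enabling-dimension scores (1-5) into one maturity score."""
    expected = {"people", "process", "technology", "information", "governance"}
    assert set(enabling_scores) == expected, "score all five enabling dimensions"
    return round(mean(enabling_scores.values()), 1)

assessment = {
    "strategic_importance": 5,
    "maturity": maturity_score({
        "people": 3, "process": 2, "technology": 3,
        "information": 2, "governance": 3,
    }),
    "adaptability": 2,
}
print(assessment["maturity"])  # 2.6
```

A weighted mean can replace the plain mean where one enabling dimension matters more for a given capability; the key design point is that the roll-up rule is explicit and repeatable across capabilities.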
8.2 Capability Heat Map Construction
Two-Dimensional Heat Map (most common):
- X-Axis: Strategic Importance (1-5)
- Y-Axis: Capability Maturity (1-5)
- Color: Performance (Green = meeting targets, Yellow = partial, Red = underperforming)
Interpretation:
- High Importance, Low Maturity (Bottom-Right Quadrant): Priority transformation candidates—invest immediately
- High Importance, High Maturity (Top-Right Quadrant): Sustain and optimize—maintain investment
- Low Importance, Low Maturity (Bottom-Left Quadrant): Monitor or divest—minimal investment
- Low Importance, High Maturity (Top-Left Quadrant): Rationalization candidates—reduce cost, consider shared services
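The quadrant interpretation above reduces to a simple classification rule. A sketch, assuming a 3.0 midpoint on both 1-5 scales (the cut-off is an illustrative choice, not part of the framework):

```python
# Classify a capability into one of the four heat-map quadrants.
# The 3.0 midpoint is an assumed threshold for illustration.
def quadrant(importance: float, maturity: float, midpoint: float = 3.0) -> str:
    high_imp = importance >= midpoint
    high_mat = maturity >= midpoint
    if high_imp and not high_mat:
        return "Priority transformation - invest immediately"
    if high_imp and high_mat:
        return "Sustain and optimize"
    if not high_imp and not high_mat:
        return "Monitor or divest"
    return "Rationalization candidate"

print(quadrant(5, 2.6))  # Priority transformation - invest immediately
```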
8.3 Using Dimensions in Complexity Analysis
Complexity scores inform:
- Transformation Approach: High complexity → phased, incremental; low complexity → rapid, big-bang
- Build vs. Buy: High technical complexity + low business specificity → buy; high business specificity + moderate complexity → build
- Risk Mitigation: Complexity score correlates with project risk—calibrate contingency buffers accordingly
8.4 Using Dimensions in Maturity Models
Maturity models assess capabilities across enabling dimensions using structured criteria (e.g., CMMI levels):
| Maturity Level | People | Process | Technology | Information | Governance |
| 1: Initial | Ad hoc staffing | Unpredictable | Minimal automation | Poor data quality | Reactive decisions |
| 2: Managed | Roles defined | Repeatable at project level | Some automation | Managed data quality | Planned governance |
| 3: Defined | Standard competencies | Standardized processes | Integrated platforms | Governed data | Proactive governance |
| 4: Quantitatively Managed | Competency metrics tracked | Process metrics drive decisions | Performance monitored | Data metrics tracked | Metrics-based governance |
| 5: Optimizing | Continuous learning culture | Continuous process improvement | AI/ML optimization | Data-driven insights | Innovation-driven governance |
8.5 Common Mistakes and Anti-Patterns
❌ Confusing Dimensions with Capabilities
Dimensions are assessment axes, not capabilities themselves. Don’t say “Our Process Dimension is strong”—say “Our Payment Processing capability has strong process maturity.”
❌ Over-Engineering Assessments
Avoid 50-question surveys per capability. Use 3-5 statements per enabling dimension (15-25 total) to balance rigor and practicality.
❌ Measuring Without Action
Assessment without improvement planning is waste. Every dimension assessment must inform a prioritized action (invest, sustain, divest, transform).
❌ Ignoring Adaptability
High strategic importance + low maturity + low adaptability = transformation disaster. Always assess adaptability before committing to aggressive timelines.
❌ All Lagging Indicators
Lagging-only KPIs create reactive management. Balance with 60-70% leading indicators to enable proactive adjustments.
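The leading/lagging balance is easy to verify mechanically when assembling a KPI set. A sketch with hypothetical KPI names (the 60-70% target comes from the guidance above):

```python
# Check a KPI set against the recommended 60-70% leading-indicator share.
# All KPI names below are hypothetical placeholders.
def leading_share(kpis: dict[str, str]) -> float:
    """kpis maps KPI name -> 'leading' or 'lagging'; returns leading fraction."""
    leading = sum(1 for kind in kpis.values() if kind == "leading")
    return leading / len(kpis)

kpis = {
    "automation coverage": "leading",
    "training hours per FTE": "leading",
    "change failure rate": "leading",
    "quarterly error rate": "lagging",
    "cost per transaction": "lagging",
}
share = leading_share(kpis)
print(f"{share:.0%}", "balanced" if 0.6 <= share <= 0.7 else "rebalance")
```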
8.6 Practical Example: Payment Processing Capability Assessment
Capability: Payment Processing
| Dimension | Score | Key Findings | Action |
| Strategic Importance | 5/5 | Core to revenue; regulatory criticality | Sustain high investment |
| Maturity | 2.7/5 | Low automation, inconsistent processes, siloed data | Priority transformation |
| – People | 3/5 | Skilled staff but high turnover risk | Succession planning |
| – Process | 2/5 | Not standardized across regions | Standardize processes |
| – Technology | 3/5 | Legacy platforms, minimal API integration | Platform modernization |
| – Information | 2/5 | Poor transaction data quality | Data governance program |
| – Governance | 3.5/5 | Clear ownership but weak controls | Strengthen controls |
| Adaptability | 2/5 | Hard-coded logic, vendor lock-in | Modularize architecture |
| Performance | Red | Error rate 2.5% (target 0.1%) | Immediate remediation |
| Complexity | 78/100 | High—15 systems, 8 data sources, 12 stakeholders | Phased transformation |
| Risk | High | Regulatory non-compliance risk, single points of failure | Risk mitigation plan |
Recommendation: Urgent Transformation Required—High strategic importance with low maturity, high complexity, and high risk. Launch phased modernization program with process standardization (Phase 1), platform consolidation (Phase 2), data governance (Phase 3).
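The recommendation logic of this worked example can be expressed as a simple rule. The thresholds below are illustrative assumptions, not canonical values from the framework:

```python
# Encode the decision pattern from the worked example: high strategic
# importance with low maturity and high complexity or risk triggers an
# urgent, phased transformation. Thresholds (4, 3, 60) are assumed.
def recommend(importance: float, maturity: float,
              complexity: int, risk: str) -> str:
    if importance >= 4 and maturity < 3 and (complexity >= 60 or risk == "High"):
        return "Urgent transformation (phased program)"
    if importance >= 4 and maturity >= 3:
        return "Sustain and optimize"
    return "Monitor / rationalize"

print(recommend(importance=5, maturity=2.8, complexity=78, risk="High"))
# Urgent transformation (phased program)
```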
9. Comprehensive Mapping Table: Dimensions to Standards
| Capability Dimension | TOGAF | COBIT 2019 | CMMI | NIST CSF | SAFe | IT4IT |
| Strategic Importance | Capability-Based Planning, Architecture Vision | Design Factors (Strategy alignment) | N/A | Governance objectives | WSJF (Business Value), Strategic Themes | S2P Value Stream (Portfolio alignment) |
| Capability Maturity | Architecture Maturity Models | Capability Maturity (0-5 levels) | Maturity Levels (1-5) | Implementation Tiers (1-4) | Competency Assessments | Maturity per value stream |
| – People | Workforce planning | People, Skills, Competencies | Workforce Empowerment (WE) | N/A | Teams & Technical Agility | Organizational structures |
| – Process | Process Architecture | Processes | Process Management (PCM) | N/A | Agile Product Delivery | Value stream processes |
| – Technology | Technology Architecture | Services, Infrastructure, Applications | Implementation Infrastructure (II) | N/A | DevOps, Cloud adoption | Functional components |
| – Information | Information Systems Architecture | Information | Data Management (DM), Data Quality (DQ) | N/A | Lean-Agile Budgets (transparency) | Information Model |
| – Governance | Architecture Governance | Org Structures, Principles/Policies, Culture | Governance (GOV) | Govern Function | Lean Portfolio Management | Governance integration |
| Adaptability | Migration Planning, Transition Architectures | Design Factors (Agility) | N/A | N/A | Business Agility, Organizational Agility | N/A |
| Performance | Value Realization, Architecture KPIs | Performance Management (CPM) | Process performance, Org performance | Function-specific metrics | Measure & Grow (Flow, Outcomes) | IT4IT KPIs per value stream |
| Complexity | Not explicitly defined | Not explicitly defined | Not explicitly defined | Not explicitly defined | Not explicitly defined | Not explicitly defined |
| Risk | Risk Management, Architecture Compliance | Risk management practices | Risk & Opportunity Management (RSK) | Identify, Assess functions | Risk-Adjusted Value | Risk management |
10. Conclusion and Recommendations
Summary
Capability dimensions provide the multi-faceted assessment framework essential for understanding, measuring, and managing enterprise capabilities. The three foundational dimensions—Strategic Importance, Capability Maturity, and Adaptability—combined with the five enabling dimensions (People, Process, Technology, Information, Governance) form a comprehensive, standards-aligned model applicable across all capability domains.
Key Takeaways:
- Dimensions ≠ Capabilities: Dimensions are the measurement axes; capabilities are the organizational abilities being measured
- Universal + Domain-Specific: Core dimensions apply everywhere; emphasis shifts by domain (business, technology, data, security)
- Measures → KPIs: Define what to measure (qualitative + quantitative) per dimension, then derive goal-oriented KPIs
- Leading + Lagging Balance: 60-70% leading, 30-40% lagging indicators for proactive management
- Standards Alignment: Dimensions map directly to TOGAF, COBIT, CMMI, NIST CSF, SAFe, and IT4IT constructs
Recommended Actions
For Enterprise Architects:
- Adopt the Three Foundational Dimensions: Assess all strategic capabilities on Strategic Importance, Maturity, and Adaptability
- Use Five Enabling Dimensions for Maturity: Structure maturity assessments across People, Process, Technology, Information, Governance
- Build Heat Maps: Visualize Strategic Importance (x) × Maturity (y) × Performance (color) for portfolio prioritization
- Define Dimension-Specific KPIs: Establish 3-5 KPIs per dimension, balancing leading/lagging indicators
For Architecture Governance Boards:
- Standardize Dimension Framework: Mandate the use of this dimension model for all capability assessments
- Require Dimension-Based Business Cases: All transformation initiatives must demonstrate capability dimension improvements
- Track Maturity Evolution: Quarterly reassessment of strategic capability maturity scores
For CIOs and Business Leaders:
- Invest Based on Dimensions: Prioritize high strategic importance + low maturity + high adaptability capabilities for quick wins; phase high complexity transformations
- Monitor Dimension KPIs: Embed dimension-based KPIs into executive dashboards and quarterly business reviews
- Balance Portfolio: Ensure investment across all maturity dimensions—avoid over-indexing on technology while neglecting people, process, or governance
Framework Alignment: TOGAF, COBIT 2019, CMMI V3.0, NIST CSF 2.0, SAFe 6.0, IT4IT