Standard: Minimise data hops and transformation steps between source and insight
Purpose and Strategic Importance
This standard ensures that data pipelines are designed to minimise the number of intermediate hops and transformation steps between the original data source and the final insight consumers. Fewer layers of data movement and processing mean lower end-to-end latency, faster delivery of insights, and a simpler architecture.
It supports the policy “Shorten the Value Stream” by streamlining data flow, improving responsiveness, and lowering operational costs. Without this standard, excessive transformations introduce delays, increase errors, and reduce data freshness.
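To make the intent concrete, the sketch below contrasts a multi-hop pipeline, in which each stage materialises an intermediate copy, with a consolidated single-pass equivalent. It is an illustrative Python sketch only; the stage names and data shapes are assumptions, not part of the standard.

```python
# Illustrative sketch only: "land_raw", "standardise", and "aggregate" are
# hypothetical stage names. The multi-hop version materialises a copy at every
# layer; the consolidated version goes from source to insight in one pass.
from statistics import mean

source_rows = [
    {"sensor": "a", "reading": 21.5},
    {"sensor": "a", "reading": 22.1},
    {"sensor": "b", "reading": 19.8},
]

# Multi-hop: three stages, each producing an intermediate data set.
def land_raw(rows):
    return [dict(r) for r in rows]  # hop 1: raw landing copy

def standardise(rows):
    return [{**r, "reading_c": r["reading"]} for r in rows]  # hop 2: conformed copy

def aggregate(rows):
    by_sensor = {}
    for r in rows:
        by_sensor.setdefault(r["sensor"], []).append(r["reading_c"])
    return {s: mean(v) for s, v in by_sensor.items()}  # hop 3: insight

insight_multi_hop = aggregate(standardise(land_raw(source_rows)))

# Consolidated: the same insight computed in a single pass, no intermediate copies.
def source_to_insight(rows):
    by_sensor = {}
    for r in rows:
        by_sensor.setdefault(r["sensor"], []).append(r["reading"])
    return {s: mean(v) for s, v in by_sensor.items()}

assert source_to_insight(source_rows) == insight_multi_hop  # same insight, fewer hops
```

The consolidated form removes two hops without changing the result, which is the trade this standard asks teams to look for when reviewing pipeline designs.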
Strategic Impact
- Speeds up time-to-insight for business and operational decisions
- Reduces processing latency and resource consumption
- Decreases complexity and potential points of failure in data pipelines
- Enhances data quality by reducing transformation errors
- Supports scalable and maintainable data architectures
Risks of Not Having This Standard
- Increased latency in data delivery and insights
- Higher operational costs due to inefficient processing
- Greater risk of data inconsistencies and errors
- Complex pipelines that are hard to maintain and troubleshoot
- Reduced agility in responding to changing data needs
CMMI Maturity Model
Level 1 – Initial
| Category | Description |
| --- | --- |
| People & Culture | Data pipelines have many unnecessary hops and transformations. |
| Process & Governance | No formal efforts to optimise or minimise data flow complexity. |
| Technology & Tools | Limited tooling to analyse or visualise data flow paths. |
| Measurement & Metrics | No metrics track pipeline latency or transformation counts. |
Level 2 – Managed
| Category | Description |
| --- | --- |
| People & Culture | Some teams identify inefficiencies but lack consistent optimisation practices. |
| Process & Governance | Basic guidelines encourage simplification of data flow and transformations. |
| Technology & Tools | Tools support partial visualisation and monitoring of pipelines. |
| Measurement & Metrics | Some measurement of latency and transformation counts exists. |
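The measurement practices described at this level can begin with lightweight instrumentation. The sketch below is one hypothetical approach, assuming in-process Python transformations: a decorator records the count and duration of each transformation step so hop counts and latency are measured rather than estimated.

```python
# Hypothetical instrumentation sketch, assuming in-process Python transforms.
# A decorator records the count and duration of each transformation step so
# hop counts and latency are measured rather than estimated.
import time
from functools import wraps

step_log = []  # one entry per executed transformation step

def measured_step(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = func(*args, **kwargs)
        step_log.append({"step": func.__name__,
                         "seconds": time.perf_counter() - start})
        return result
    return wrapper

@measured_step
def clean(rows):
    return [r for r in rows if r is not None]

@measured_step
def enrich(rows):
    return [{"value": r, "doubled": r * 2} for r in rows]

enrich(clean([1, None, 2, 3]))
print(f"{len(step_log)} transformation steps, "
      f"{sum(e['seconds'] for e in step_log):.6f}s total")
```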
Level 3 – Defined
| Category | Description |
| --- | --- |
| People & Culture | Data flow optimisation is a consistent focus in pipeline design and operation. |
| Process & Governance | Formal processes govern pipeline complexity and performance optimisation. |
| Technology & Tools | Integrated platforms provide end-to-end monitoring and analytics. |
| Measurement & Metrics | Metrics inform continuous improvement of data flow efficiency and latency. |
Level 4 – Quantitatively Managed
| Category | Description |
| --- | --- |
| People & Culture | Data-driven strategies optimise pipeline design for minimal hops and transformations. |
| Process & Governance | Pipeline metrics influence architectural decisions and operational priorities. |
| Technology & Tools | Advanced analytics predict bottlenecks and recommend design improvements. |
| Measurement & Metrics | Quantitative correlations link pipeline efficiency to business outcomes. |
Level 5 – Optimising
| Category | Description |
| --- | --- |
| People & Culture | Continuous refinement of pipeline design driven by real-time data and feedback. |
| Process & Governance | Governance adapts dynamically to evolving data needs and technologies. |
| Technology & Tools | AI-assisted optimisation automates pipeline tuning and anomaly detection. |
| Measurement & Metrics | Organisational maturity in data flow design drives faster, more reliable insights. |
Key Measures
- Average data latency from source to insight
- Number of transformation steps per data pipeline
- Resource consumption associated with data processing
- Data quality and error rates related to transformations
- User satisfaction with data freshness and reliability
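These measures can be derived from basic pipeline run metadata. The following sketch illustrates two of them, average source-to-insight latency and transformation steps per pipeline; the field names and sample records are hypothetical, not a prescribed schema.

```python
# Hypothetical illustration of two key measures: average source-to-insight
# latency and transformation steps per pipeline. Field names and sample
# records are assumptions, not a prescribed schema.
from datetime import datetime, timezone

pipeline_runs = [
    {"pipeline": "sales_daily",
     "source_ts": datetime(2024, 5, 1, 2, 0, tzinfo=timezone.utc),
     "insight_ts": datetime(2024, 5, 1, 2, 45, tzinfo=timezone.utc),
     "transform_steps": 7},
    {"pipeline": "stock_hourly",
     "source_ts": datetime(2024, 5, 1, 2, 0, tzinfo=timezone.utc),
     "insight_ts": datetime(2024, 5, 1, 2, 10, tzinfo=timezone.utc),
     "transform_steps": 2},
]

def average_latency_minutes(runs):
    """Average elapsed time from source availability to published insight."""
    deltas = [(r["insight_ts"] - r["source_ts"]).total_seconds() / 60 for r in runs]
    return sum(deltas) / len(deltas)

def steps_per_pipeline(runs):
    """Transformation step count per pipeline, tracked as a trend over time."""
    return {r["pipeline"]: r["transform_steps"] for r in runs}

print(f"Average latency: {average_latency_minutes(pipeline_runs):.1f} min")
print(f"Steps per pipeline: {steps_per_pipeline(pipeline_runs)}")
```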