
Standard: Change approval processes are lightweight, automated, and do not introduce delivery delay

Purpose and Strategic Importance

Change approval processes exist to reduce risk — but when they rely on manual gates, committee reviews, and synchronous sign-off, they become the primary bottleneck to delivery flow. This standard establishes that approval processes must be redesigned around trust, evidence, and automation rather than bureaucratic ceremony. High-trust teams with strong test coverage, observability, and automated quality gates do not need a Change Advisory Board to approve every deployment. The risk is managed earlier in the pipeline, not at the point of release.

Aligned to our "Engineering Excellence First" policy, this standard pushes organisations to replace heavyweight approval theatre with automated change classification, pre-approved change types, and peer-reviewed deployment processes backed by audit trails. The DORA change failure rate — not process compliance — becomes the primary indicator of whether the change process is working. Teams that adopt this standard deploy more frequently, with greater confidence, and with a measurable reduction in change-related incidents.
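Automated change classification can be made concrete with a small rule. The sketch below is illustrative only: the evidence fields (`tests_passed`, `coverage`, `peer_reviewed`, `touches_prod_config`) and the 80% coverage threshold are assumptions for this example, not a prescribed schema.

```python
# Hypothetical sketch: route a change onto a pre-approved path when pipeline
# evidence meets the bar, and to manual review otherwise.
from dataclasses import dataclass

@dataclass
class ChangeEvidence:
    tests_passed: bool        # did the full automated test suite pass?
    coverage: float           # line coverage for the changed service, 0.0-1.0
    peer_reviewed: bool       # has at least one peer approved the change?
    touches_prod_config: bool # production configuration edits need extra scrutiny

def classify_change(evidence: ChangeEvidence,
                    coverage_threshold: float = 0.80) -> str:
    """Return 'pre-approved' when evidence meets the bar, else 'manual-review'."""
    if (evidence.tests_passed
            and evidence.peer_reviewed
            and evidence.coverage >= coverage_threshold
            and not evidence.touches_prod_config):
        return "pre-approved"
    return "manual-review"
```

In this shape, the "approval" decision is a deterministic function of evidence the pipeline already holds, so it can run on every change with no synchronous human gate.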

Strategic Impact

  • Eliminates synchronous approval bottlenecks that delay deployments by hours or days
  • Shifts risk management left into automated quality gates, tests, and observability rather than manual review
  • Enables continuous delivery by treating low-risk changes as pre-approved based on evidence
  • Creates auditable, automated approval trails that satisfy compliance requirements without slowing flow

Risks of Not Having This Standard

  • Manual CAB-style approvals create queues that delay time-to-production for every change, regardless of risk
  • Infrequent deployments due to approval overhead increase batch size, raising the true risk of each release
  • Engineers experience frustration and disengagement when bureaucratic processes slow high-quality work
  • Compliance obligations are met through documentation theatre rather than genuine risk controls
  • Teams optimise for fewer, larger releases to reduce approval overhead, directly contradicting continuous delivery principles

CMMI Maturity Model

Level 1 – Initial

  • People & Culture: Change processes are seen as a compliance obligation rather than a risk tool.
  • Process & Governance: All changes require manual sign-off from a centralised CAB or equivalent body.
  • Technology & Tools: Change requests are raised and tracked manually in spreadsheets or ticketing tools.
  • Measurement & Metrics: Approval lead time and change failure rate are not tracked or reported.

Level 2 – Managed

  • People & Culture: Some teams begin questioning whether all changes require full CAB review.
  • Process & Governance: A subset of standard change types is pre-approved, reducing some overhead.
  • Technology & Tools: Change management tooling (e.g., ServiceNow) is used but still primarily manual.
  • Measurement & Metrics: Time from change raised to approval is tracked for some change categories.

Level 3 – Defined

  • People & Culture: Teams understand the relationship between deployment frequency and change risk.
  • Process & Governance: Change types are formally classified; low-risk changes follow automated approval paths.
  • Technology & Tools: CI/CD pipelines generate change evidence automatically for audit and compliance purposes.
  • Measurement & Metrics: Change failure rate (DORA) is tracked as the primary indicator of process effectiveness.

Level 4 – Quantitatively Managed

  • People & Culture: Teams are trusted to deploy autonomously when quality gates and evidence thresholds are met.
  • Process & Governance: Automated approval is granted based on pipeline outcome, coverage thresholds, and risk scoring.
  • Technology & Tools: Deployment pipelines emit structured audit logs that satisfy compliance requirements without manual review.
  • Measurement & Metrics: Approval lead time approaches zero for pre-approved change types; change failure rate trends downward.
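The Level 4 behaviour of pipelines emitting compliance evidence can be as simple as writing one structured record per deployment. The field names below are illustrative assumptions about what an auditor might need, not a mandated format.

```python
# Hypothetical sketch: serialise one deployment's approval evidence as a
# JSON audit line, produced by the pipeline rather than raised manually.
import json
from datetime import datetime, timezone

def audit_record(change_id: str, approved_by: str,
                 pipeline_url: str, risk_score: float) -> str:
    """Return a single machine-readable audit record for a deployment."""
    return json.dumps({
        "change_id": change_id,
        "approved_by": approved_by,      # a peer reviewer, or "pipeline" for auto-approval
        "pipeline_url": pipeline_url,    # link back to the run that produced the evidence
        "risk_score": risk_score,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }, sort_keys=True)
```

Because every deployment emits a record like this automatically, the audit trail is complete by construction, which is what lets compliance be satisfied without a manual review step.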

Level 5 – Optimising

  • People & Culture: Engineers treat every deployment as a routine, low-risk event underpinned by strong safety nets.
  • Process & Governance: Approval processes are continuously refined based on failure data and delivery telemetry.
  • Technology & Tools: Risk-based deployment gates dynamically adjust approval requirements based on real-time system state.
  • Measurement & Metrics: Change failure rate and deployment frequency are industry-leading; approval overhead is negligible.
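A dynamic, risk-based gate of the kind described at Level 5 can be sketched as a function of the change's base risk plus live system signals. All thresholds and signal names here (error rate, open incident count, the 0.3/0.6 cut-offs) are invented for illustration.

```python
# Hypothetical sketch: tighten the required approval path when real-time
# system health degrades, and relax it when the system is healthy.
def approval_requirement(base_risk: float, error_rate: float,
                         open_incidents: int) -> str:
    """Map a change's risk score plus live system state to an approval path."""
    effective_risk = base_risk
    if error_rate > 0.01:       # elevated production error rate raises the bar
        effective_risk += 0.2
    if open_incidents > 0:      # active incidents raise it further
        effective_risk += 0.3
    if effective_risk < 0.3:
        return "auto-approve"
    if effective_risk < 0.6:
        return "peer-review"
    return "manual-review"
```

The same change can therefore auto-approve on a quiet afternoon and require peer review during an incident, which is the "dynamically adjust" behaviour the maturity level calls for.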

Key Measures

  • Change failure rate (DORA metric): percentage of deployments causing a production incident or rollback
  • Approval lead time: average time from change ready to approved and deployable
  • Percentage of changes classified as pre-approved or standard versus requiring manual review
  • Deployment frequency: number of production deployments per team per week
  • Mean time to restore (MTTR) following a failed change
  • Ratio of automated audit evidence generated per deployment to manual change documentation raised
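Two of the measures above can be derived directly from deployment records. The record shape below (a `failed` flag and ready/approved timestamps expressed as hour offsets) is an illustrative assumption.

```python
# Hypothetical sketch: compute change failure rate and approval lead time
# from a list of per-deployment records.
def change_failure_rate(deployments: list[dict]) -> float:
    """Fraction of deployments that caused a production incident or rollback."""
    if not deployments:
        return 0.0
    failed = sum(1 for d in deployments if d.get("failed", False))
    return failed / len(deployments)

def mean_approval_lead_time(deployments: list[dict]) -> float:
    """Average hours from change-ready to approved, over records with both timestamps."""
    lead_times = [d["approved_h"] - d["ready_h"]
                  for d in deployments
                  if "approved_h" in d and "ready_h" in d]
    return sum(lead_times) / len(lead_times) if lead_times else 0.0
```

Computing these from pipeline data, rather than self-reported tickets, keeps the measures honest: a team cannot improve its change failure rate by writing better documentation, only by shipping safer changes.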

Associated Policies
  • Engineering Excellence First
