Standard: Code Coverage

Description

Code Coverage measures the percentage of your source code that is executed during automated test runs. It’s a common proxy for test completeness and is useful for identifying untested or high-risk areas of the codebase.

While high code coverage doesn't guarantee quality, low coverage often signals poor testability, untested behaviour, and a higher risk of undetected regressions. This metric helps teams prioritise test investments and build confidence in their safety net.

How to Use

What to Measure

  • The percentage of lines, functions, or branches exercised by tests during automated runs.
  • Break down by module, service, or repository for actionable insights.

Formula

Code Coverage (%) = (Covered Lines / Total Executable Lines) × 100

The same calculation applies to function and branch coverage.
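For example, if automated tests execute 840 of 1,200 executable lines, line coverage is (840 / 1,200) × 100 = 70%.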

Instrumentation Tips

  • Use tools like JaCoCo, Istanbul, Cobertura, or Codecov in CI pipelines.
  • Monitor coverage at the PR level, on main branches, and in release builds.
  • Visualise trends over time and set thresholds for enforcement or alerts (see the sketch after this list).
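
A minimal sketch of such a coverage gate, assuming the pipeline publishes a Cobertura-style coverage.xml report (a format many coverage tools can emit or convert to); the threshold and file path below are illustrative, not prescribed:

```python
import sys
import xml.etree.ElementTree as ET

THRESHOLD = 80.0               # illustrative minimum line coverage, in percent
REPORT_PATH = "coverage.xml"   # assumed location of the Cobertura-style report

def main() -> int:
    # Cobertura-style reports expose overall line coverage as a
    # "line-rate" attribute (a fraction between 0 and 1) on the root
    # <coverage> element.
    root = ET.parse(REPORT_PATH).getroot()
    line_coverage = float(root.get("line-rate", "0")) * 100

    print(f"Line coverage: {line_coverage:.1f}% (threshold: {THRESHOLD:.1f}%)")
    if line_coverage < THRESHOLD:
        print("Coverage is below the agreed threshold; failing the build.")
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```

In practice a check like this runs as a dedicated CI step after the test job, so a drop below the agreed threshold fails the pipeline rather than going unnoticed.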

Why It Matters

  • Confidence: Indicates which parts of the code are exercised by tests and which remain untested.
  • Test Strategy: Highlights areas needing better test coverage or design changes.
  • Maintainability: High-coverage systems are easier to refactor safely.
  • Governance: Helps ensure teams meet agreed testing standards.

Best Practices

  • Focus on meaningful coverage—not just numbers. Prioritise critical paths and edge cases.
  • Include coverage checks in PR pipelines and quality gates.
  • Use mutation testing to validate the effectiveness of coverage.
  • Track branch and condition coverage, not just lines (see the sketch after this list).
  • Invest in coverage for legacy or high-change areas to reduce risk.
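
To illustrate the branch-versus-line point, here is a small hypothetical pytest-style example: a single test can execute every line of `clamp_non_negative` (100% line coverage) while never taking the path where the `if` condition is false, a gap that only branch coverage reports.

```python
def clamp_non_negative(x: int) -> int:
    """Hypothetical helper: negative inputs are clamped to zero."""
    result = x
    if x < 0:
        result = 0
    return result

def test_clamp_negative_input():
    # This single test executes every line of clamp_non_negative,
    # so line coverage reads 100%, yet the branch where the `if`
    # condition is false (x >= 0) is never taken. Branch coverage
    # is what reports that missing path.
    assert clamp_non_negative(-5) == 0
```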

Common Pitfalls

  • Chasing 100% coverage as a goal rather than focusing on value.
  • Writing tests that add coverage but don’t assert behaviour (illustrated after this list).
  • Ignoring untestable or unreachable code.
  • Relying on coverage alone without tracking test reliability or outcomes.
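
A hypothetical illustration of the assertion-free-test pitfall: both tests below produce identical coverage for `parse_port`, but only the second would fail if the behaviour regressed.

```python
def parse_port(value: str) -> int:
    """Hypothetical helper: parse a TCP port number from a string."""
    port = int(value)
    if not 0 < port < 65536:
        raise ValueError(f"port out of range: {port}")
    return port

def test_parse_port_no_assertion():
    # Executes the happy path of parse_port, so coverage increases,
    # but the test cannot fail if the function starts returning the
    # wrong value.
    parse_port("8080")

def test_parse_port_asserts_behaviour():
    # Identical coverage, but the assertion makes the test fail if
    # the behaviour regresses.
    assert parse_port("8080") == 8080
```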

Signals of Success

  • Coverage is stable or increasing on core systems and critical paths.
  • Code changes are accompanied by tests that meaningfully improve coverage.
  • Teams use coverage data in planning and retrospectives.
  • Fewer regressions and bugs slip past automated tests.

Related Measures

  • [[Automated Test Pass Rate]]
  • [[Change Failure Rate]]
  • [[Lead Time for Change]]
  • [[Test Suite Reliability]]
  • [[Code Quality Score]]

