Skills vs Capability
This distinction matters, and most organisations get it wrong.
A skill is something someone knows how to do. A developer who has read three books on distributed systems and completed a course on Kafka has skills in distributed systems. They understand the concepts. They can explain eventual consistency and describe the trade-offs between at-least-once and exactly-once delivery.
Capability is the consistent, demonstrated application of skill in context under realistic conditions. A developer with distributed systems capability has built and operated distributed systems. They have debugged production failures in distributed environments. They have made the trade-off decisions under real constraints - time pressure, political pressure, incomplete information - and they have made them well, consistently, over time.
The distinction is not pedantic. Organisations that conflate skill with capability end up promoting people who have demonstrable knowledge but insufficient track record of applying it. These promotions often fail - not because the person is not intelligent or knowledgeable, but because knowledge and capability are genuinely different things, and the gap becomes visible at the point of responsibility.
A capability framework is, at its core, a description of demonstrated behaviours - patterns of action that are observable, verifiable, and consistent. Not "understands system design principles" but "consistently produces system designs that teams can implement, that age well, and that demonstrate clear consideration of non-functional requirements."
This matters for fairness as much as accuracy. When capability is described in terms of observable behaviour, evidence becomes possible. When it is described in terms of potential, character, or personality, the assessment defaults to manager intuition - which is not equally distributed across demographic groups.
What a Capability Framework Is (and Isn't)
A capability framework is a structured description of what consistent, effective performance looks like at each level in an engineering career.
It is not a checklist. A checklist implies binary states - you either have a capability or you don't. Real capability is not binary. Someone might demonstrate strong technical design capability in familiar domains and struggle to apply it in genuinely novel contexts. That is not a pass/fail situation - it is a development insight. A framework should create space for nuance, not collapse it.
It is not a performance scorecard. A performance scorecard answers the question "is this person doing their job?" A capability framework answers the question "what level are they operating at?" These are related but distinct questions. Someone can be doing their current job well (good performance) while not yet operating at the next level (not yet ready for promotion). Someone can be clearly operating at the next level (ready for promotion) while having a rough quarter (temporary performance dip). Conflating these two things leads to bad decisions in both directions.
It is not a job description. Job descriptions describe a role. Capability frameworks describe the person - specifically, what their consistent patterns of behaviour look like when they are operating effectively at a given level. The same SSE capability framework applies whether the engineer is on a platform team, a product team, or a data engineering team. The technical content differs; the level of capability does not.
It is not static. What good looks like at a given level evolves as the organisation's technical context evolves. What SSE-level capability meant in a company doing web scraping five years ago looks different from what it means in a company doing distributed data processing at scale today. The framework needs periodic review and calibration to stay meaningful.
What It Is
A capability framework is a shared language for talking about growth, contribution, and level. It makes the implicit explicit. It converts the question "is she senior enough?" from a gut-feel judgement into a structured conversation grounded in evidence.
When it is working, a capability framework does three things:
It enables honest development conversations. A manager can have a specific, evidence-based conversation about where someone is strong and where the gaps are. Not "you need to be more strategic" but "at the SSE level, we expect you to be identifying and driving significant technical improvements without being asked. In the past six months, I've seen you respond well to technical problems when they're raised, but I haven't seen you proactively identifying them. Let's talk about what's getting in the way."
It creates consistent calibration. When multiple managers assess engineers using the same framework, their assessments become comparable. Without a framework, one manager's "clearly ready for SSE" is another's "not quite there yet" - and the difference often has more to do with management style than engineer capability. With a framework, calibration conversations have traction.
It makes promotion evidence-based. A capability framework gives promoters and reviewers something concrete to point to. "Here are three specific examples of her operating at the SSE level. Here is the SSE capability description. The match is clear." This is a different conversation from "I think he's ready" - and it is a fairer one.
The Three Dimensions
Capability in engineering is not one-dimensional. A useful framework organises capabilities into dimensions that capture meaningfully different aspects of professional effectiveness.
Three dimensions that hold up across different organisational contexts:
Dimension 1: Technical Capability
Technical capability is what you can do with technology - your ability to design, build, and evolve software systems. This dimension is the most obvious and typically the easiest to assess, because technical work generates artefacts: code, designs, ADRs, technical proposals, system diagrams.
At early levels, technical capability is primarily about applying established patterns correctly in a defined context. At mid-levels, it is about adapting patterns to context and making good trade-off decisions. At senior levels, it is about defining technical direction, recognising where existing patterns are insufficient, and developing new approaches.
Technical capability is not just about writing code. It includes:
- Systems thinking - the ability to reason about a whole system, not just the component you are working on
- Design judgement - the ability to make good trade-offs between competing concerns
- Operational thinking - designing systems with observability, reliability, and operability in mind
- Technical communication - the ability to explain complex technical decisions clearly to different audiences
Dimension 2: Delivery Capability
Delivery capability is your ability to get things done, working with and through others, in a complex organisational environment. This dimension is frequently underweighted in capability frameworks because it feels less "technical" - but it is one of the most significant differentiators between engineers who create real impact and engineers who create impressive work that never ships.
At early levels, delivery capability is about executing defined work reliably - managing your own time, communicating blockers, shipping features that meet requirements. At mid-levels, it is about coordinating delivery across dependencies - managing technical dependencies, driving ambiguous pieces of work to resolution, navigating stakeholder complexity. At senior levels, it is about enabling delivery across teams - removing organisational friction, building practices that improve delivery for everyone, and making the hard calls that unblock progress.
Delivery capability includes:
- Planning and estimation - the ability to decompose complex work and track progress honestly
- Risk management - identifying delivery risks early and addressing them proactively
- Stakeholder communication - keeping the right people informed without creating noise
- Decision-making under uncertainty - moving forward effectively when information is incomplete
Dimension 3: People and Organisational Capability
People and organisational capability is your ability to work effectively with others and to contribute to the health of the organisation beyond your immediate team. This dimension is the one most likely to be absent from technically-focused capability frameworks, and its absence explains a lot of promotion failures.
At early levels, this dimension is about fitting in well - being collaborative, receptive to feedback, contributing positively to team culture. At mid-levels, it is about contributing actively - mentoring more junior engineers, giving useful feedback, helping build team practices. At senior levels, it is about shaping the organisation - building communities of practice, influencing engineering culture, developing other engineers' capability at scale.
People and organisational capability includes:
- Mentoring and coaching - the ability to develop others' capability through challenge and support
- Influence without authority - the ability to drive change through credibility and relationship, not positional power
- Feedback - both giving feedback that is honest and specific and receiving it openly
- Organisational awareness - understanding how the organisation works and navigating it effectively
How the Dimensions Evolve
The proportional emphasis on these dimensions shifts across levels in a predictable pattern:
| Level | Technical | Delivery | People/Org |
|---|---|---|---|
| GE | 70% | 25% | 5% |
| JSE | 60% | 30% | 10% |
| ISE | 50% | 30% | 20% |
| SSE | 40% | 30% | 30% |
| LSE | 35% | 25% | 40% |
| TTL | 30% | 30% | 40% |
| EM | 10% | 35% | 55% |
| HoE | 5% | 25% | 70% |
These are not rigid prescriptions - they are directional signals. The point is that career progression in engineering is not just about becoming technically more capable. It is about expanding the scope and nature of your contribution across all three dimensions.
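The emphasis table lends itself to being kept as data rather than prose, so it can be versioned and sanity-checked alongside the framework. A minimal sketch in Python (the dictionary layout and the `check_emphasis` helper are illustrative choices, not part of any standard):

```python
# Directional emphasis per level, as percentages, taken from the table above.
EMPHASIS = {
    "GE":  {"technical": 70, "delivery": 25, "people_org": 5},
    "JSE": {"technical": 60, "delivery": 30, "people_org": 10},
    "ISE": {"technical": 50, "delivery": 30, "people_org": 20},
    "SSE": {"technical": 40, "delivery": 30, "people_org": 30},
    "LSE": {"technical": 35, "delivery": 25, "people_org": 40},
    "TTL": {"technical": 30, "delivery": 30, "people_org": 40},
    "EM":  {"technical": 10, "delivery": 35, "people_org": 55},
    "HoE": {"technical": 5,  "delivery": 25, "people_org": 70},
}

def check_emphasis(emphasis: dict) -> None:
    """Check that each level's directional weights cover the full 100%."""
    for level, weights in emphasis.items():
        total = sum(weights.values())
        if total != 100:
            raise ValueError(f"{level}: weights sum to {total}, expected 100")

check_emphasis(EMPHASIS)
```

Treated this way, a revision that changes the balance at a level becomes a reviewable diff rather than a silently edited table.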
SFIA Alignment
SFIA - the Skills Framework for the Information Age - is an industry-standard competency framework for IT and digital professionals. It defines seven levels of responsibility and provides hundreds of skill descriptors that map to those levels. It is widely used in UK government and the public sector, and increasingly in commercial engineering organisations.
SFIA is worth understanding for two reasons. First, it provides a well-tested vocabulary for describing capability levels that has been validated across thousands of organisations. Second, alignment with SFIA makes your framework legible to the broader market - useful for benchmarking, hiring, and professional development conversations.
The Seven SFIA Levels
| SFIA Level | Label | Characteristic |
|---|---|---|
| 1 | Follow | Works under close supervision; applies defined methods |
| 2 | Assist | Works under routine supervision; uses judgement within defined limits |
| 3 | Apply | Works under general direction; exercises some autonomy |
| 4 | Enable | Works under broad direction; significant autonomy; influences others |
| 5 | Ensure & Advise | Accountable for significant outcomes; advises at senior level |
| 6 | Initiate & Influence | Sets strategy; influences at organisational level |
| 7 | Set Strategy | Leads at the highest organisational level; shapes industry direction |
Using SFIA Without Bureaucracy
SFIA can become bureaucracy very quickly. The full SFIA catalogue contains hundreds of skill entries, each with descriptions at multiple levels. Implementing the full catalogue is a project that typically consumes months, produces a document nobody reads, and gets abandoned during the next reorganisation.
The productive way to use SFIA is selective alignment - map your role levels to SFIA levels, use SFIA's level descriptors as a quality check on your own capability descriptions, and reference specific SFIA skills where they add precision to your framework without trying to cover the whole catalogue.
SFIA is most valuable as a calibration tool. When you are unsure whether your ISE-level description is calibrated correctly, compare it to SFIA Level 3 (Apply). When you are uncertain whether your SSE description is appropriately differentiated from ISE, check it against SFIA Level 4 (Enable). The SFIA level descriptors act as a reference point that prevents your framework from drifting in either direction.
Mapping Capabilities to Your Role Archetypes
The following table maps internal role levels to SFIA levels and describes the primary capability emphasis at each level. This is a starting point for calibration, not a definitive specification.
| Level | SFIA Level | Technical Capability | Delivery Capability | People/Org Capability |
|---|---|---|---|---|
| GE | 1-2 | Applies known patterns with guidance; learns from code review | Delivers defined tasks; communicates blockers promptly | Participates constructively; receptive to feedback |
| JSE | 2-3 | Implements features within established architecture; contributes to technical discussions | Delivers independently within a sprint; manages own blockers | Collaborates well; beginning to share knowledge with peers |
| ISE | 3-4 | Owns component design; makes good trade-off decisions; contributes to architectural conversations | Drives features from ambiguity to delivery; manages technical dependencies | Mentors graduates; gives substantive code review; influences team practices |
| SSE | 4-5 | Sets technical direction for significant system decisions; identifies and addresses systemic technical problems | Drives complex initiatives across team boundaries; manages senior stakeholders | Actively develops others; shapes team engineering culture; builds practices |
| LSE | 5-6 | Sets domain-level technical strategy; recognised technical authority; engages with industry practice | Drives organisational-level technical programmes; builds capability for sustained delivery | Develops SSEs; shapes engineering practice across multiple teams; builds communities |
| TTL | 4 | Maintains strong technical authority within team; makes key technical decisions | Manages team delivery rhythm; drives sprint and quarterly execution | Manages direct reports; runs ceremonies; develops team members' capability |
| EM | 4-5 | Provides technical oversight; does not make day-to-day technical decisions | Accountable for team delivery outcomes; manages cross-team dependencies | Develops engineers; handles performance management; builds team culture |
| HoE | 5-6 | Sets engineering strategy; accountable for technical direction across domain | Accountable for departmental delivery health | Develops EMs; shapes engineering culture at scale; represents engineering in leadership |
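The level-to-SFIA mapping above is also easy to hold as a small lookup, which makes the calibration check described earlier ("does our SSE description sit between Enable and Ensure & Advise?") mechanical rather than a matter of memory. A sketch assuming the ranges in the table (the function name and tuple encoding are illustrative; SFIA itself defines only the seven levels and their labels):

```python
# Internal level -> SFIA responsibility-level range, from the mapping table.
SFIA_MAPPING = {
    "GE":  (1, 2), "JSE": (2, 3), "ISE": (3, 4), "SSE": (4, 5),
    "LSE": (5, 6), "TTL": (4, 4), "EM":  (4, 5), "HoE": (5, 6),
}

# The seven SFIA levels of responsibility and their standard labels.
SFIA_LABELS = {
    1: "Follow", 2: "Assist", 3: "Apply", 4: "Enable",
    5: "Ensure & Advise", 6: "Initiate & Influence", 7: "Set Strategy",
}

def sfia_labels_for(level: str) -> list[str]:
    """Return the SFIA labels an internal level spans, for calibration notes."""
    low, high = SFIA_MAPPING[level]
    return [SFIA_LABELS[n] for n in range(low, high + 1)]
```

So `sfia_labels_for("SSE")` returns `["Enable", "Ensure & Advise"]` - exactly the range a well-calibrated SSE description should sit inside.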
How to Use the Framework in Practice
A capability framework that lives in a document and is referenced only at promotion time has failed. A capability framework that is part of the weekly conversation between manager and engineer is working.
In 1:1s
The framework should inform but not dominate 1:1 conversations. A useful monthly practice: pick one capability dimension and one piece of recent work, and use them as the basis for a structured conversation. Not "how are you doing against the framework?" but "let's look at the technical design you led last week - what was working well, where did you find it difficult, and what would you do differently?"
The goal is to make the framework a lens for reflecting on real work, not a scorecard to fill in.
In Promotion Conversations
When preparing a promotion case, the framework should be the structure. For each capability dimension, gather specific, concrete examples of the engineer operating at the target level. Not "she demonstrates strong delivery capability" but "she led the migration of our authentication service from a legacy system with three external dependencies, coordinated delivery across two teams, managed a six-week timeline, and shipped it with no downtime and one minor rollback that she had already rehearsed."
Evidence should be:
- Specific: a real thing that happened, not a general tendency
- Impact-linked: connected to an outcome that mattered
- Consistent: ideally, multiple examples showing a pattern rather than a single standout moment
- Independently verifiable: not just manager assessment, but something peers and stakeholders can corroborate
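For illustration only, the four evidence criteria can be mirrored in a lightweight structure when assembling a promotion case. Everything here - the class name, field names, and the two-example threshold for consistency - is a hypothetical convention, not part of the framework itself:

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    """One piece of promotion evidence; fields mirror the criteria above."""
    description: str   # specific: a real thing that happened
    impact: str        # impact-linked: the outcome that mattered
    corroborators: list[str] = field(default_factory=list)  # independently verifiable

def shows_consistency(examples: list[Evidence], minimum: int = 2) -> bool:
    """Consistency means a pattern of examples, not a single standout moment."""
    return len(examples) >= minimum
```

A promotion case then becomes a list of `Evidence` entries per capability dimension, and `shows_consistency` is a reminder that one impressive example does not make a pattern.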
In Calibration
Calibration is where the framework earns its keep. When managers discuss engineers across teams, the framework provides a common vocabulary. "I think she's operating at ISE level on technical capability, but her delivery capability is still JSE - she's not yet managing her own dependencies effectively" is a calibratable statement. "She seems ready for the next level" is not.
Calibration should be a structured conversation, not a ranking exercise. The goal is to identify where managers are applying the framework consistently and where there are divergences - and to understand why.
Common Failures
The 50-Competency Grid
If your capability framework has more than 12-15 individual capabilities in total (across all three dimensions, at all levels), it is too complex to use in practice. The design instinct that produces a 50-competency grid is understandable - thoroughness feels like rigour. But thoroughness that produces a document nobody engages with is not rigorous. It is comfort-seeking.
The test of a capability framework is not whether it is comprehensive. It is whether managers and engineers use it in real conversations about real work. Simplicity enables use. Complexity prevents it.
Competencies That Are the Same at Every Level
The single most common failure in capability framework design is writing competencies that are identical at every level with different adjectives attached. "Communicates clearly (GE) → Communicates effectively (JSE) → Communicates with impact (ISE) → Communicates strategically (SSE)" is not a progression framework. It is a thesaurus exercise.
A real progression in communication capability might look like:
- GE: Communicates status and blockers within the team promptly and accurately
- JSE: Writes clear technical documentation that teammates can use without clarification
- ISE: Adapts technical communication to different audiences; can explain complex decisions to non-technical stakeholders
- SSE: Shapes the team's technical narrative; communicates architectural direction in ways that build alignment
- LSE: Influences technical direction across the organisation through written and verbal communication; contributes to external technical discourse
Each level describes genuinely different behaviour, not the same behaviour described more impressively.
Capability Frameworks Used as Gotchas
A capability framework used to deny promotions or to justify performance-management decisions without genuine development support is being misused. The framework should be a development tool first. When a manager says "you haven't demonstrated SSE-level delivery capability," the follow-up should be: "Here's what that would look like, here's how we can create the opportunity for you to demonstrate it, and here's when we'll check in on progress." Not "so you're not getting promoted this cycle."
Frameworks That Don't Evolve
A capability framework created in 2019 for a team doing Java microservices does not accurately describe the capability required in 2024 for a team doing real-time data pipelines and ML feature engineering. Frameworks should be reviewed annually and updated when the technical context changes materially. The core level structure is stable; the specific technical capability descriptors are not.
Connection to Your Operating Model
The capability framework is the detail layer under the career pathway. The pathway says "this is what SSE looks like at the level of scope and autonomy." The capability framework says "this is specifically what SSE technical capability looks like, what SSE delivery capability looks like, and what SSE people/org capability looks like."
Together they answer the question: what does good look like at this level?
The capability framework connects directly to:
Career conversations - the framework provides the vocabulary for honest conversations about where someone is and where the gaps are. Without it, these conversations are vague. With it, they are specific and actionable.
Promotion and levelling - the framework is the evidence structure for promotion cases. It turns promotion from a judgement call into a structured assessment.
Learning and development - capability gaps identified through the framework become the basis for development plans. Not "develop your communication skills" but "build the ISE-level technical communication capability by leading the documentation for the next significant system change."
Role archetypes - the framework applies consistently across archetypes, even as the specific technical content varies. A senior backend engineer and a senior data engineer should both be operating at SSE-level capability - the capability framework describes what that means in a way that is archetype-agnostic, while the role archetypes describe what the technical content looks like in each specialisation.