
Standard: AI work is recognised and celebrated as a team achievement

Purpose and Strategic Importance

This standard requires that AI delivery achievements — including shipped models, resolved incidents, failed experiments that produced learning, and cross-functional contributions — are recognised and celebrated at a team level rather than attributed solely to individual contributors or senior champions. It supports the policy of sharing AI knowledge openly across the organisation by creating a recognition culture that motivates sharing, collaboration, and collective ownership. Teams that feel their work is invisible or unattributed disengage, hoard knowledge, and ultimately deliver less.

Strategic Impact

  • Reinforces the collaborative, multi-disciplinary working culture that AI systems require to be built and sustained effectively
  • Motivates knowledge sharing by making the act of contributing to collective success visible and valued
  • Reduces the knowledge concentration risk that arises when only high-profile individual contributions are recognised
  • Creates positive momentum around AI delivery that sustains team energy through the inevitable difficult phases of complex projects
  • Signals to the wider organisation that AI work is meaningful, valued, and worthy of investment, supporting talent attraction and internal mobility

Risks of Not Having This Standard

  • Individual star culture emerges, concentrating knowledge and creating unhealthy competition rather than collaboration
  • Team members who contribute essential but less visible work (data quality, infrastructure, testing, ethics review) feel undervalued and disengage
  • Failed experiments that produced valuable learning are hidden rather than shared because failure is not a celebrated outcome
  • The organisation loses institutional knowledge when unrecognised contributors leave without sharing what they have learned
  • AI teams become isolated from the rest of the organisation when their work is not made visible and connected to wider outcomes

CMMI Maturity Model

Level 1 – Initial

  • People & Culture – Recognition for AI work is informal and inconsistent; visible outputs (demos, launches) attract attention while foundational contributions go unacknowledged
  • Process & Governance – No formal recognition process; celebration of AI achievements depends on individual manager behaviour
  • Technology & Tools – No tooling or channels dedicated to recognising AI team contributions
  • Measurement & Metrics – Team morale and recognition satisfaction are not measured; disengagement is discovered through attrition

Level 2 – Managed

  • People & Culture – Team leads make an effort to acknowledge AI contributions in team meetings and retrospectives
  • Process & Governance – AI project milestones include a team recognition moment; sprint reviews include a segment celebrating delivered value
  • Technology & Tools – A team communication channel is used to share AI wins and learning; team members are encouraged to post positive recognition
  • Measurement & Metrics – Recognition frequency is informally monitored by team leads; team satisfaction is discussed in retrospectives

Level 3 – Defined

  • People & Culture – Recognition norms are explicitly defined; contributions across all disciplines (data, engineering, ethics, product) are acknowledged, not just model performance results
  • Process & Governance – A formal recognition programme for AI work is in place; learning from failed experiments is celebrated alongside successful deployments
  • Technology & Tools – Showcases, demo days, and internal publications give AI teams a platform to share their work with the wider organisation
  • Measurement & Metrics – Team members rate recognition satisfaction in quarterly surveys; results are reviewed by leadership and inform management practice

Level 4 – Quantitatively Managed

  • People & Culture – Recognition culture is measured as a component of team health; low recognition scores trigger structured management interventions
  • Process & Governance – Recognition programme effectiveness is reviewed annually; formats are adapted based on team feedback
  • Technology & Tools – Peer recognition platforms track the frequency and breadth of AI team member recognition; data surfaces teams where recognition is concentrated or absent
  • Measurement & Metrics – Recognition breadth (proportion of team members receiving recognition per quarter), frequency, and correlation with team health and attrition metrics are tracked

Level 5 – Optimising

  • People & Culture – AI team achievements are shared externally through conference presentations, blog posts, and open source contributions, creating pride and motivation that recruitment alone cannot manufacture
  • Process & Governance – Recognition standards are continuously refined based on team feedback and emerging evidence on which recognition formats are most motivating for AI practitioners
  • Technology & Tools – AI knowledge sharing is embedded in the performance framework; sharing and recognition contributions are valued alongside technical delivery in career conversations
  • Measurement & Metrics – External recognition (publications, conference invitations, industry citations) is tracked as a signal of the organisation's AI culture and community standing

Key Measures

  • Percentage of AI team members who received at least one formal recognition per quarter
  • Team recognition satisfaction score measured in quarterly team health surveys
  • Number of AI learning shares (including failed experiments) presented at internal showcases or published per quarter
  • Voluntary AI team attrition rate, with recognition cited as a positive or negative factor in exit interview data
  • External recognition events (publications, conference presentations, open source contributions) per team per year
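The breadth and frequency measures above can be computed directly from peer-recognition platform exports. The sketch below is a minimal illustration, not a real platform integration: the event schema (a `recipient` field per recognition record) and the sample data are assumptions for demonstration only.

```python
from collections import defaultdict

def recognition_breadth(team_members, recognition_events):
    """Proportion of team members who received at least one
    recognition in the period covered by the events."""
    recognised = {e["recipient"] for e in recognition_events}
    return len(recognised & set(team_members)) / len(team_members)

def recognition_frequency(recognition_events):
    """Recognition count per recipient, to surface teams where
    recognition is concentrated on a few individuals."""
    counts = defaultdict(int)
    for e in recognition_events:
        counts[e["recipient"]] += 1
    return dict(counts)

# Illustrative data: names and field layout are hypothetical.
team = ["ana", "ben", "chi", "dev"]
events = [
    {"recipient": "ana", "reason": "model launch"},
    {"recipient": "ana", "reason": "incident resolution"},
    {"recipient": "ben", "reason": "failed-experiment write-up"},
]

print(recognition_breadth(team, events))   # → 0.5 (2 of 4 recognised)
print(recognition_frequency(events))       # → {'ana': 2, 'ben': 1}
```

A quarterly breadth well below 1.0, or a frequency table dominated by one or two names, is the Level 4 trigger for a structured management intervention.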
Associated Policies
Associated Practices
  • AI Retrospectives
  • Cross-Functional AI Team Design
  • AI Knowledge Sharing and Demos
  • Inner-Source for AI Components
