Practice: AI Use Case Discovery

Purpose and Strategic Importance

Not every problem is an AI problem. The history of enterprise AI is littered with initiatives that applied machine learning to problems that could have been solved more reliably, cheaply, and quickly with simpler approaches — or that were not real problems at all, but solutions in search of a problem. AI Use Case Discovery is the practice of identifying and prioritising AI opportunities based on genuine business and user need, feasibility, and strategic fit — before any investment in development is made.

Good use case discovery also builds the internal credibility of AI teams. Teams that repeatedly deliver AI systems solving real problems — because they spent time validating the problem before building the solution — establish a track record of impact. Teams that skip discovery and build AI solutions for hypothetical problems burn credibility and budget, making the next legitimate AI initiative harder to fund and resource.


Description of the Practice

  • Engages business and domain stakeholders in structured discovery conversations to surface high-value problems where AI could provide a meaningful advantage over current approaches.
  • Evaluates candidate use cases against a consistent prioritisation framework considering business value, feasibility, data availability, risk, and strategic alignment.
  • Produces a ranked use case backlog that is visible to leadership and regularly refreshed as business priorities and AI capability evolve.
  • Documents the specific problem statement, current state, and hypothesis for AI improvement for each prioritised use case before any development work begins.
  • Distinguishes clearly between use cases where AI provides genuine advantage and those where simpler automated solutions — rules, heuristics, or conventional software — would be more appropriate.

How to Practise It (Playbook)

1. Getting Started

  • Run a discovery workshop with stakeholders from key business domains to identify their most significant pain points, manual processes, and decision-making bottlenecks — sources of potential AI value.
  • Evaluate initial candidate use cases against a lightweight prioritisation framework: estimated business impact, data availability, technical feasibility, and risk level.
  • Select two or three high-confidence use cases for deeper investigation, rather than committing to a large portfolio that exceeds the team's capacity to validate and deliver.
  • Document each candidate use case with a clear problem statement, current solution, and hypothesis for how AI would improve it — avoiding vague statements about applying AI to a domain.
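The lightweight prioritisation framework above can be sketched as a simple weighted-scoring model. This is a minimal illustration, not part of the practice itself: the `CandidateUseCase` fields, the criterion weights, and the example entries are all assumptions, and a real team would calibrate the weights and scoring scale to its own context.

```python
from dataclasses import dataclass

# Illustrative criterion weights -- a real team would calibrate these.
WEIGHTS = {
    "business_impact": 0.4,
    "data_availability": 0.2,
    "technical_feasibility": 0.2,
    "risk": 0.2,  # scored so that 5 = lowest risk
}

@dataclass
class CandidateUseCase:
    """One documented candidate: problem, current state, and AI hypothesis."""
    name: str
    problem_statement: str
    current_solution: str
    ai_hypothesis: str
    scores: dict  # criterion name -> score from 1 (weak) to 5 (strong)

    def priority(self) -> float:
        # Weighted sum across the four prioritisation criteria.
        return sum(WEIGHTS[c] * self.scores[c] for c in WEIGHTS)

# Hypothetical candidates, including one framed too vaguely to score well.
candidates = [
    CandidateUseCase(
        name="Invoice routing",
        problem_statement="Finance manually routes a high volume of invoices",
        current_solution="Manual review queue",
        ai_hypothesis="A classifier routes routine invoices automatically",
        scores={"business_impact": 4, "data_availability": 5,
                "technical_feasibility": 4, "risk": 4},
    ),
    CandidateUseCase(
        name="Apply AI to customer service",
        problem_statement="Too vague to investigate honestly",
        current_solution="Unknown",
        ai_hypothesis="None stated",
        scores={"business_impact": 2, "data_availability": 1,
                "technical_feasibility": 2, "risk": 2},
    ),
]

# Rank the backlog, highest priority first.
for uc in sorted(candidates, key=lambda c: c.priority(), reverse=True):
    print(f"{uc.priority():.2f}  {uc.name}")
```

A weighted sum is deliberately crude; its value is less the number itself than the conversation it forces about why one criterion outranks another.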

2. Scaling and Maturing

  • Build a structured use case pipeline with defined stages — ideation, qualification, deep investigation, validation, and development — so that the status of every candidate use case is visible and trackable.
  • Develop capability maps that show which AI capabilities (classification, generation, recommendation, anomaly detection) address which classes of business problem, enabling faster qualification of new candidates.
  • Establish regular use case review sessions that include both technical and business stakeholders, ensuring the use case backlog reflects current business priorities and AI team capacity.
  • Track the success rate of use cases that proceed to development, using outcome data to improve the quality of discovery and prioritisation over time.
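The staged pipeline and success-rate tracking described above can be sketched as follows. The stage names come from the bullet list; the example use cases, outcomes, and helper functions (`stage_counts`, `development_success_rate`) are hypothetical illustrations, assuming outcomes are recorded once development concludes.

```python
from enum import Enum, auto
from collections import Counter

class Stage(Enum):
    """The pipeline stages named in the practice, in order."""
    IDEATION = auto()
    QUALIFICATION = auto()
    DEEP_INVESTIGATION = auto()
    VALIDATION = auto()
    DEVELOPMENT = auto()

# Hypothetical pipeline: (use case, current stage, outcome once known).
pipeline = [
    ("Invoice routing", Stage.DEVELOPMENT, "succeeded"),
    ("Churn prediction", Stage.VALIDATION, None),
    ("Ticket summarisation", Stage.DEVELOPMENT, "abandoned"),
    ("Demand forecasting", Stage.QUALIFICATION, None),
]

def stage_counts(pipeline):
    """How many candidates sit at each stage -- the visibility view."""
    return Counter(stage for _, stage, _ in pipeline)

def development_success_rate(pipeline):
    """Share of developed use cases that succeeded; None until outcomes exist."""
    outcomes = [o for _, s, o in pipeline
                if s is Stage.DEVELOPMENT and o is not None]
    return outcomes.count("succeeded") / len(outcomes) if outcomes else None

print(stage_counts(pipeline))
print(development_success_rate(pipeline))
```

The success rate is the feedback loop: if it stays low, the fault usually lies upstream in qualification and validation, not in the development teams.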

3. Team Behaviours to Encourage

  • Start with the problem, not the technology — every discovery conversation should begin with "what is the problem?" not "where can we apply AI?".
  • Be willing to recommend non-AI solutions when they are more appropriate — the team's credibility depends on being honest about fit, not on always recommending AI.
  • Involve end users — the people who will work with or be affected by the AI system — in use case discovery, not just the sponsors who commission AI work.
  • Document discovery decisions including use cases that were considered and rejected and the reasons why, building institutional knowledge of the decision-making process.

4. Watch Out For…

  • Use case discovery driven by technology availability — "we have this AI capability, what can we use it for?" — rather than by business need.
  • Discovery processes that are too short, producing a list of ideas rather than genuinely investigated and validated use cases with clear problem statements and success criteria.
  • Use cases that are framed at too high a level — "apply AI to customer service" — without identifying the specific decision, prediction, or recommendation that AI will improve.
  • Sponsor bias in discovery that systematically surfaces use cases that benefit powerful stakeholders while missing opportunities for AI to improve outcomes for frontline workers or end customers.

5. Signals of Success

  • The AI use case backlog is a living, prioritised list that is reviewed and updated regularly, not a static wishlist created at the outset of the AI programme.
  • A meaningful proportion of use cases investigated during discovery are rejected or deferred, demonstrating that discovery is a genuine quality gate rather than a rubber-stamp process.
  • Use cases that proceed to development are clearly articulated with specific problem statements, success criteria, and data availability assessments — not vague aspirations.
  • Business stakeholders actively participate in discovery processes and report that the process helps them articulate and prioritise their AI needs more clearly.
  • The AI systems developed through this discovery process have measurably higher success rates than previous AI initiatives, validating the investment in upfront discovery.

Associated Standards
  • AI use cases are selected based on validated business impact
  • AI investment decisions are informed by value realisation data
  • AI systems deliver measurable improvement over non-AI alternatives
