Industry Research

Your engineering isn't the problem. The problem is the problem.

We're building the first industry-specific dataset on how product development teams in high-risk sectors validate problem definitions before committing resource. This isn't a marketing exercise; it's a real research gap that published studies haven't touched. Your experience fills it.

  • 3 minutes, 10 questions
  • Anonymised findings shared back to you
  • See how your sector compares
95% of new products fail to meet their commercial targets
£88M wasted per £1B spent on projects with poorly defined problems
42% of startup failures attributed to building something nobody needs
0 standalone tools exist to score a problem definition before committing resource

These numbers come from independent research (Christensen/MIT, CB Insights, PMI). But they're domain-general. Nobody has measured this in high-risk industries specifically: the ones where getting it wrong costs lives, certifications, or millions. Your experience fills that gap.

10 Questions

What does your organisation actually do when the problem is wrong?

Every question here targets a specific gap in published research. No padding, no filler.

Q1 PERSONA VALIDATION

What best describes your role?

Q2 PROBLEM EXISTENCE

In the last 3 years, have you seen a product programme fail or pivot significantly because it was aimed at the wrong problem?

Q3 DETECTION PATTERN

When did the "wrong problem" realisation hit?

Select all that apply

Q4 CURRENT WORKAROUNDS

How does your organisation currently validate problem definitions before committing resource?

Select all that apply

Q5 GOVERNANCE EFFECTIVENESS

How confident are you that your current process catches a bad problem definition before significant resource is committed?

Q6 SPEED THRESHOLD

If an external audit could score your problem definition against evidence and flag gaps, what's the longest you'd wait for it?

Q7 WILLINGNESS TO PAY

Would you pay for a standalone problem definition audit, separate from a full design consultancy engagement?

Q8 DATA TRUST

Would you share internal project data (problem briefs, requirements docs) with an external scoring tool?

Q9 LANGUAGE FIT

If you were searching for help validating a problem definition, which term would you search for?

Pick up to 2

Q10 OPEN SIGNAL

What's the most expensive "wrong problem" mistake you've witnessed?

Optional. One or two sentences is perfect. This is the evidence we can't find in published research.

OPTIONAL

Get the findings back

Leave your email to receive the anonymised research report when it's published. We won't use it for anything else.

Survey responses are anonymised and used for research purposes only (lawful basis: legitimate interest). Contact details (name, company, job title, LinkedIn, email) are collected only when you opt in and used solely to deliver findings, send product updates, or vet beta access, whichever you select (lawful basis: consent). Beta access is vetted individually; we may decline applications without giving a reason. All data is stored securely by ProblemSmith Ltd (UK). Survey data is deleted after 12 months; contact details are retained until you withdraw consent. To request deletion or withdraw consent at any time, email [email protected]. Full privacy notice.

Takes 30 seconds. All responses anonymised.

Noted.

Your experience just filled a gap that published research hasn't touched. We'll send the anonymised findings once there's enough data to draw real conclusions.

Back to problemsmith.com

The findings come back to you

This is not a marketing exercise dressed as research. We're building a dataset that does not exist yet: how product design, development, and engineering teams in high-risk industries actually handle problem definition. Everyone who contributes gets the results.

  • Anonymised benchmark data: how your sector compares on problem definition governance
  • The most common "wrong problem" patterns and when they're typically caught
  • What workarounds teams are using and how effective they report them to be
  • The gap map: where current processes fail, backed by your industry's own data

This research is led by Lee Smith, who has 24 years in high-risk product development across industries where the cost of getting things wrong is high. It feeds a broader investigation into why product programmes fail at the problem definition stage, and what structured tools could prevent those failures. Learn more about Problemsmith.