After talking with product managers and designers, I’ve noticed a risky belief. In the name of being “objective”, we scramble for metrics that validate our design solutions. This approach is wrong: until a design ships and users interact with it, no data can prove it works.
Existing data only reflects current or past user experience. It cannot predict how users will respond to something new.
Data is Not a Crystal Ball
Data reveals what has already happened. It reviews the past and challenges incorrect assumptions, but it cannot predict future outcomes.
For example, citing the current conversion rate as a guarantee that a new onboarding flow will succeed overlooks a basic truth: design changes alter user behaviour.
Another pitfall is becoming strictly “data-driven”. In practice, this means reacting to numbers without understanding their underlying causes. If your interpretation is wrong, your strategy goes off course.
Instead, I lean toward these two approaches:
- Data-informed: Using data to assess the current landscape.
- Data-inspired: Synthesising multiple data points to map the problem space and spark new ideas.
In both approaches, data doesn’t provide the answer. It helps us to ask better questions and ground our discussion in reality.
Don’t Use Data to Back Up Bias
Another dangerous misuse of data is selective interpretation.
Teams cherry-pick metrics that support their favoured solution, ignore conflicting signals and mentally extrapolate conclusions that the data never actually supports.
Look at this example:
Users who adopted the new feature show higher retention, therefore the feature succeeds.
But the team never examined:
- Were these just the most engaged users?
- Why did no one else try the feature?
This is confirmation bias. Data should be a tool for investigation, not validation.
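The retention trap above is easy to reproduce. Here is a minimal simulation (hypothetical numbers) of a feature that has zero effect on retention, yet adopters still retain far better, because the same latent engagement drives both adoption and retention:

```python
import random

random.seed(42)

# Simulate 10,000 users. The feature has NO effect on retention;
# both adoption and retention are driven by a latent "engagement" trait.
users = []
for _ in range(10_000):
    engagement = random.random()            # latent trait in 0..1
    adopted = random.random() < engagement   # engaged users adopt more...
    retained = random.random() < engagement  # ...and retain more anyway
    users.append((adopted, retained))

def retention(group):
    return sum(retained for _, retained in group) / len(group)

adopters = [u for u in users if u[0]]
others = [u for u in users if not u[0]]

print(f"retention (adopters):     {retention(adopters):.2f}")
print(f"retention (non-adopters): {retention(others):.2f}")
# Adopters show markedly higher retention even though the feature
# did nothing — the gap is pure selection bias.
```

Only a comparison that controls for who chose to adopt (e.g. a randomised rollout) can separate the feature's effect from the users' pre-existing behaviour.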
If Data Can’t Justify a Design, What Can?
How do we confidently ship a design if we cannot “prove” it in advance with data?
Causal reasoning.
When someone challenges a design proposal, the goal isn’t to generate endless variations, but to clearly articulate the logic behind it:
The problem is A. Change B directly addresses A, and here’s why.
Defending a decision through reasoned logic, rather than “gut feeling” or “past data”, is what distinguishes senior designers.
The Shortest Causal Chain Wins
I use one principle when evaluating design solutions:
Prioritise the solution with the shortest causal chain and the fewest assumptions.
This means you:
- Address the root cause directly
- Reject solutions built on “stacked” assumptions
- Avoid “solution-first” thinking—no random feature dumping just to show progress
For example, if users can't find the next step, increasing the visibility of the CTA is more direct and measurable than adjusting colours, adding animations or rewriting all the copy.
Reinforcing Your Logic with Evidence
Once your causal logic is solid, strengthen it with supporting evidence:
- Qualitative research: Conduct interviews and user sessions to uncover intent and mental models.
- Visual evidence: Analyse heatmaps, click maps or eye-tracking to identify usability friction.
- Benchmarks: Refine flows based on how industry leaders handle similar frictions (draw from Baymard Institute or Mobbin).
- Business alignment: Validate that the solution directly impacts specific KPIs, not indirect or vague goals.
Iteration Beats Perfection
Design rarely succeeds on the first attempt. This is why risk management and iteration are crucial.
For large products:
- Deploy A/B tests or prototypes to catch fatal flaws early
- Release MVPs or test versions to a limited audience with clear expectations
When shipping for real:
- Maintain an MVP-first approach
- Constrain scope and avoid releasing everything at once
- Roll out changes incrementally and validate them one by one
Introducing too many changes increases risk and makes it impossible to tell what actually worked.
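An incremental rollout can be sketched in a few lines. This is a minimal, hypothetical gate (not a specific feature-flag library): hashing the user ID together with the flag name gives each user a stable bucket from 0 to 99, so the same user always sees the same variant, and ramping from 5% to 20% to 100% only ever adds users rather than reshuffling them.

```python
import hashlib

def is_enabled(flag: str, user_id: str, rollout_pct: int) -> bool:
    """Deterministically assign a user to a rollout bucket (0..99)."""
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return bucket < rollout_pct

# Ramp one change at a time and compare cohorts before widening.
user = "user-123"
print(is_enabled("new-onboarding", user, 5))    # small canary slice
print(is_enabled("new-onboarding", user, 100))  # full rollout: always True
```

Because the assignment is deterministic, you can widen the percentage gradually and attribute any metric shift to the single change being ramped.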
Only after launch does the real design work begin: actual user behaviour and fresh data put your assumptions to the test. Fast iteration means that even if you're wrong, you can correct course quickly.
The Bottom Line
In the end, product and design decisions are never made by data alone. In innovative spaces where no clear references exist, we rely on:
- Deep understanding of the problem
- Clear causal reasoning
- Professional intuition and strong product sense
We can’t test every possible solution. We choose the one that feels most inevitable from a logical standpoint, then use data not as proof, but as feedback to keep us honest.
