The Research That Matters
In the fast-paced world of design agencies, user research is a familiar buzzword: frequently discussed, but too rarely delivered on. At Fusion Studios, we take a different approach. Our focus is on research that is deliberate, rigorous, and directly tied to decision-making.
A Case Study: Adonis
Our recent collaboration with Adonis exemplifies our philosophy. Operating within the complex domain of healthcare revenue cycle management, we set out to understand how industry leaders interpret, trust, and act on data—an essential factor in improving financial outcomes for healthcare providers.
Turning Insight into Action
Adonis was not just another client engagement; it was an opportunity to redefine how data-driven narratives support real-world decisions in healthcare. Over a two-month research phase, we concentrated on uncovering the realities leaders face—not the assumptions often projected onto them.
Moving Beyond “Research Theater”
Many research efforts fall into the trap of research theater: activities that mimic meaningful inquiry but rarely influence outcomes. Common examples include:
- Surveys shaped by leading or biased questions
- Usability tests designed to confirm existing beliefs
- Personas built on assumptions rather than evidence
- Journey maps that look polished but drive no action
Our approach intentionally challenges these patterns. We treat research as a mechanism for pressure-testing assumptions and guiding confident decisions—not as a box-checking exercise.
Research Designed to Drive Decisions
1. Problem Discovery
The foundation of effective research is a clear understanding of the problem space. Our discovery phase focused on identifying genuine needs and constraints by:
- Engaging with over a dozen revenue cycle management leaders
- Exploring their goals, obstacles, and daily workflows
- Observing the workarounds they rely on today
Methods: Contextual inquiry, diary studies, stakeholder interviews
A consistent theme emerged: existing tools fail to provide a cohesive view of critical data, forcing leaders to rely on multiple systems to piece together insights. This fragmentation pointed us toward opportunities around data flexibility, advanced visualization, and interpretable AI—capabilities our users are actively seeking.
2. Concept Validation
Before committing to development, we validate ideas against real user expectations by asking:
- Does this concept address a genuine problem?
- Is it understandable, credible, and desirable?
- What concerns or resistance might it surface?
Methods: Concept testing, fake-door tests, think-aloud sessions
We returned to stakeholders with a spectrum of concepts—from conservative to exploratory—to understand comfort levels and boundaries. This process enabled us to recommend a direction that balanced user needs with business objectives while addressing the core pain points uncovered during discovery.
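A fake-door test, for example, is often nothing more than an instrumented placeholder: the control exists, the feature does not, and every click is logged as a signal of demand. Below is a minimal sketch using Flask; the endpoint and the "Export to BI" concept are hypothetical illustrations, not features we tested with Adonis.

```python
from datetime import datetime, timezone

from flask import Flask, jsonify, request

app = Flask(__name__)
interest_log = []  # in practice, these events would go to your analytics store

@app.post("/api/export-to-bi")
def fake_door_export():
    # The feature does not exist yet; we only record that it was requested.
    interest_log.append({
        "user": request.headers.get("X-User-Id", "anonymous"),
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return jsonify(message="Export is coming soon. Thanks for your interest!"), 202
```

Comparing click-through on the placeholder against overall traffic gives a demand signal before a single line of feature code is written.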
3. Usability Evaluation
Prior to launch, we rigorously evaluate usability to ensure that:
- Core tasks can be completed efficiently
- Friction points are identified early
- Mental models align with interface behavior
- The experience compares favorably to alternatives
Methods: Moderated usability testing, comparative testing, heuristic evaluation
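Because usability rounds involve only a handful of participants, we report task metrics with confidence intervals rather than bare percentages. Here is a minimal sketch of the adjusted-Wald (Agresti-Coull) interval, a common recommendation for small-sample completion rates; the 7-of-8 result is hypothetical, not a figure from the Adonis engagement.

```python
import math

def completion_rate_ci(successes, n, z=1.96):
    """Adjusted-Wald confidence interval for a task completion rate,
    which behaves well at usability-test sample sizes."""
    n_adj = n + z**2
    p_adj = (successes + z**2 / 2) / n_adj
    half = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - half), min(1.0, p_adj + half)

# Hypothetical moderated-test result: 7 of 8 participants completed the core task
low, high = completion_rate_ci(7, 8)
print(f"completion rate {7/8:.0%}, 95% CI {low:.0%} to {high:.0%}")
```

The wide interval (roughly 51% to 100%) is the point: a small sample can flag friction, but it rarely justifies precise percentage claims.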
4. Outcome Measurement
Research does not end at launch. Measuring real-world impact allows us to assess:
- Whether behaviors change as intended
- If user goals are being met
- Which unexpected patterns emerge
- Where future improvements should focus
Methods: Product analytics, A/B testing, longitudinal studies
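When an A/B test is the right instrument, the analysis itself is straightforward. The sketch below runs a two-proportion z-test on conversion counts using only the standard library; the traffic figures are invented for illustration.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for comparing conversion rates in an A/B test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical traffic: 120/1000 completions on the control, 150/1000 on the variant
z, p = two_proportion_z_test(120, 1000, 150, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real difference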
Choosing the Right Sample Size
Effective research is defined less by participant volume and more by decision clarity. For qualitative insights, small samples (typically 5–8 participants) are sufficient to uncover meaningful patterns. Quantitative validation, by contrast, requires samples large enough to yield statistically significant results.
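Both rules of thumb can be made concrete. The qualitative figure follows the widely cited Nielsen-Landauer model, where the chance of observing an issue at least once grows as 1 - (1 - p)^n; the quantitative one is the standard sample-size formula for estimating a proportion. A minimal sketch, with illustrative inputs:

```python
import math

def qualitative_coverage(p_occurrence, n_participants):
    """Probability of observing an issue at least once across n sessions,
    assuming each participant independently encounters it with probability p."""
    return 1 - (1 - p_occurrence) ** n_participants

def survey_sample_size(margin_of_error, confidence_z=1.96, p=0.5):
    """Sample size for estimating a proportion within a given margin of error;
    p = 0.5 is the most conservative assumption."""
    return math.ceil(confidence_z**2 * p * (1 - p) / margin_of_error**2)

# With 5 participants, an issue affecting 31% of users is seen ~84% of the time
print(f"{qualitative_coverage(0.31, 5):.0%}")  # -> 84%
# Estimating a proportion within ±5% at 95% confidence needs ~385 responses
print(survey_sample_size(0.05))                # -> 385
```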
Recruiting the Right Participants
The quality of insights depends on who you speak with. We prioritize:
- Screening participants based on real behaviors, not titles alone
- Including edge cases and power users
- Building long-term relationships with participant panels
We worked closely with Adonis to ensure thoughtful introductions and warm handoffs, aiming to establish durable relationships rather than extract one-off feedback. Given the complexity of the product, a single research round would have been insufficient. By respecting participants’ time, accommodating demanding schedules, and committing to ongoing engagement, we strengthened both the quality of insights and the eventual product outcomes.
From Data to Direction
Raw research only becomes valuable when it is translated into action. Our synthesis process includes:
- Affinity mapping to surface patterns
- Journey mapping to visualize experiences
- Clear, decision-ready insight statements
- Recommendations explicitly tied to evidence
We involve stakeholders directly in synthesis to encourage shared understanding and ownership of the insights.
Embedding Research into the Product Process
Research is most effective when it is continuous rather than episodic. Our cadence typically includes:
- Weekly: Lightweight usability testing
- Monthly: Deeper investigation into targeted questions
- Quarterly: Strategic user and market research
- Ongoing: Analytics review and customer feedback
We also set an internal standard to engage with clients at least twice a week. This discipline helps prevent design teams from losing context or becoming disconnected from reality. Regular user touchpoints, even brief ones, keep evolving needs visible and assumptions in check.
Common Pitfalls to Avoid
- Asking users to design solutions: Users are experts in their problems, not in product design.
- Testing too late: Early research prevents costly downstream corrections.
- Confirmation bias: Actively seek evidence that disproves assumptions.
- Overreliance on a single method: Diverse approaches produce more resilient insights.
Making Research Work
When executed with intent, research reduces risk, improves outcomes, and builds confidence across teams. The key is clarity: define the decisions you need to make, select the right methods, and commit to acting on what you learn.
That is the research that matters.