
The Icarus Pattern in UX Research

January 26, 2026

What is the return on investment (ROI) for UX research? For every dollar that organizations spend on UX research, the return is roughly $100. However, I believe that, in recent months, UX research has been following an Icarus pattern: rising quickly into relevance, then losing altitude just as fast.

UX research is losing traction not because it lacks value, but because many organizations misjudge what it takes to do it well and have been unable to define its true return on investment.

In organizations where UX maturity is low, leaders admire the idea of being user-centered—mostly based on sales feedback or market research—but they often underestimate the cost, the time, and the operational maturity that is necessary to sustain real UX research. When these assumptions collapse, UX research begins to seem optional, and its relevance fades.


Whether out of habit or overconfidence, teams often base their assumptions on a single source of insights, such as A/B testing, internal feedback, or a single survey. For this reason, some organizations stop seeing UX research as a disciplined practice and start treating it as a collection of interchangeable activities. Thus, stakeholders think UX research is one thing, but the reality is different. They expect speed and low effort and want decisions based on strong opinions and a great deal of easy-to-access information. We need to remember that UX research requires space, criteria, recruitment, and planning.

This is also why UX research sometimes takes the form of a checklist. Teams run a quick evaluation, tick off a few heuristics, and assume they understand the problem. I explored this phenomenon in one of my frameworks: the first step is the checklist; the second step integrates an extra level of knowledge, asking what the checklist missed: the habits, the shortcuts, the unexpected. Good evaluation is interpretation, not inspection.

Another reason why UX research is losing traction is synthetic data: from large-language model (LLM) redundancy to real usability-test participants who perform tests only for the incentive rather than truly engaging with the test scenarios. Both cases distort the research signal and convince teams that they already understand their users without actually talking to them. Good UX researchers should be able to understand what data they can use, even if it is slightly flawed. As one disengaged test participant recently said after clicking quickly through a prototype and moments before failing the core task twice: “Everything is fine. No issues.”

Organizations fail to accept the uncertainty that is necessary to find truths, and UX research loses ground because of their short-sighted certainty. These forces—the substitution of method with the demand for personal, opinionated certainty, and the rejection of necessary granularity—create a hostile environment for high-quality research work. We can trace the decline of UX research back to two key breakdowns.


When Organizations Misjudge UX Research

UX maturity can help us define the real cost of UX research. UX researchers in an organization with low UX maturity must split their effort between their day-to-day activities and getting buy-in across different teams—even those that are not strictly connected with UX research. At some point, everything will be connected.

Recruiting qualified research participants, matching Web audience segments with market research, navigating privacy rules for usability tests in which participants need to fill out a form or complete a checkout process, and continuous discovery with real users all require time, coordination, and budget. Teams often assume they can run meaningful studies quickly and cheaply, only to discover that high-quality research demands more resources than they expected.

This misjudgment becomes immediately clear the moment we deliver a quick, bullet-point summary that identifies some issues but leaves priorities and impacts difficult to quantify. Although we need to speak the same language as our stakeholders to explain our findings, the organization expects speed. Sometimes inefficiency or a lack of time can impair our ability to build the correct research methodology. One of my stakeholders captured this mindset perfectly: “Can’t we just find ten people internally and run this tomorrow? They’re users, too.” Leadership might feel that “UX research would slow us down,” but the real blocker is the lack of UX maturity, not the method.

When organizations underestimate this operational reality, they start questioning the value of the UX research role altogether. They focus on delivery pressures instead of learning. Plus, the idea of UX research as a cost center becomes an excuse to cut research first.

When Speed Replaces Depth

Data is one of the keystones of UX research, but not the only one. The decline of UX research stems from the belief that faster data gathering, or analyzing only the what, can replace real understanding: the why.

Synthetic users, or behavioral simulations that artificial intelligence (AI) generates, produce coherent, even fluent answers, but they simulate patterns instead of revealing them. While they can be trained, they cannot fully replicate what an actual user is feeling or thinking when using an app, the user’s intent, or how he or she landed on a specific page on a Web site.

Some research participants offer rehearsed or minimal responses because they join studies for the incentive rather than out of genuine engagement. They might reuse the wording of the questions in their replies, offer generic motivations, and spend less time on task than is typical. I recall a participant assuring the team, “I’d definitely use this feature every week,” despite navigating around it every time.
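The signals above—answers that echo the question wording and time on task well below the norm—can be sketched as a simple screening heuristic. The field names and thresholds below are hypothetical illustrations, not validated cutoffs, and any flagged session should still get human review.

```python
from dataclasses import dataclass

@dataclass
class SessionSignals:
    """Hypothetical per-participant signals from a usability study."""
    answer_question_overlap: float  # share of answer words copied from the question, 0-1
    median_time_on_task_s: float    # participant's median time on task, in seconds
    panel_median_time_s: float      # panel-wide median time for the same tasks

def engagement_flags(s: SessionSignals) -> list[str]:
    """Flag sessions whose answers or timings suggest incentive-driven participation.

    Thresholds are illustrative assumptions, not validated cutoffs.
    """
    flags = []
    if s.answer_question_overlap > 0.6:
        flags.append("answers echo question wording")
    if s.median_time_on_task_s < 0.5 * s.panel_median_time_s:
        flags.append("time on task well below panel median")
    return flags

# A participant who clicks through quickly and parrots the questions:
suspect = SessionSignals(answer_question_overlap=0.8,
                         median_time_on_task_s=12.0,
                         panel_median_time_s=45.0)
print(engagement_flags(suspect))
# → ['answers echo question wording', 'time on task well below panel median']
```

A heuristic like this can only prioritize which sessions to re-watch; it cannot decide on its own whether a participant was genuinely engaged.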

The use of analytics or A/B testing alone reinforces this illusion by showing what happens, but not why. Teams then interpret the findings through personal intuition rather than through evidence of having solved specific problems. Many organizations and vendors promote these methods as solutions for attaining UX maturity or ways of improving the user journey without taking a holistic approach. During a recent industry event, a stakeholder described User Experience, without UX research, as “just A/B testing.” This was more than a simple misunderstanding; it reflected how compressed the definition of UX research has become.

We need triangulation: a balanced combination of quantitative, narrative, and contextual evidence. Triangulation prevents premature certainty by revealing biases and friction between perspectives. For example, “I didn’t drop off because it was long. I dropped off because I didn’t trust the information.” This is a fundamental reason to focus the team’s attention: when analytics show a drop-off at a specific step, UX research can provide a basis for challenging the team’s opinions.

Another participant captured this invisible hesitation clearly: “After filtering, I was 80% confident I had found the right product, but 20% of me still thought there might be a better one I'd missed.” When analytics show that a user interacted with a filter, or a high scroll rate suggests high engagement, we call this a successful interaction. However, analytics cannot capture the missing confidence: the 20% of doubt that the user might still have missed the right product. Such moments of broken confidence never appear in dashboards; they surface only when UX researchers listen from different angles.
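One lightweight way to operationalize this kind of triangulation is to line up the analytics view of a task with participants’ self-reported confidence and surface the disagreements. The task names, scores, and gap threshold below are hypothetical; the point is only the shape of the comparison, not the specific numbers.

```python
# Hypothetical data: analytics completion rate vs. mean post-task confidence, both 0-1.
analytics_success = {"apply-filter": 0.92, "checkout": 0.55}
reported_confidence = {"apply-filter": 0.65, "checkout": 0.50}

def triangulation_gaps(success: dict, confidence: dict, gap: float = 0.2) -> list:
    """Return tasks where the quantitative signal looks healthy
    but qualitative confidence lags far behind it."""
    return [task for task, rate in success.items()
            if rate - confidence.get(task, rate) > gap]

print(triangulation_gaps(analytics_success, reported_confidence))
# → ['apply-filter']: the filter task 'succeeds' in analytics,
# yet leaves users unsure they found the right product.
```

The output is not a verdict; it is a shortlist of places where the dashboard and the interviews disagree, which is exactly where a researcher should go listen next.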

Triangulation provides a clear foundation that counters teams’ tendency to listen to the loudest voice in the room. “I already know what our users want.” “I have been in this business for decades.” Everyone experiences such scenarios at least once in their professional career. We need to turn our meetings with senior stakeholders into learning moments. We need to separate strategic vision and operational delivery from people’s strong personal convictions.

To avoid the growing pattern of a decline in UX research, we need to provide a direct line of learning between the organization and users. If organizations stop listening, they stop learning. When teams treat UX research as optional, as merely “working for the greater good,” rather than as a necessity, stakeholders become yes-men.

Keeping UX Research Visible

No, we are not like Icarus. However, once we analyze these structural causes of the decline of UX research, the metaphor becomes clearer. Icarus rose quickly on wings made of feathers and beeswax. Aiming to reach the sun, he was alone on his journey. His wings melted, and he disappeared into the water while the world carried on.

Bruegel’s Landscape with the Fall of Icarus captures this precisely. A ploughman works his field. A shepherd looks into the sky. A ship moves across the sea. In the corner, almost invisible, Icarus sinks. The fall was real, but those in the surrounding landscape paid no attention.

Even if organizations behave in the same way, with their focus on delivery, velocity, optimization, and the next operational target, we UX researchers must not build our defenses on overconfidence. Our journey is not a one-man show. We need to connect with our stakeholders and improve our capacity to demonstrate the ROI of UX research across the entire team—even if we don’t yet fully understand it.

UX research does not lose its relevance through a dramatic moment such as a layoff due to redundancies or merging teams. It loses its relevance when teams stop noticing what UX research reveals and when biased readings of data replace triangulated understanding. If UX research wants to regain its place, the path is clear: shift your perspectives and restore depth.

The fall of Icarus becomes avoidable only when someone decides to look at reality clearly. UX research stays relevant when we move with the organization, not behind it. We add value when we notice what others miss, connect what others treat as separate, and slow down when everyone else wants to speed up. When we work in this way, UX research provides clarity and adds value, rather than sinking unnoticed in the background.

Senior UX Researcher at NN/g UXMC

London, UK

Alessandro Zulberti

Alessandro has more than a decade of experience conducting UX research in ecommerce and digital product environments. He works at the intersection of research, design, and product definition to help teams make evidence-based decisions under real commercial constraints, translating behavioral insights into clear direction for product teams. Alessandro works collaboratively with design, product, marketing, and engineering to prioritize the framing of research questions and ensure that UX research informs the prioritization of features. He also conducts AI-assisted behavior analysis to derive agent-centered competitor insights and detect sentiment and patterns. Alessandro mentors junior researchers and designers on research planning, synthesis, and stakeholder communication.
