
Measuring the Value of UX

February 6, 2018

What do I mean by value? The value of a UX design or digital project equates with the impact the project makes.

“Impact is the tangible change that research provides—be it in policy, business, industry, or society.”—Oxford University e-Research Centre

Given the ubiquity of technology in our lives, there has, until recently, been a surprising lack of work to assess the digital impact of User Experience. To determine the success of a Web site, we turn to data on conversions and analytics that indicate user behavior, but what about the wider impact? What is its lasting value? This kind of value is impossible to describe in purely quantitative terms.

In this article, I’ll present a robust approach to measuring the digital impact of projects, using a framework that I developed: the Digital Impact Framework (DIF). The DIF can answer specific questions about a project’s impact. It is also a strategic tool that decision makers can use in planning and prioritizing the goals and objectives that will actually generate the identified impacts.


Ideally, you should use the DIF at the outset of a project to help clients shape the value that a project will deliver and, thus, its impacts—both those internal to the organization and those external to it.

Drivers of Digital Impact You Can Measure

The DIF considers four drivers of digital impact:

  1. Economic—You can assess the regional and national impacts of a project or program of work in terms of its gross value added (GVA).
  2. Social—You can understand the human, societal impacts of an intervention—such as greater well-being, improved infrastructure, or a cleaner environment.
  3. Process—You can analyze the ways in which organizational processes will be more effective and improve morale.
  4. Innovation—You can measure spin-off ideas and novel products that you’ve developed as a result of your project or program.

By including value drivers that are internal to the organization, as well as those that are external, the DIF overcomes senior management’s tendency toward fundamental attribution error (FAE)—the human tendency to overemphasize personal characteristics and ignore situational factors when judging people’s behavior. This cognitive bias occurs when organizations regard internal successes positively, but underestimate the impact of external factors.

The Foundations of the DIF

“One way of removing silos and focusing on the entire business is to leverage acquisition, behavior, and outcome metrics. This will allow, nay force, our senior business leaders to see the complete picture, see more of cause and effect, and create incentives for the disparate teams to work together.”—Avinash Kaushik, Digital Marketing Evangelist at Google

Evaluative approaches are an emerging area of user research for digital products and services. But because of the scope of these projects and the ever-changing digital landscape, formulating an assessment framework is quite tricky. To create the DIF, I drew from an array of assessments, with particular emphasis on the gold standard for governments—the logic model / theory of change approach.

A logic model graphically represents a project’s components. Creating a logic model helps stakeholders clearly identify measurable links between what they put into a project—the inputs; what they do—the activities; and what the results are—the outcomes and impacts. The theory of change links outcomes and activities to explain how and why an organization expects a desired change to come about. Drawing these two processes together supports a robust, causally driven series of metrics that show impacts.
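
To make that chain concrete, here is a minimal sketch, in Python, of how the components of a logic model might be recorded so each impact can be traced back through outcomes, outputs, activities, and inputs. The class, field names, and sample values are illustrative assumptions, not part of the DIF itself.

from dataclasses import dataclass, field
from typing import List

@dataclass
class LogicModel:
    # What the project puts in, what it does, and the chain of results it produces.
    inputs: List[str] = field(default_factory=list)       # resources invested
    activities: List[str] = field(default_factory=list)   # work carried out
    outputs: List[str] = field(default_factory=list)      # immediate, measurable results
    outcomes: List[str] = field(default_factory=list)     # medium-term changes
    impacts: List[str] = field(default_factory=list)      # lasting economic, social, process, or innovation change

# Hypothetical example, loosely echoing the chatbot scenario discussed later.
chatbot_model = LogicModel(
    inputs=["Budget, staff time, and resources to create the chatbot"],
    activities=["Research, design, development, training, and launch monitoring"],
    outputs=["Users completing tasks via the chatbot"],
    outcomes=["Task completion up 25%"],
    impacts=["Economic: regional GVA generated"],
)

Recording the links explicitly in this way is what lets you argue, later, that a measured impact was caused by the project rather than merely coinciding with it.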

The DIF is a hybrid framework that works well in enabling an agency like Nomensa—the company I work for—to evaluate the success of our client work internally and understand how immediate UX outputs have concomitant outcomes and impacts. It also constitutes a service that we can offer our clients to help them approach their own impact goals strategically—whether those goals are for the immediate future or the longer term.

Next, let’s look at some kinds of organizations for which the DIF works well, how you can implement it, and steps you can take in using it for your clients.

Ways Different Types of Organizations Can Benefit from a DIF

Various kinds of organizations and businesses can benefit from a DIF in different ways, as follows:

  • privately or publicly owned enterprises—A DIF shows them how they can improve processes and the impacts of digital investment in the future—that is, improve GVA and return on investment (ROI)—to derive economic and process value.
  • smaller companies—Such companies, including creative small and medium-sized enterprises (SMEs) and startups, want ex ante evidence of an investment’s potential impact to secure funds and gain stakeholder buy-in, enabling them to derive economic, social, and innovation value.
  • government agencies and nondepartmental government bodies (NDGBs)—Such organizations often have to show evidence of the impact of public funds they’ve invested in specific projects and their social and process value.
  • charities, cultural-heritage organizations, higher education, and others—Demonstrating the impact of their previous work helps them in securing grants or other funding for specific future projects—for which social and economic value is central, but innovation value is increasingly important.

The DIF in Action

Table 1 shows a DIF for a hypothetical case that is based on an actual customer scenario.

Strategic Context: A national energy provider is looking to build a novel artificial intelligence (AI) chatbot to:

  • Respond to customer queries in a fun, witty manner.
  • Help employees to focus on more challenging inquiries.
  • Increase customer interest in and excitement about the brand.
  • Provide competitive advantage via intelligent upselling of products.

Table 1—DIF for a national energy provider’s investment in AI

Inputs: £ budget, staff time, and resources to create the chatbot

Activities: Research, planning, ideation, design, development, training, and monitoring the launch of the chatbot

Economic
  • Outputs: £1m in new sales and turnover; 10 new FTEs; £1m of sales retained that would otherwise have moved to a competitor
  • Outcomes: Indirect and induced turnover and FTE outcomes—increased turnover and FTEs lead to increased spending in the region
  • Impacts: £5m in regional GVA generated, considering net additional impacts—above what was projected; ROI and cost-benefit analysis (CBA)

Social
  • Outputs: 5,000 users successfully completing tasks; 1,000 users starting new, relevant sales journeys
  • Outcomes: Improved user experience; more successful usability testing; user confidence up 20%; task completion up 25%; 20 new stakeholders supporting the work
  • Impacts: Wider audience activation—social listening shows a 50% increase in positive sentiment; 5 stakeholders in other organizations inquiring about or supporting the work

Process
  • Outputs: 5 staff trained in using the chatbot internally; maintenance, up-skilling, and cross-skilling
  • Outcomes: Staff confidence increased by 30% and job satisfaction by 20% through understanding of the resource and how to use it
  • Impacts: Increased staff effectiveness in working on day-to-day leads and savings of £50k per annum

Innovation
  • Outputs: 1 novel, witty AI chatbot launched, integrating upselling; 1 patent application
  • Outcomes: The chatbot project generates 5 spin-off ideas through iterative improvements—drawing on multiple innovation types
  • Impacts: Chatbot AI used by 3 departments; AI adopted by 1 external organization

For larger programs of work, we would consider all four value drivers. However, in typical use, we would work with our client to determine the two key drivers on which they should focus—that is, those drivers that are most relevant to their project.
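
To show how the economic figures above could feed the ROI and cost-benefit analysis the Economic impact cell mentions, here is a back-of-envelope sketch. The £500,000 budget is purely an assumption for illustration, because Table 1 does not specify the project’s cost; the benefit figures echo the table’s economic and process rows.

# All figures are hypothetical. The £500,000 budget is an assumption for
# illustration only; the benefits echo Table 1's economic and process rows.
budget = 500_000            # assumed project cost (Table 1 leaves the budget unspecified)
new_sales = 1_000_000       # £1m in new sales and turnover
retained_sales = 1_000_000  # £1m of sales retained rather than lost to a competitor
annual_savings = 50_000     # £50k per annum in process savings

total_benefit = new_sales + retained_sales + annual_savings
benefit_cost_ratio = total_benefit / budget   # 4.1
roi = (total_benefit - budget) / budget       # 3.1, or 310%

print(f"Benefit-cost ratio: {benefit_cost_ratio:.1f}; ROI: {roi:.0%}")

Dividing total benefits by cost gives a simple benefit-cost ratio; subtracting the cost first gives ROI. A fuller CBA would also discount benefits that accrue over several years and strip out impacts that would have happened anyway.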

Good Times to Measure the Impacts of a Project

Ideally, the impact tools should be in place at the outset of a project. However, there are advantages to measuring impacts at all stages of a project, as follows:

  • beginning—Assessment allows organizations to determine a project’s expected impacts and define targets.
  • middle—Mid-term evaluation lets organizations gauge the impact so far and capture lessons learned as they work toward project completion.
  • end—Assessing overall impact and future sustainability allows organizations to tap into live impact data.

Starting a DIF Assessment

To launch a DIF assessment, follow these steps:

  1. Through secondary research and client meetings, understand the context and rationale for the digital product, project, or resource you’re undertaking.
  2. Hold a workshop on one or two high-priority impact drivers for the project and client—for example, innovation, internal process improvement, or social or economic value.
  3. Establish workable, SMART metrics for outputs, outcomes, and impacts. Consider the counterfactual—things that might happen anyway. (A sketch of one such metric follows this list.)
  4. Devise monitoring methods, tools, and processes and put them in place—within either the agency or the client organization—to gather data as the project proceeds.
  5. Conduct user research—for example, surveys, interviews, and usability testing—to gather and analyze data.
  6. Report your findings—returning to the client at six, twelve, and eighteen months—to update your data analysis and reengage them.
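
As an illustration of step 3, the following sketch shows one way you might write down a single SMART metric and its counterfactual adjustment. The metric name, values, and dictionary structure are assumptions chosen for illustration only.

# Hypothetical SMART metric for the chatbot scenario; every number is an
# illustrative assumption, not a figure from the article.
task_completion = {
    "name": "Task-completion rate",
    "baseline": 0.60,                      # measured before the chatbot launch
    "target": 0.75,                        # specific, measurable target
    "deadline": "12 months after launch",  # time-bound
    "counterfactual": 0.05,                # improvement expected even without the project
}

observed = 0.78  # rate measured at the follow-up review

# Net improvement attributable to the project: observed change minus the counterfactual.
net_attributable = (observed - task_completion["baseline"]) - task_completion["counterfactual"]
print(f"Net attributable improvement: {net_attributable:.2f}")  # 0.13, or 13 percentage points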

Learning More

Are you at an early stage in considering new digital practices? Do you need to map the trajectory of your work? Are you a well-established organization seeking to demonstrate your wider societal impacts? We’re excited to hear from more businesses and leaders who want to explore their organization’s digital culture and deliver excellent experiences. 


Tim Dixon

Senior UX Consultant at Nomensa

Bristol, UK

Tim is an expert in quantitative research methods and has carried out digital impact evaluations for a range of public, private, charity, and higher-education sector clients. In 2007, he completed his doctorate in human factors psychology and applied vision. Through lab-based testing, qualitative trials, and computational assessment, he has explored the usability of fused images and videos for a range of scenarios and user groups and generated a wealth of research findings. More recently, he has been exploring the world from the perspective of experimentation and technology. With a love of data and analysis, Tim has sought ways to better grasp the digital world of the present and future, leveraging the strategic understanding he has derived from impact evaluation and user research. Tim presented on the Digital Impact Framework at UX Bristol 2017 and is looking to take his measured approach to digital impact assessment forward with Nomensa in 2018.

