Designing with Behavioral Economics

Innovating UX Practice

Inspirations from software engineering

A column by Peter Hornsby
June 7, 2010

Much of economic theory is based on the premise that people are rational decision-makers. In recent years, behavioral economics has emerged as a discipline—its application to financial markets is known as behavioral finance—bringing together economics and psychology to understand how social, cognitive, and emotional factors influence how people make decisions, both as individuals and at the market level. Many of the findings of behavioral economics have a direct bearing on how users interact with a product. In a worst-case scenario, a product’s design may encourage user behaviors that are detrimental to users’ best interests.

To understand this, let’s take a look at the video of the Selective Attention Test shown in Figure 1 and follow the voice-over instructions.

Figure 1—Selective Attention Test

This video demonstrates some of the flaws in human perception: We miss a lot of what is going on around us. Some participants in the original experiment refused to believe they were looking at the same video clip when they saw it a second time.

When we create personas to better understand our users, behavioral economics—like psychology—can help us to understand the common baseline for human behaviors and capabilities, particularly as they relate to decision-making. Let’s look at some examples of this in action and see how we can use behavioral economics to design more effectively.

Opt-in Versus Opt-out Questions in Web Forms

Rates of organ donation vary remarkably between countries: in some, the rate hovers around 80%; in others, around 20%. Within a given country, however, there is very little deviation. It turns out that the key factor is not culture or religion or any of the other explanations that might seem apparent. The most important factor is the way a form poses the question about organ donation. Where donation rates hover around the 20% mark, people must opt in to donate their organs; in countries with donation rates around 80%, people must opt out of donation.

Deciding whether to donate one’s organs after death is a complex and important decision, and this is a decision most people simply don’t want to have to make. They need to consider the impact their decision will have on their family and friends and confront their own attitudes toward giving their organs. These are considerations that are applicable to all users. Designing a Web form that asks users complex, personal questions requires a designer to make ethical choices and protect users’ best interests.
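The mechanics of the opt-in/opt-out effect are easy to sketch. If we assume—purely for illustration, with made-up numbers—that only a fixed fraction of users actively overrides whatever default a form presents, then the default alone largely determines the aggregate outcome:

```python
# Hypothetical sketch: most users keep whatever default a form presents,
# so the designer's choice of default drives the aggregate outcome.

def donation_rate(default_is_donor: bool, override_rate: float = 0.2) -> float:
    """Share of users who end up registered as donors, assuming a fixed
    fraction of users actively overrides the form's default."""
    keep_rate = 1.0 - override_rate
    return keep_rate if default_is_donor else override_rate

# Opt-out form (donor by default) versus opt-in form (non-donor by default):
opt_out = donation_rate(default_is_donor=True)   # ~0.8 registered as donors
opt_in = donation_rate(default_is_donor=False)   # ~0.2 registered as donors
```

The `override_rate` here is an assumption for illustration, not a measured figure, but the pattern matches the 80%/20% split described above: the same population, two very different outcomes, driven by the default.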

Problems of Excessive Choice

One of the reasons the Web has done so well as a sales medium is the abundance of choices e-tailers can offer. However, a study of jam shopping demonstrated the hidden risks of offering too much choice. Experimenters found that when they offered shoppers a choice between a larger and a smaller assortment of jams, people showed much greater interest in the larger assortment, and many more shoppers came over to browse. But this greater interest did not translate into additional sales: shoppers were ten times more likely to make a purchase when there were only six types of jam on offer rather than 24.

In another study, researchers looked at participation in 401(k) retirement plans and observed a similar pattern. These plans offer huge incentives to participate, including tax breaks and employer contributions, but they typically also present a huge amount of choice. With just two investment choices, 75% of employees participated; with 69 choices, only 60% did. The 401(k) study demonstrated problems similar to those people encounter with organ-donation forms—considering one’s own mortality, as well as an overabundance of choice.

A UX designer can manage problems of excessive choice in several ways—for example, by providing good defaults. However, as with the problem of opt-in versus opt-out questions in forms, it is essential to consider the use of defaults within an ethical framework. Designing an effective information architecture for a Web site or using design patterns such as Faceted Search can also help in managing complexity.
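As one minimal sketch of the faceted-search pattern—with hypothetical product data and field names—each facet a user selects narrows the candidate set, so users face a sequence of small choices rather than one overwhelming list:

```python
# Hypothetical faceted filtering: each selected facet narrows the
# candidate set, turning one large choice into a few small ones.

products = [
    {"name": "Strawberry jam", "fruit": "strawberry", "organic": True},
    {"name": "Raspberry jam", "fruit": "raspberry", "organic": False},
    {"name": "Strawberry conserve", "fruit": "strawberry", "organic": False},
]

def facet_filter(items, **facets):
    """Keep only the items matching every selected facet value."""
    return [i for i in items if all(i.get(k) == v for k, v in facets.items())]

strawberry = facet_filter(products, fruit="strawberry")
# 2 candidates remain
organic_strawberry = facet_filter(products, fruit="strawberry", organic=True)
# 1 candidate remains
```

The data and facet names are invented for illustration; the point is the progressive narrowing, which keeps each decision small even when the full catalog is large.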

Value Judgments

Particularly in unfamiliar situations, people make value judgments based on the information available, but they do not weight all information equally. Dan Ariely provided a great example involving The Economist, which offered three types of 1-year subscriptions, as follows:

  • a Web-only subscription, for $59
  • a print subscription, for $125
  • a print and Web subscription, for $125

Why offer a print subscription on its own at all? People can be very bad at judging the value of things, particularly things they buy infrequently, so they rely on contextual information to understand when they are getting a good deal. Ariely conducted an experiment in which he presented these three options to a group of 100 MBA students: 84% chose the print-and-Web subscription, with all others choosing the Web-only option.

He then conducted a second study with a different group of 100 MBAs, presenting only two options:

  • a Web-only subscription, for $59
  • a print and Web subscription, for $125

Now, only 32% chose the print-and-Web subscription. With three options available, people anchored on the print-only subscription, which made the print-and-Web subscription look much, much better by comparison. They didn’t know whether $59 was a good deal for a Web-only subscription, but choosing between just two options was easy!
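The business impact of the decoy is simple arithmetic. Using the choice shares reported above—84% choosing the $125 bundle with the decoy present, 32% without it—a back-of-the-envelope sketch of expected revenue per subscriber, ignoring everything except the two prices, looks like this:

```python
# Expected revenue per subscriber under the two option sets,
# using the $59 Web-only and $125 print-and-Web prices above.

def expected_revenue(bundle_share: float,
                     web_price: float = 59,
                     bundle_price: float = 125) -> float:
    """Average revenue per subscriber, given the share choosing the bundle."""
    return bundle_share * bundle_price + (1 - bundle_share) * web_price

with_decoy = expected_revenue(0.84)     # 0.84 * 125 + 0.16 * 59 ≈ 114.44
without_decoy = expected_revenue(0.32)  # 0.32 * 125 + 0.68 * 59 ≈ 80.12
```

On these numbers, the apparently useless print-only option raises expected revenue by roughly 43%—which is why it was there.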

UX designers frequently hear variations on this objection: “But we have smart users!” They may be smart, but the basic wiring of people’s brains is always the same. People make judgments based on the information available to them, and UX designers control the information that a Web form presents.

The Role of Context

To a huge degree, people understand the world through other people. Herding refers to the tendency of people to follow the behavior of a larger group. The social context of a Web site—its user-generated content—helps people to understand the site. For example:

  • Amazon customers can provide reviews that give other customers an understanding of the value of a product, and other customers can rate those reviews for usefulness. Crucially, customers perceive greater value in information from independent sources than in information manufacturers provide on Amazon. (Some companies exploit this reality by posting positive reviews of their own products. See “Is iFlorist the greatest website in the universe, ever?” for an amusing example.)
  • Nike and Garmin both offer systems that capture data for runners. People can use this information to measure how well they’re doing—either relative to others or in regard to their own improvement over time. (Wired wrote a terrific article, “The Power of Personal Metrics.”)
  • Facebook provides one of the most shameless examples of herding. Not only does Facebook hide the deactivate link—there is no close account link—in the account settings, but it also presents a page showing users several photos of their friends, with text that says Fred will miss you.

UX designers have a responsibility to use herding in a way that both reduces the likelihood that users with vested interests will game a system and supports the interests of other users. For example:

  • Be mindful of how you present user feedback—for instance, allow users to rate others’ comments, and present the highest-rated comments first.
  • Understand what facets of the available data are important to users and will support them in making decisions.
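As a minimal sketch of the first point—with hypothetical comment data—surfacing the highest-rated feedback first is a simple sort on the votes users have already cast:

```python
# Hypothetical comment data: users rate one another's comments,
# and the display surfaces the highest-rated comments first.

comments = [
    {"text": "Great jam, but too sweet for me", "helpful_votes": 3},
    {"text": "Arrived broken",                  "helpful_votes": 12},
    {"text": "Exactly as described",            "helpful_votes": 7},
]

def by_helpfulness(items):
    """Order comments so the most helpful ones appear first."""
    return sorted(items, key=lambda c: c["helpful_votes"], reverse=True)

top_comment = by_helpfulness(comments)[0]["text"]  # "Arrived broken"
```

The design choice matters here: ranking by independent users’ votes, rather than by recency or by the seller’s preference, is what makes the herd signal harder for a vested interest to game.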


Some of the behaviors I’ve described in this column are probably familiar to readers who have a background in psychology. Even so, the impacts of these behaviors are likely to be surprising, even counterintuitive. For example, we live in a culture in which more equals better, so people typically consider more choice a positive thing. However, even smart people make bad decisions, and the small design decisions we make can have a huge impact on user behavior. Understanding the principles I’ve outlined in this column helps us to design systems in a way that supports the best interests of users.


References

Ariely, Dan. Predictably Irrational: The Hidden Forces That Shape Our Decisions. London: HarperCollins, 2008.

Iyengar, Sheena S., and Mark R. Lepper. “When Choice Is Demotivating: Can One Desire Too Much of a Good Thing?” Journal of Personality and Social Psychology, June 2000.

Johnson, Eric J., and Daniel Goldstein. “Medicine: Do Defaults Save Lives?” Science, November 2003.

Director at Edgerton Riley

Reading, Berkshire, UK

Peter Hornsby has been actively involved in Web design and development since 1993, working in the defense and telecommunications industries; designing a number of interactive, Web-based systems; and advising on usability. He has also worked in education, in both industry and academia, designing and delivering both classroom-based and online training. Peter is a Director at Edgerton Riley, which provides UX consultancy and research to technology firms. He has a PhD in software component reuse and a Bachelors degree in human factors, both from Loughborough University, in Leicestershire, UK. He has presented at international conferences and written about reuse, eLearning, and organizational design.
