The Ethics of User Experience Design

Innovating UX Practice

Inspirations from software engineering

A column by Peter Hornsby
November 20, 2017

About six months ago, I left Facebook cold turkey. I had tried leaving it before, but always ended up going back. It wasn’t Facebook’s not‑so‑subtle hints to return or wanting to see the pictures of all my friends—or at least people with whom I’d connected on Facebook—that drove me back, but fear of missing out (FOMO). What if something happened to someone, and I wasn’t aware of it? What if I thought of a witty one-liner and couldn’t share it immediately with a group of people, then bask in the adulation they would inevitably provide in response to my genius? What about that important political opinion about Trump or Brexit that I’d need to share among my fellow right‑thinking people?

You know what happened? Nothing. Nothing bad, at any rate.

Breaking Free of Addictive User Experiences

I had decided that the price I was paying for being on Facebook was too high. I’d get drawn into arguments and find myself getting annoyed and frustrated. Someone is wrong on the Internet! I’d find myself checking my account far too frequently. There’s nothing inherently wrong with either of those things—except that they took time away from other, more important things that I felt I should be doing. I’d also read an excellent piece about why one person left World of Warcraft.


My take? Life is short. If we’re lucky, we get three score years and ten—maybe more, thanks to modern medicine. Once we figure out what we want to do with our lives, we should take the time to reflect on how to achieve our goals, then get on with it. I have friends, a family, and work commitments. So, after some reflection, I ultimately decided that the time I was spending on Facebook was time that I would rather be spending elsewhere. Looking at the time I spent on Facebook objectively, it was probably not a huge amount of time. But subjectively, it felt like I was spending too much time and mental energy on something that wasn’t worth the price—particularly if I got drawn into an argument with strangers or possibly bots. Worse, the emotions I felt while I was engaged with Facebook bled into the rest of my day, affecting how I was feeling.

The Cynicism of Designing Addictive User Experiences

So why am I sharing this decision and baring my soul to you? At some point—at the requirements stage or during design—a group of people cynically decided to create a user experience that deliberately fosters the kind of habit‑forming behavior Facebook stimulates. I’d been willing to entertain the unlikely possibility that the habit-forming elements of Facebook may have come about as unintended consequences. But they didn’t.

Earlier this month in an Axios interview, Sean Parker, Facebook’s first President, admitted:

“The thought process that went into building these applications—Facebook being the first of them to really understand it—that thought process was all about: How do we consume as much of your time and conscious attention as possible. And that means that we need to sort of give you a little dopamine hit every once in a while—because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content. And that’s going to get you more likes and comments. It’s a social-validation feedback loop. … You’re exploiting a vulnerability in human psychology. The inventors / creators understood this consciously, and we did it anyway.”

The Role of Ethics in Design

I have occasionally considered the role of ethics in design. In this column, I’m using the word design in its broadest sense, as an activity and a set of outputs that all participants in the design process shape. While UX designers are not solely responsible for design decisions, personal responsibility is always a good starting point when you’re looking to change the world! Like software engineering, UX design tends to start with a simplified vision of the way the world works, then makes decisions based on that simplification. Any simplification is necessarily reductive. Things get stripped away, and system boundaries are more often implied than drawn clearly. For example, the design process may not have considered the way the emotions Facebook engendered bled into my day or the subjective frustration I felt in response to certain online interactions.

On the whatusersdo Slack community that I sometimes visit, the topic of ethics came up during a discussion of dark patterns. This, in turn, led to speculation about what a Hippocratic Oath for UX designers might look like in practice. I like the idea of something building on the geek heritage of comic books and films—“With great power … comes great responsibility,” “Beware of the Dark Side … anger, fear, aggression, the Dark Side of the Force are they.” But I started thinking instead about an older inspiration: Isaac Asimov’s Three Laws of Robotics, from his short story “Runaround”:

  1. “A robot may not injure a human being or, through inaction, allow a human being to come to harm.”
  2. “A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.”
  3. “A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.”

Adapting these slightly for UX design, we might get:

  1. A UX designer may not injure a user or, through inaction, allow a user to come to harm.
  2. A UX designer must meet the business requirements except where such requirements would conflict with the First Law.
  3. A UX designer must develop his or her own career as long as such development does not conflict with the First or Second Law.

I like Asimov’s approach because it considers not only what to do, but also what not to do. A UX designer cannot be a passive cog in the machine, delivering on requirements without questioning them. A good UX designer must actively question business requirements to understand what is behind them and, where necessary, challenge them. As with other professionals, UX designers may sometimes be offered work that, although intellectually stimulating, raises ethical issues regarding its impact on users. This was the case in the story “Inside the Online Bookies”:

“We didn’t advertise our offers through email campaigns; we offered ‘suggestions on opportunities.’ We didn’t cross-sell; we ‘enhanced the interests of our users.’ Or that’s what we were told, anyway.

“The idea was to create a more bespoke, sophisticated online environment than our competitors. The doublespeak was used to conceal the fact that we were fundamentally guided by the same principles: naked self-interest and an indifference to the social harm our work could cause.”

Applying ethical thinking to UX design cannot be just about the end goal. It requires constant vigilance—regarding not only the explicit consequences of a designer’s work, but also its hidden, unintended consequences.

Peter Hornsby

Director at Edgerton Riley

Reading, Berkshire, UK

Peter has been actively involved in Web design and development since 1993, working in the defense and telecommunications industries; designing a number of interactive, Web-based systems; and advising on usability. He has also worked in education, in both industry and academia, designing and delivering both classroom-based and online training. Peter is a Director at Edgerton Riley, which provides UX consultancy and research to technology firms. He has a PhD in software component reuse and a Bachelor’s degree in human factors, both from Loughborough University, in Leicestershire, UK. He has presented at international conferences and written about reuse, eLearning, and organizational design.
