Envisioning the Future of User Experience

By Paul J. Sherman

Published: April 9, 2007

Welcome to my UXmatters column—Envision the Future. In this column, I will share my perspectives on the role UX professionals will play in the future and answer a few forward-looking questions about the field of user experience such as:

What is the future of user experience as a practice, as a philosophy of design, and as a research topic?

What are the challenges and opportunities facing UX practitioners as we strive to better integrate our methods, processes, and philosophies into traditional ideation, design, and development processes?

“User experience happens whether someone has designed the elements influencing a user’s experience thoughtfully or accidentally.”

These are big questions. User experience happens whether someone has designed the elements influencing a user’s experience thoughtfully or accidentally. Anywhere there’s a user interface, there’s an interaction waiting to happen and a user experience about to occur.

I intend to explore these areas of inquiry in several ways. I’ll write about particular topics relating to the questions I’ve posed and carry out interviews and discussions with a variety of people in our field—from the visionaries to the UX managers and individual contributors who daily create and validate the user experiences of products and services.

One other thing: I’m going to try my absolute best to avoid the platitudinous, self-important pontification that can afflict commentators in our field. Feel free to slap me down if you think I’m straying close to this line—or cross it. Okay, enough with the meta-stuff.

Here are just a few questions I think practitioners in our field should be asking themselves and that I intend to explore through this column, with your help.

The User Experience: Is It All in People’s Heads?

Isn’t the user experience something that really occurs only between the ears? Certain popular definitions imply this, but never explicitly state it. Here’s one straw-man definition from Wikipedia: “the overall experience and satisfaction a user has when using a product or system.”

If the user experience is all in the mind, can we really design it? Aren’t we just influencing the user experience rather than designing it? At first blush, this question seems overly academic, but maybe it’s worth exploring. Can we really work on something if it’s not entirely clear to us what’s under our control and what is not?

Why Do Users Blame Themselves?

“We fall victim to the fundamental attribution error when we overattribute other people’s behavior to their personalities.”

Some of the most robust experimental findings in social psychology relate to attribution theory, the fundamental attribution error, and the actor-observer bias.

Attribution theory states that we have a strong—and some say possibly hardwired—propensity to attribute other people’s actions to stable factors that are intrinsic to an individual person. This is known as the fundamental attribution error. We fall victim to the fundamental attribution error when we overattribute other people’s behavior to their personalities. In this situation, we usually tell ourselves that other people are behaving a certain way, because, well, that’s just the way they are. The flip side of the fundamental attribution error is that we tend to attribute our own actions to the particulars of the situation we happen to be in.

Many researchers refer to these dual tendencies toward overattributing others’ actions to personality and our own actions to situational factors as the actor-observer bias.

But here’s the thing: Over and over, I’ve seen users—regular people, not technical types—overattribute their difficulty in using computers or technology to something about themselves: “I must be dumb, because I can’t figure out how to do this,” “Wow, I really must be clueless, I can’t find the button I’m supposed to click,” and so on. I know many of you have observed the same thing. Quite often, this self-blame is associated with increased negative affect. In other words, time and time again, people experience difficulty, blame themselves, and pretty soon start feeling bad.

Why do people blame themselves when they run into usability problems? The actor-observer bias predicts that people struggling with a user interface will attribute their difficulty to external, not internal, factors. This reversal of a robust finding in experimental psychology is striking. If we knew what accounted for this phenomenon, might we be better able to design user interfaces so users don’t blame themselves and feel bad about themselves and technology?

Usability Interrupted

We know that there is one constant across almost all work environments: interruptions and distractions. Yet when we test products for usability, we often do so under laboratory conditions, with interruptions intentionally eliminated or kept to a minimum.

Does this make sense? If the prevailing context of use for many business applications is rife with interruptions and distractions, should we consider this factor when designing interactions? Would studying how users fare with work-like interruptions and distractions yield valuable data for product design teams? Should there be guidelines for the design of applications people are likely to use under conditions where there are frequent interruptions? Just the fact that I’m asking these questions means that I think there should be. But this raises a larger question: Just how does one go about designing an application for the distracted?

The Nested Folder Metaphor: No Longer Sufficient

“I’m devoting too much cognitive effort to maintaining a mental map of my main computer’s folder structure.”

I don’t know about you, but I often struggle to remember where I put my documents and other digital objects. And I’m constantly making on-the-fly taxonomic decisions—and, inevitably, consistency errors—when I create new nested folder structures. Here’s just one example: Prior to 2003, I was organizing my family pictures in folders named by month and year. That’s all well and good, but evidently I forgot about that taxonomic decision, and in 2004, I started naming my folders according to a different scheme.

Now, there probably won’t be huge practical implications of this inconsistency—or will there be? Who knows what’ll happen ten years from now, when I try to revisit my by-then-ancient digital photos. Will I be able to find them?

Bottom line: I’m devoting too much cognitive effort to maintaining a mental map of my main computer’s folder structure. Sure, I bet I could find a great new scheme for organizing my data on lifehacker.com or 43 Folders, but the point is that I’d still be adjusting myself to work within the confines of this decades-old organization scheme.

There’s got to be a better way. Google is trying to get the world to adopt its desktop tools for content retrieval. But keyword search is so after the fact. Is there a better, more natural way of organizing information on a multi-purpose computing device such as a PC? Or maybe we need multiple schemes—like folksonomy construction through tagging in combination with various other ways of organizing content.
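To make the tagging idea concrete, here is a minimal sketch, in Python, of what folksonomy-style organization could look like on the desktop: each file carries several tags instead of a single folder path, and retrieval intersects tags. The `TagIndex` class, file names, and tags are all illustrative, not a description of any real tool.

```python
# A minimal sketch of tag-based file organization: instead of one
# rigid folder path, each file carries several tags, and retrieval
# intersects them. All names here are illustrative.

from collections import defaultdict

class TagIndex:
    def __init__(self):
        self._files_by_tag = defaultdict(set)

    def add(self, path, tags):
        # A file can live under many tags at once -- there is no
        # single taxonomic decision to remember years later.
        for tag in tags:
            self._files_by_tag[tag.lower()].add(path)

    def find(self, *tags):
        # Return the files carrying ALL of the given tags.
        sets = [self._files_by_tag[t.lower()] for t in tags]
        return set.intersection(*sets) if sets else set()

index = TagIndex()
index.add("IMG_1042.jpg", ["family", "2002", "vacation"])
index.add("IMG_2311.jpg", ["family", "2004"])
index.add("budget.xls", ["finance", "2004"])

print(index.find("family", "2004"))  # {'IMG_2311.jpg'}
```

The point of the sketch is that inconsistent folder-naming schemes stop mattering: the 2002 and 2004 pictures are both reachable through the `family` tag, regardless of how their folders were once named.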

Raising Your Voice

“Every last one of us has had at least one bad experience with an interactive voice response system….”

Speech-enabled user interfaces are becoming more prevalent as organizations try to leverage speech recognition as a means of controlling labor costs. Every last one of us has had at least one bad experience with an interactive voice response (IVR) system or speech-recognition-capable application. It seems that the voice user interface (VUI) world needs as much help now as the graphical user interface (GUI) world did fifteen years ago. How are good VUI experiences created, and who’s doing the work of improving the VUI experience?

Pardon Me While I Reboot My Pants

In a similar vein, there are many as-yet-unsolved issues in the area of pervasive/ubiquitous computing. As the computer—and our means of interacting with it—move off the desktop and into the world at large, the UX community is challenged on several fronts. What are the special challenges of designing user experiences for, say, an augmented-reality, location-aware pair of sunglasses?

I look forward to exploring these and other questions about the future of user experience in this column, but I’ll need your help. Do you have questions you think UX practitioners should be addressing? If so, feel free to suggest them by joining the discussion here. And please feel free to share your perspectives on the future of user experience as well.

In the meantime, I’d like to explore one issue in detail in this month’s column. I call it the problem of the perpetual super-novice.

The Perpetual Super-Novice

“What can applications, operating systems, and Web sites do to better facilitate a person’s progression from novice to expert usage?”

After becoming familiar with computers, desktop applications, and the Web, many people continue using the same inefficient, time-consuming, mouse-driven interaction styles. Others fail to discover shortcuts and accelerators in the applications they use. Why? What can applications, operating systems, and Web sites do to better facilitate a person’s progression from novice to expert usage?

Let’s take a moment and define our terms before we go any further. Here are three classifications for levels of user expertise that I typically employ when thinking about this issue:

  • The beginner—The beginning user has never or rarely used an application, device, or product before. For beginners, almost every interaction with a system is exploratory. Their physical movements—and/or the on-screen representations of their movements—are mostly explicit and thoughtful. In this phase, users are trying to figure out what a system does and how they can use it. They are actively creating and modifying their mental models.
  • The novice—The novice user has ascended the learning curve somewhat. Novice users have committed certain basic operations of a system to memory—cognitive or muscle memory. They are comfortable within a circumscribed area of a system’s total functionality. Their mental model of how and why a system behaves as it does is by no means complete—and in fact, might be quite inaccurate. But their limited knowledge has no adverse effects, so long as novice users stay within their comfort zone. If novice users need to learn a new area of functionality, their behavior reverts to that of a beginner while learning.
  • The expert—The expert user not only has mastery over many aspects of a system; their mental model of the system is complete and accurate enough that learning a new area of functionality occurs rapidly and easily. Expert users not only know a system; they know how to learn more about the system.
“People become experts when there is a strong extrinsic motivation to do so.”

Certainly, people become experts when there is a strong extrinsic motivation to do so. For example, you might learn how to do a mail merge in your word processing application, because, well, the boss just asked you to do a mail merge. But in the absence of extrinsic motivation, it seems that many people stay novices or, at most, become a form of knowledgeable novice that I call the “super-novice.” Super-novices know a lot about the little part of a system they’re used to and almost nothing about the other parts.

The thing is, in other domains, people’s mastery of a system tends to expand naturally as they become more and more experienced with it. For example, a novice bicycle rider at first sticks to the basics—mount, pedal, brake, turn, dismount. As new riders gain more experience, they tend naturally to explore the capabilities and performance characteristics of the bicycle. Can I hop a curb? What happens if I lock the brakes? How easily can I pop a wheelie?

Most desktop and Web applications do a so-so job of encouraging exploration and increased mastery. Can we do a better job of helping people ascend the learning curve? How? These questions become especially interesting in light of the fact that many systems can easily track how and in what order people perform certain interactions. Are there teachable moments that user interface designers can exploit to encourage exploration and help turn novices into experts? Taking this approach would seem to blur the traditional hard line of demarcation between the application itself and its associated user assistance. Maybe it’s time that line should get blurred.
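As a thought experiment, one way an application could exploit such a teachable moment is to count how often a user reaches a command through the menus and, past some threshold, surface its keyboard shortcut exactly once. The sketch below is a hypothetical mechanism in Python; the command names, shortcuts, and threshold are invented for illustration.

```python
# A sketch of one "teachable moment" mechanism: count menu-driven
# invocations of a command and, past a threshold, surface the
# keyboard shortcut once. All names here are illustrative.

from collections import Counter

SHORTCUTS = {"paste-special": "Ctrl+Alt+V", "mail-merge": "Alt+M"}
HINT_THRESHOLD = 3  # menu uses before we offer the shortcut

class ShortcutCoach:
    def __init__(self):
        self._menu_uses = Counter()
        self._hinted = set()

    def record_menu_use(self, command):
        # Returns a hint string at the teachable moment, else None.
        self._menu_uses[command] += 1
        if (command in SHORTCUTS
                and command not in self._hinted
                and self._menu_uses[command] >= HINT_THRESHOLD):
            self._hinted.add(command)  # hint only once, never nag
            return f"Tip: press {SHORTCUTS[command]} for {command}."
        return None

coach = ShortcutCoach()
hints = [coach.record_menu_use("paste-special") for _ in range(4)]
print(hints)  # the hint appears on the third menu use only
```

The design choice worth noticing is the `_hinted` set: a system that teaches at the right moment but never repeats itself feels like assistance, while one that nags on every use would push users back toward ignoring it.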


Your first question is, to me, the best of the lot. Is the user experience only in one’s head? If so, how could that ever be “designed”?…

The real question needs to be how to ensure a better user experience—wherever it resides.

The simple answer is interacting with and working with users more closely and frequently to better understand and capture what is in their heads. This always translates into better planned and designed sites and products. Ironically, this also seems to be the most neglected aspect.

One comes before the other, and one needs to be done right to get the other right.

J. O’Brien—I appreciate your comments. You said, “The simple answer is interacting with and working with users more closely and frequently to better understand and capture what is in their heads.” My question to you would be: Are the methods we currently use in our field able to do this well enough?

I recall that, a few months back, Chauncey Wilson did a “30 methods in 30 days” exercise on the STC usability listserv. I remember thinking that some of the methods he highlighted are used only rarely. Yet some would be incredibly effective and about as efficient as any other widely used technique.

I know that I often get myself in a rut when it comes to methods—contextual inquiry, card sort, design, usability test, repeat. Yet we know from the literature that method bias exists, and the best path to truth and beauty lies in approaching a problem from different angles, with different methods.


I think the argument of the super-novice is a very accurate one. The more I have seen my co-workers and myself become more fully integrated, the more I am able to speak to their specific areas of expertise. However, I would never say that I could take over their job functions or the interfaces that they use, as those are their specific areas of expertise.

To your point on the nested folder metaphor, I actually wrote an article regarding natural language processing recently.

For more information regarding some of the other topics in this article, I would also read Don Norman’s book The Design of Everyday Things.

Hi Jeff, I will go to the link you supplied on natural language processing.

I’ve read—and liked very much—Norman’s seminal book.

As with traditional user interfaces, where we must be careful not to exclude those with certain types of special needs, a similar challenge exists for speech-enabled interfaces: what of those with a stutter or other speech impediment?

I would definitely like to see the 30 Methods in 30 Days article.

I would say there could be other ways to better get at the user information. However, it would also be better if the proper techniques were used in the right ways.

For example, contextual inquiries do not get done often enough. Focus groups are used improperly. Leading questions are asked during various inquiries, and so on. If we just worked to do these things better, we could get better data, too.

Sokha, you are absolutely right. I’m no accessibility authority, but I know my way around the WAI recommendations and Section 508. The general point I extract from accessibility writings is that every new advance in user experience brings both the possibility for greater access and the risk of more exclusion for people with disabilities. (My wife works in the VUI/IVR/speech recognition area. I’ll ask her about accessibility at dinner tonight…:-)

J. O’Brien, I also agree with your points. Regarding the “30 methods in 30 days” information, it wasn’t an article; it was a series of posts to the STC Usability and User Experience SIG list. I don’t recall if they have archives, but you can ask to join here.

Hello Paul,

My area of research is transmedia entertainment, and so I’m interested in inter-platform traversal design. I’ve been able to find very little—nothing?—that deals with this issue specifically. You touch on the domain in your section on rebooting pants when you mention pervasive computing, but concentrate on—you could say—interfaceless interfaces rather than multi-platform experiences.

For your interest, I’ve sketched some ideas in this primer document: Patterns of Cross-Media Interaction Design: It’s much more than a URL [pdf].

I’d love to hear your and any of the readers’ thoughts on this area or any reading you recommend.

“Inter-platform traversal design”, eh? I have to say, I’m intrigued by that phrase. I’ll check out the link.

Thanks, Paul

Hi Paul,

This is in response to the section here about nested folders and difficulty in finding files hidden behind inconsistently named folders.

I would like to point you to FoundIt (www.artbrush.net/foundit), a tool that we have designed to overcome some of the findability issues relating to desktop files and documents. This tool is based on the Google Desktop API and enhances Google Desktop in several ways: it allows for a persistent view of all of your files; it allows for tagging of your files; it allows you to create new documents based on chunks of existing files. Take the product tour.


Hi Paul, I am new to the forms and page layouts of Web-based applications. Can you please let me know how to improve my knowledge of those topics?

Thanks in advance.

