Augmented Cognition: A Future for UX?
Published: May 6, 2013
“Prediction is very difficult, especially if it’s about the future.”—Niels Bohr
In my last column, I looked at how we could make the Iron Man suit a reality, using existing technologies. Some of the Twitter feedback and comments on that column talked about using brainwaves to control the suit, so I thought it would be interesting to see what is being done in that area.
Prediction, as Niels Bohr noted, can be a dangerous activity, but also a fun one: looking at trends in technology can help us to manage and prepare for uncertainty, or at least give us the illusion of doing so. Historically, in user experience, predictions of the future have been tied up with inventing it; Doug Engelbart’s “Mother of All Demos,” shown in Figure 1, is the most notable example. If you’ve not already seen this video, I strongly recommend watching it. In 1968, Engelbart demonstrated videoconferencing; hypertext; a collaborative, real-time editor; and other technologies that we would not fully realize for decades to come.
Figure 1—Engelbart’s “Mother of All Demos”
In fact, in 1968, using the term user experience would have been met with confusion; the term human factors was far more familiar then. But while terms like user experience, customer experience, human factors, and interaction design hint at a semantic mess, they also illustrate the richness of the fields that user experience can draw upon, as well as through which it can develop. (There is a lovely Venn diagram of the fields that contribute to or make up user experience on Visual.ly, but what is even more telling are the comments pointing out lots of areas that the diagram has missed.)
At their core, computers are tools that we use to expand our cognitive abilities. For example, Doug Engelbart was inspired to develop computers as tools for analyzing and displaying information to collaboratively solve important problems. At one level, user experience is about designing tools that focus users’ attention on elements of interest, providing them with the right level of information to make decisions.
In this column, we’ll look at one possible direction in which interacting with computers could go: augmented cognition. Research into augmented cognition started around 2000 as part of a $70 million DARPA-funded research project, bringing together researchers in academia and industry with members of the armed forces. The goal of research into augmented cognition is:
“To create revolutionary human-computer interactions that capitalize on recent advances in the fields of neuroscience, cognitive science, and computer science. Augmented cognition can be distinguished from its predecessors by the focus on the real-time cognitive state of the user, as assessed through modern neuroscientific tools.”
So, essentially, augmented cognition is about understanding the state of a user’s brain and using that understanding to manage the user’s interaction with a computer. For example, if a user were receiving too much visual information to process it effectively, the system might trigger an audio alert to ensure that they respond to another pressing matter. In this way, the user avoids becoming overloaded with information and is in a better position to act appropriately.
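As a minimal sketch, the mitigation logic in this example might look like the following. The normalized workload score and its threshold are hypothetical placeholders, standing in for values that a real augmented cognition system would derive from EEG or other physiological sensors:

```python
# Hypothetical sketch of overload mitigation: route a new alert to
# the audio channel when the user's visual channel is saturated.
# The workload scale (0.0 = idle, 1.0 = saturated) and the threshold
# are illustrative assumptions, not values from a real system.

VISUAL_OVERLOAD_THRESHOLD = 0.8  # assumed cutoff for "too much visual load"

def choose_alert_modality(visual_workload: float) -> str:
    """Pick the channel for a new alert, given the current estimate
    of the user's visual workload."""
    if visual_workload >= VISUAL_OVERLOAD_THRESHOLD:
        # Visual channel is saturated; use audio so the pressing
        # matter still reaches the user.
        return "audio"
    return "visual"
```

The interesting design question is not the comparison itself but where the threshold comes from: in the DARPA-style systems described here, it would be calibrated per user from real-time neurophysiological data.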
Our first step is to understand the state of a user’s brain. When we talk about measuring brain activity, we generally picture someone’s head wired up to a huge amount of equipment. Indeed, Samsung is using a fairly elaborate headset to develop a device that enables people to operate a computer through brain signals.
While sophisticated systems are in use in research environments, simpler devices are available for home use. On my desk, I have a lightweight headset called the MindWave, from a company called NeuroSky, which I can use to monitor my brainwave signals. It’s reasonably comfortable to wear and, while I confess to being a geek, really fun to use, providing data of sufficient quality to let me interact with a range of applications. As the technology continues to shrink in size and obtrusiveness, we’ll be able to introduce these devices into more fields.
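Consumer headsets of this kind report simple derived metrics rather than raw waveforms; NeuroSky’s devices, for instance, expose an attention value on a 0–100 scale. As a hedged sketch, an application consuming such a signal would typically smooth it before reacting, so that it responds to trends rather than momentary spikes. The smoothing factor below is an assumption, not a vendor recommendation:

```python
# Sketch: exponential smoothing of a noisy attention signal from a
# consumer EEG headset. Assumes readings on a 0-100 scale, as
# NeuroSky-style headsets provide; alpha is an illustrative choice.

def smooth_attention(readings, alpha=0.3):
    """Return an exponentially smoothed copy of raw attention
    readings, so downstream logic reacts to trends, not spikes."""
    smoothed = []
    level = None
    for r in readings:
        # Seed with the first reading, then blend each new reading
        # into the running level.
        level = r if level is None else alpha * r + (1 - alpha) * level
        smoothed.append(level)
    return smoothed
```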
Examples from Research in Augmented Cognition
Research in augmented cognition has followed a number of broad trends, including the following:
- employing adaptive interfaces
- supporting training
- speeding up analysis
Employing Adaptive Interfaces
Probably the area of augmented cognition research that is most relevant to UX designers is the adaptive interface. Such interfaces use neural data and physiological data, such as galvanic skin response, to infer the user’s cognitive state, then use that inference to
- present information in the most effective modality for the user at that moment in time
- manage information overload, particularly for users such as pilots, air traffic controllers, and soldiers on an increasingly wired battlefield
Supporting Training
In training environments, understanding a user’s cognitive workload can help to drive the presentation of information in the most effective way for the user’s skill level. As expertise increases, much information processing becomes automated: if you drive, you probably no longer need to think about managing the gears—just about how quickly and in which direction you want to go. Understanding a user’s mental state as part of a feedback process can enable us to deliver training in a way that challenges learners without frustrating them.
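That feedback loop can be sketched very simply. In this hypothetical example, the comfort band for workload and the integer difficulty levels are illustrative assumptions; a real training system would calibrate them from measured learner data:

```python
# Hypothetical sketch of workload-driven training: raise difficulty
# when the learner is coasting, ease off when they are overloaded.
# The comfort band (0.4-0.7) is an assumed range, not a published value.

LOW, HIGH = 0.4, 0.7  # assumed workload band: challenged but not frustrated

def next_difficulty(current: int, workload: float) -> int:
    """Return the next difficulty level given the current level and
    a normalized workload estimate (0.0 = idle, 1.0 = overloaded)."""
    if workload < LOW:
        return current + 1           # under-challenged: step up
    if workload > HIGH:
        return max(1, current - 1)   # frustration risk: step down
    return current                   # in the sweet spot: hold steady
```

Run after each training exercise, this keeps the learner inside the band where the material is demanding enough to engage but not so demanding that it frustrates.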
Speeding Up Analysis
Although, strictly speaking, neurotechnology came from a separate DARPA program, it has been useful in helping intelligence analysts to search large datasets more effectively, enabling them to keep pace with the ever-increasing volume of available data.
Augmented cognition presents one approach to tackling a number of our current challenges in user experience. The biggest of these is the effort that we put into understanding our users. We can conduct contextual analyses and create personas, user journeys, and other artifacts to help us understand our typical users, but these artifacts capture users in the aggregate, not the cognitive state of the person using our product at a given moment.
We fail if our products put users into information overload. Augmented cognition technologies provide an opportunity to better manage situations in which users are under stress. Obvious examples include people working in safety-critical domains, but also everyday situations in which people are trying to manage multiple, competing demands on their attention.