User Research for Personas and Other Audience Models

User Dialogues

Creating exceptional user experiences through research

A column by Steve Baty
April 27, 2009

This is not going to be an article about personas or even what distinguishes a good persona from a bad one. Instead, this article is about the ingredients we can draw on when creating audience models and some alternative ways of communicating the results of an audience analysis.

First, however, let me briefly discuss what we generally mean when we talk about personas and the role they play in the design and development process.


A Very Brief Introduction to Personas

Personas are archetypal representations of audience segments, or user types, which describe user characteristics that lead to different collections of needs and behaviors. We build up each archetype where the characteristics of users overlap.

According to Alan Cooper, author of About Face 3 with Robert Reimann and David Cronin, “The persona is a powerful, multipurpose design tool that helps overcome several problems that currently plague the development of digital products. Personas help designers:

  • Determine what a product should do and how it should behave.
  • Communicate with stakeholders, developers, and other designers.
  • Build consensus and commitment to the design.
  • Measure the design’s effectiveness.
  • Contribute to other product-related efforts such as marketing and sales plans.”

But where do we start looking for the data we need to build up these useful archetypes?

Research Methods

Several research methods can provide data upon which we can build user archetypes, including

  • surveys
  • ethnographic research
  • interviews
  • contextual inquiries
  • Web analytics


Surveys

Surveys provide a combination of quantitative and qualitative data, depending on how we structure the questions. Their purpose is usually to generate a relatively large data set in an efficient manner. Compared to other methods of research, a survey is a fairly inexpensive activity to undertake. Surveys are also quite flexible—in that you can ask a wide variety of questions and define the formats of the responses you’d like to see. (Good survey questions provide a clear indication to respondents of the type of information desired.)

As with all research techniques, surveys have their downsides. The quality of the data can be patchy, especially when gathering responses electronically. Respondents are more likely to skip questions or provide only superficial, brusque, or incomplete answers. This is particularly true when respondents perceive questions as touching on personal topics.

Survey responses can also be effectively meaningless—for example, when a respondent doesn’t understand a question. Without a researcher on hand to clarify the intent of questions, responses may miss the point, rendering them useless. Good survey questions can mitigate this problem, but cannot remove it entirely.

And of course, we need to analyze the larger volume of data and all the responses to the multiple questions we’ve asked. This can require the use of specialized skills and software, raising the required level of effort. Since our aim is to generate audience segments, we need to perform multivariate analysis on the data, employing techniques like clustering, principal components analysis, and factor analysis. It simply isn’t enough to analyze each question independently and fall for the “average-user fallacy.”
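To make the segmentation step concrete, here is a minimal sketch of clustering survey responses with a bare-bones k-means implementation in Python. The survey questions, 1–7 response scale, and segment interpretations are all hypothetical; in practice, you would use a statistics package and validate the number of clusters against the data rather than fixing it in advance.

```python
import numpy as np

def kmeans(data, k, iterations=100, seed=0):
    """Cluster the rows of `data` into k groups (a bare-bones k-means)."""
    rng = np.random.default_rng(seed)
    # Start from k randomly chosen respondents as initial centers.
    centers = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iterations):
        # Assign each respondent to the nearest center.
        dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its members.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical survey responses: each row is one respondent; columns are
# answers on a 1-7 scale (e.g., frequency of use, comfort with technology,
# importance of price).
responses = np.array([
    [6, 6, 2], [7, 5, 1], [6, 7, 2],   # frequent, confident users
    [2, 2, 6], [1, 3, 7], [2, 1, 6],   # infrequent, price-sensitive users
], dtype=float)

labels, centers = kmeans(responses, k=2)
```

Each resulting cluster becomes a candidate audience segment, and its center describes the typical response profile for that segment—a starting point for a persona, not a finished one.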

We also need to recognize that people are generally fairly poor at reporting reality in surveys, particularly when we’re asking them to report on an event that occurred in the past. Questions such as “On average, how often do you visit X per week?” are bound to result in inaccurate responses. Respondents forget exceptions, make bad estimates, and sometimes invent answers based on vague recollections. To balance our data, we need other research sources with which to cross-reference survey data.

Ethnographic Research

Ethnographic research includes a broad range of contextual, observational research techniques and offers a range of benefits to UX researchers. We can learn something about the context of use for a product or service we’re designing, including the environment, time constraints, and interruptions and distractions people face when interacting with our designs.

We also get to see what people actually do rather than what they say they do, overcoming a common problem with surveys and interviews. We can see the complex, unvarnished reality instead of the sanitized and tidy version people tend to portray in response to a question. And we can gain an understanding of both mundane, day-to-day activities and the more rare, extreme cases. Imagine the difference in the insights we can gain from a series of survey questions asking a theater nurse to describe her job, versus what we would learn by spending a few days following her around and observing her work.

On the downside, ethnographic studies can be time and resource intensive. Such studies require researchers to be on site with participants for an extended period of time—for days, weeks, or sometimes even months. And while the data we collect during such studies is very rich, it can also tend toward the messy, complicating the analysis process. However, ethnographic studies provide an excellent source of real insights into the audience for a product or service we’re designing.


Interviews

In terms of research styles, interviews sit partway between surveys and ethnographic studies: we ask potential or current audience members a series of open or closed questions. Interviews are a good method for gaining insights into users’ opinions, thoughts, and ideas.

In ethnographic studies, researchers look at the actions and behaviors of participants, interpreting what they see rather than asking participants to explain it. The interview format gives researchers the flexibility to explore ideas and motivations that are not accessible to an observer.

Contextual Inquiries

The contextual inquiry research technique combines observation with interview-style question and response. The aim of questions is typically to get participants to explain their actions and, in some cases, we ask participants to think aloud, telling us whatever they are thinking as they work through a task or activity.

The downsides to contextual inquiry are similar to, but less severe than, those of ethnographic studies. To gain sufficient insight, it is necessary to invest time in both the observation and analysis tasks, which represent a substantial effort.

Web Analytics

The user research techniques I’ve discussed thus far focus on the characteristics, needs, and behaviors of individuals. Web analytics let us look at the aggregate effect of these characteristics in action.

Web analytics offer us a view of what happens when people visit our Web site or use our online service. We can identify peaks and troughs in usage and other patterns such as trends and cycles, as I described in my recent column on UXmatters, “Patterns in UX Research.” Web analytics can provide insights we can use in creating our personas—such as activity cycles for different groups of users, information-seeking behavior, and more.
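As a simple illustration of mining analytics data for activity cycles, the following Python sketch counts page views by audience segment and hour of day. The log excerpt and segment names are hypothetical; real analytics exports are far larger and messier.

```python
from collections import Counter
from datetime import datetime

# Hypothetical excerpt from a page-view log: (timestamp, user segment).
# In practice, these records would come from your analytics tool's export.
page_views = [
    ("2009-04-20 09:15", "researcher"),
    ("2009-04-20 09:40", "researcher"),
    ("2009-04-20 21:05", "hobbyist"),
    ("2009-04-21 10:02", "researcher"),
    ("2009-04-21 22:30", "hobbyist"),
    ("2009-04-22 21:47", "hobbyist"),
]

def activity_by_hour(views):
    """Count page views per (segment, hour-of-day) to surface usage cycles."""
    counts = Counter()
    for stamp, segment in views:
        hour = datetime.strptime(stamp, "%Y-%m-%d %H:%M").hour
        counts[(segment, hour)] += 1
    return counts

counts = activity_by_hour(page_views)
```

Peaks in these counts (business hours for one segment, evenings for another) are exactly the kind of pattern that can feed a persona’s context of use.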

Note that user research lets us understand why people might not be using our product or service, while Web analytics tells us only about those who are using it already.

It’s also worth noting that, for many UX researchers, the likely reality is that their current Web analytics fall short of the objectives I’ve just outlined.

Other Sources of Information

As I’ve discussed previously, in my UXmatters column “Closing the Communication Loop,” we can gain some insights into the concerns and needs of our users through channels such as call centers and stories from our sales staff.

Because such sources are removed from the primary source—our actual users—we need to treat this data with some caution. However, it can highlight issues users may not bring up themselves.

Some Advice on Creating Personas

The list of research activities and data sources I’ve presented here is by no means exhaustive. However, these are some of the most commonly used methods and richest sources of information to help you build your personas.

One important thing to consider about these different research techniques is that each of them is good in certain ways and can provide insights into different characteristics of your audience. A common refrain among UX practitioners who create personas is to draw upon as many different sources of data as you can. This helps you create a much richer representation of each different persona, but also helps you arrive at a much stronger set of personas. Each data source has its own built-in bias, so combining data sets helps mitigate that bias.

Another common piece of advice from UX practitioners is that we should base personas on user research as their primary input. Sources such as Web analytics, call center logs, or stories from front-line staff are interesting, but are not necessarily rich enough sources of information.

Todd Zaki Warfel, Principal Design Researcher at Messagefirst, encourages designers not to start out with a predefined number of personas in mind. Instead, we should let the data tell its own story, and our analysis should determine the proper number of personas. Todd offers another piece of advice, which has helped personas fulfill an important role in his design process:

“We always use a real person—someone we know personally—as the example user for each persona. It’ll be a friend or a friend of a friend, but it’s someone we can call and ask questions. That detail really helps make each persona more real and approachable to everyone on the team.”—Todd Zaki Warfel

Communicating Your Research

Personas are a popular, commonly used technique for communicating the insights we’ve gained from our research activities. But there are two alternative techniques worth looking at briefly: mental models and experience lifecycles.

Indi Young’s recent book, Mental Models: Aligning Design Strategy with Human Behavior, provides a very good introduction to the research, analysis, and communication of mental models. They provide an excellent way of understanding how users approach the context for which we’re designing a product. Figure 1 shows an example of a mental model from the book.

Figure 1—A mental model—from Indi Young’s Mental Models, published by Rosenfeld Media

Experience lifecycle is a generic term that represents the start-to-finish series of interactions a customer has with an organization. For example, LEGO uses an experience wheel like that shown on Customer Experience Matters, which depicts the end-to-end experience of a frequent flyer traveling to New York from London.


We can aggregate and synthesize user research—of many shapes and sizes—to form audience segmentations that encapsulate sets of characteristics, needs, and behaviors. Then, analyzing the same research, we can produce personas, mental models, experience lifecycles, or other user-modeling documentation to inform and enrich the design of our products and services and, ultimately, help us deliver better designs. 

I’d like to thank those in the UX community who have shared their insights on this topic with me—in particular, Donna Spencer, Kaleem Khan, Tyesha Snow, Lembit Kivisik, Ruth Ellison, and Todd Zaki Warfel.

Co-founder & Principal at Meld Studios

Sydney, New South Wales, Australia

Steve Baty

Focusing on the business side of the user experience equation, Steve has over 14 years of experience as a UX design and strategy practitioner, working on Web sites and Web applications. Leading teams of user experience designers, information architects, interaction designers, and usability specialists, Steve integrates user and business imperatives into balanced user experience strategies for corporate, not-for-profit, and government clients. He holds Master’s degrees in Electronic Commerce and Business Administration from Australia’s Macquarie University (MGSM), and a strong focus on defining and meeting business objectives carries through all his work. He also holds a Bachelor of Science in Applied Statistics, which provides a strong analytical foundation that he further developed through his studies in archaeology. Steve is VP of the Interaction Design Association (IxDA), a member of the IA Institute and UPA, founder of the UX Book Club initiative, Co-Chair of UX Australia, and an editor and contributor for Johnny Holland.
