Asking Questions About Internet Behavior

By Caroline Jarrett

Published: February 7, 2011

“Steve Krug’s newest book … inspired me to think again about my whole approach to usability testing.”

Have you read Steve Krug’s newest book, Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems? I was honored when Steve asked me to read it in manuscript form, but—just between you and me—I didn’t expect to learn all that much, because I’ve been practicing and teaching usability testing for more than 15 years. Well, I was completely wrong: the book inspired me to think again about my whole approach to usability testing. A few examples of what made me think:

  • testing far more often—maybe monthly rather than twice a year
  • testing with just three participants rather than my usual five to ten
  • forgoing a written report in favor of a post-session debrief meeting

Having said all of that, there was one point in the manuscript that I just couldn’t agree with. It was where he was describing how I should run my first usability test. Steve told me to use an interview script—no problem with that. The script he recommended had a couple of background questions about the participant—okay, fine, I expected to do that. But what was this: “How many hours a week do you spend on the Internet?” No, no, no, no. Or putting that another way: No!

We Rarely Know How Long We Spend on Habitual Behaviors

“Most people … have no idea how much time they spend on … habitual behaviors.”

My objection to the question is that it’s asking something most people couldn’t answer accurately; at best, they could answer confidently but inaccurately, because they have no idea how much time they spend on such habitual behaviors. In general, we don’t know how long we spend on routine or habitual behaviors unless they follow a predefined schedule or we make an effort to log them when they occur. Some examples:

  • routine, scheduled behaviors—taking medications at the same times every day; going to an hour-long class twice a week
  • behaviors we log when they occur—keeping track of hours worked in a timesheet program; paying for taxis on a business trip

And let’s be realistic here: Some of us are conscientious and orderly and meticulously note the hours we’ve worked or the traveling expenses we’ve incurred just moments after we know what they are. But others of us must admit that we occasionally leave it for a bit—like, oh, a month or so. And then we’re never quite sure we’ve done it as accurately as we should.

What Is the Purpose of Steve’s Question?

“Show participants that you’re going to be listening to what they say.”—Steve Krug

So, let’s return to that question in Steve’s script. He explained to me—and wrote in the final book—that the question isn’t really there because he’s interested in the number of hours a participant spends on the Internet. It’s there to do three things:

  • “Get the participants comfortable talking.
  • “Show participants that you’re going to be listening to what they say.
  • “Get the information you need to grade on a curve.”

Not sure about what “grade on a curve” means? Steve wants to get a general sense of how Web-savvy and computer-savvy a participant might be, by listening to how they talk about their experience.

Well, that seemed fine to me, so I withdrew my objection.

What If We Really Need to Know About Internet Experience?

“Steve’s book is about very-small-sample, rapid-iteration usability tests, with no reports and minimal bureaucracy. … They’re great, but they’re not always what I can persuade a client to let me do or even appropriate for every project.”

This still left me with a nagging question, though. Steve’s book is about very-small-sample, rapid-iteration usability tests, with no reports and minimal bureaucracy. His book convinced me to try them, and they’re great, but they’re not always what I can persuade a client to let me do—or even appropriate for every project. For example, in a test I’m doing this week, I’m comparing two versions of a product, and we’ve decided we really need five participants per version.

So, what are we to do if we really need to get some sort of handle on Internet experience? Perhaps for comparison across usability test sessions or for measuring progress in some way?

It occurred to me that it would be good to think about my father, who would be the first to admit he’s not exactly a computer wizard. He never touched a computer keyboard until he was 70, but he has learned enough to write his academic articles and to buy things online from time to time. He relies on others for anything that’s even slightly complex. What question would correctly identify him as a non-expert?

Another Question About Internet Expertise

Whitney Quesenbery and I have struggled with how to ask about people’s Internet expertise during our work for the Open University. We’ve done a fair number of usability studies and other user research for them, and we’ve needed to keep track of participants’ approximate level of experience. Here is the question we have used:

“I’d like to ask you about your use of the Internet. Please choose one of these options:

  • It’s part of everyday life.
  • I use it when I need it—two or three times a week.
  • I’m an occasional or a new user.
  • I prefer to avoid the computer altogether or let someone else use it for me.”

Now, if you’re a purist questionnaire designer, you’re probably yelling at me: “But you’ve merged answer categories! Not a good idea!” Which is sort of true. But we found that, in practice, it’s what we did anyway during analysis. We also found that our participants easily understood these categories. And it was helpful for us to have fewer categories, because we always asked the question face to face rather than giving participants a written questionnaire.

But here’s the bad news: I called Dad and asked him the question. He was firm: “It’s part of everyday life.” Oops, that question wouldn’t work to identify him as a non-expert.

An Older Suggestion That Does Not Work Well

Time to put the books on my shelf to good use. For a historical perspective, I went back to the golden oldie of usability-testing books: A Practical Guide to Usability Testing, by Joe Dumas and Ginny Redish. In their generous way, both authors would now suggest reading more recent books, but there is often a worthwhile perspective in their work to think about. That book predates the Internet, but their questions were:

  • “How long have you been using personal computers?
  • “How often do you use a personal computer?”

I’m sure those questions worked perfectly in the early 1990s, and I suspect they are the indirect source of the question I didn’t like in Steve Krug’s script. But times have changed. My father has been using computers for 10 years now, and he’s been known to spend hours hunting around on the Internet for scholarly materials. Does that make him an expert user? No. An expert in his field, unquestionably. Confident on the Web sites that are relevant to him, sure. Expert with computers, definitely not.

Is a Usability Test the Right Time to Ask This Question?

“If you know participants’ expertise is important to you, it may be a bit too late to find out about it when they’re actually in the room with you, and you’re about to start testing.”

So what do the current textbooks tell us? I checked my two favorites, Carol Barnum’s new book Usability Testing Essentials: Ready, Set Test! and Dana Chisnell and Jeffrey Rubin’s classic Handbook of Usability Testing.

At first, I was a bit surprised to find that neither of them discusses asking about Internet or computer use at the start of a test session—either in the pre-test questionnaire or in the interview script itself. So I dug a bit deeper, and it turns out both of them deal with this sort of topic at the recruiting stage. That makes a lot of sense: if you know participants’ expertise is important to you, it may be a bit too late to find out about it when they’re actually in the room with you, and you’re about to start testing.

Some Other Ways of Tackling This Problem

Amanda Nance at Sage faced the challenge of identifying beginners versus advanced computer users. The team brainstormed and settled on these questions:

  • “Do you install new software yourself, or do you have someone else do it for you?
  • Do you use keyboard shortcuts?”

And I can confirm that these questions would work with my dad. He definitely doesn’t install new software, and he doesn’t know what a keyboard shortcut is.

Danielle Gilbert Cooley, UX Director at 4ORCE Digital, suggests asking about domain-specific expertise instead. She uses this example: “If you’re testing something in the financial-services domain, ask about trading stocks or paying bills online rather than just checking balances.”

If you’re really asking such questions just to get participants used to the idea of thinking aloud and chatting with you, what about using a completely different question? I found the next two questions in my favorite reference on questionnaire design: Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, by Don Dillman, Jolene Smyth, and Leah Christian. When they want to get a participant started thinking aloud, they use these two questions:

  • “How many residences have you lived in since you were born?
  • How many windows are there in your home?”

Summary

“Do you need to ask this kind of question at all? What about simply observing participants during usability testing and making your own judgment about their expertise?”

Have another look at your introductory test script. Challenge yourself:

  • Is this the right place to ask about Internet expertise? It might be better to ask this question at an earlier stage.
  • What am I achieving by asking this question? If it’s just getting participants talking, it might be better to ask about something else.
  • Do I need to know about Internet expertise in general or something more specific? It might be better to ask about something a participant does that is relevant to the type of testing you’re doing.

And finally, do you need to ask this kind of question at all? What about simply observing participants during usability testing and making your own judgment about their expertise?

Acknowledgment—Thanks to Whitney Quesenbery for suggesting the topic for this column.

References

Barnum, Carol M. Usability Testing Essentials: Ready, Set…Test! Burlington, MA: Morgan Kaufmann, 2010.

Dillman, Don A., Jolene D. Smyth, and Leah Melani Christian. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. Hoboken, NJ: Wiley, 2009.

Dumas, Joseph S., and Janice C. Redish. A Practical Guide to Usability Testing. Revised ed. Exeter, UK: Intellect Books, 1994.

Krug, Steve. Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems. Berkeley, CA: New Riders, 2010.

Rubin, Jeff, and Dana Chisnell. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. 2nd ed. Indianapolis, IN: Wiley, 2008.

2 Comments

Thanks for the good suggestions on how to approach this.

I’m used to asking a version of this question as well. I agree with your last point—it almost always becomes apparent what the participant’s level of Internet expertise is by the end of the session.

Sometimes you’ll get people who say they’re very comfortable with the Internet or use the computer all the time, and absolutely flounder at basic things like managing tabbed windows in certain browsers or using the Back button. Or you’ll find someone modest who claims moderate proficiency, but wings through the session and ends up teaching you new keyboard shortcuts and the like.

In general, I think there’s a bit of a stigma with suggesting that you’re not tech savvy, and people are a little self-conscious about the topic. I guess, overall, it just sort of highlights the inaccuracy of self-reported measures.

Thanks for the wonderful suggestions.

I’m presently conducting a survey to analyze the book-buying behavior in tier-2 and tier-3 cities for an online retailer. It would be really helpful if you could give me a few tips on the ways to approach the problem.
