Someone recently asked me to provide recollections of my earliest experiences with usability testing. This took me back to around 1997, when, as part of a research project, I analyzed the use of a then-new Web-based library catalogue system, conducted user interviews, and redesigned the system according to the resulting findings. While this sounds straightforward now, with Google Analytics and today’s online survey tools, back then it necessitated writing raw HTML and Perl to capture data, and C code to parse and analyze log-file and survey data, then mocking up alternative designs in HTML. Today, 20 years on, our expectations of software have changed radically. Fortunately, so have the tools at our disposal for designing and testing software.
For me, being able to conduct usability testing remotely is one of the biggest developments of the last 20 years. Add the gig economy, fast networks, and screen recording, and we’ve set the stage for being able to get low-cost, high-volume feedback on our software, in a way that complements our ability to rapidly prototype and do iterative, agile development.
What Does WhatUsersDo Offer?
Recently, I needed to find a remote usability–testing service and came across WhatUsersDo. The service enables you to conduct testing on desktops, smartphones, or tablets.
A big driver for my using WhatUsersDo was its cost: the initial barrier to entry is very low relative to its competitors—one of whom wanted around $6,000 just to get started. Of course, pricing models for remote usability–testing providers differ. Fortunately, WhatUsersDo offers a free trial, which let me obtain some worthwhile data and evaluate the service before signing up. When I did sign up, the cost was on the order of $35 per test.
Working with Reviewers
There’s a lot of flexibility in selecting your target audience. Typically, you state the number of participants you require, then profile them by country, gender, age range, and socioeconomic group. The service also offers instant audiences: predefined groups of users with labels such as Digital Naturals and Online Shoppers, which you can further refine by country and gender. Over 30,000 reviewers are registered with WhatUsersDo, across the USA and Europe, and they’re curated so each of them participates in only a few tests per month, to keep them from becoming professional testers.
For my usability study, I needed to test a process that was quite detailed. So I tried a couple of different approaches to providing instructions, and found that reviewers were most successful when they could see and complete a series of individual tasks. However, since the system offers quite a lot of flexibility in how you can present instructions to reviewers, experimenting with different approaches is a worthwhile endeavor.
Conducting Test Sessions
You can either launch a usability-test session immediately or schedule it to begin at a particular time. It’s clear that, at least in part, the design thinking behind WhatUsersDo is to help embed usability testing into a company’s culture, and this is a laudable goal. In my experience, the length of time it takes to run usability tests varies, but ten test sessions typically complete within 24–48 hours. For my study, each test-session video was around 20 minutes long.
Reviewing Session Videos
If you’ve never reviewed the videos for remote usability–test sessions, here’s a taste of what to expect. Some of the reviewers were absolute stars, really going the extra mile to give detailed feedback on the user journey. A few reviewers—thankfully in the absolute minority—were less helpful and failed to follow the instructions.
However, the positive is that it’s possible to give feedback on reviewers and, if they fail to follow your instructions, you can redo the test with a different reviewer. It’s also possible to reject reviews for other reasons—such as poor audio quality—but fortunately, I’ve not had to do that. It’s worth noting that, given the detailed instructions I occasionally needed to provide, my rejection rate of reviewers was possibly higher than the average! When I’ve needed to redo sessions, the replacement reviewers have become available very quickly, and I’ve never had a problem with a replacement reviewer.
The biggest problem I’ve had when reviewing test-session videos has been my frustration with users. As a UX designer, it can feel heretical to say this, but sometimes users do dumb things. When you’ve spent several hours reviewing user interactions, listening to comments, and—most frustrating of all—listening to participants read out screen copy and instructions, you can get a little jaded, so it’s best to pace yourself. As you watch videos, you can timestamp the start and end of interesting interactions and make annotations.
Feedback on WhatUsersDo
It’s clear that the service is designed more for non-specialists than for UX professionals. While it is possible to export the raw data for a set of usability tests, this capability is somewhat hidden, in favor of a PDF report in a more traditional format. A recent change to the service is its prioritization of charts showing task-completion rates, time on task, and perceived ease of use over direct access to the raw data. While I appreciate the value this provides to a non-specialist, as someone who wants to get down and dirty with the data, I find it a small barrier. That said, I do approve of the design thinking behind it: prioritizing design decisions in favor of the non-specialist rather than the specialist!
WhatUsersDo has been pretty good about both requesting and listening to my feedback. Each customer has an account manager who responds to requests. Mine invited me to provide direct feedback via a Skype call, during which I raised several points, and they acted on them very shortly afterward. While this may have been a happy coincidence, being listened to still gave me a sense of satisfaction, so I’m one happy customer!
WhatUsersDo is not a perfect tool. It would be great to have the ability to pick a particular socioeconomic group rather than just the broad ABC1 or C2DE bands. I’d really like to be able to speed up the playback of videos, while still being able to listen to the audio. I think this would let me process the reviewers’ feedback much more quickly—listening just for verbal feedback and skipping the instructions. However, I’ve been very favorably impressed with WhatUsersDo. It provides a cost-effective, reliable approach to getting detailed user feedback.
Peter has been actively involved in Web design and development since 1993, working in the defense and telecommunications industries; designing a number of interactive, Web-based systems; and advising on usability. He has also worked in education, in both industry and academia, designing and delivering both classroom-based and online training. Peter is a Director at Edgerton Riley, which provides UX consultancy and research to technology firms. Peter has a PhD in software component reuse and a Bachelor’s degree in human factors, both from Loughborough University, in Leicestershire, UK. He has presented at international conferences and written about reuse, eLearning, and organizational design.