Tips for Conducting Remote UX Research and Testing, Part 1

Ask UXmatters

Get expert answers

A column by Janet M. Six
October 19, 2020

This month in Ask UXmatters, our expert panel provides some helpful tips about conducting effective remote UX research and usability testing.

As COVID-19 has forced UX researchers to conduct research and testing remotely, both participants and the teams conducting remote research and testing have had to learn to deal with new testing tools and conditions. There are multiple factors to consider in transitioning to remote UX research, including the following:

  • making remote research and testing possible
  • making remote research and testing effective
  • understanding the bias that remote research and testing can introduce because remote participants are likely to be more technically sophisticated than average users

Every month in my column Ask UXmatters, our panel of UX experts answers readers’ questions about a broad range of user experience matters. To get answers to your own questions about UX strategy, design, user research, or any other topic of interest to UX professionals in an upcoming edition of Ask UXmatters, please send your questions to: [email protected].

The following experts have contributed answers to this month’s edition of Ask UXmatters:

  • Carol Barnum—Director of User Research and Founding Partner at UX Firm; author of Usability Testing Essentials: Ready, Set … Test!
  • Pabini Gabriel-Petit—Principal Consultant at Strategic UX; Publisher, Editor in Chief, and columnist at UXmatters; Founding Director of Interaction Design Association (IxDA)
  • Gavin Lew—Managing Partner at Bold Insight

Q: Do you have any tips for doing remote UX research or testing?—from a UXmatters reader

“With everyone doing remote UX research in these days of COVID-19, you might think there would be no need for tips on how to do it well,” remarks Carol. “But doing remote research and doing it well are two different things. Here are some tips for doing remote, moderated usability testing well.

  • Establish a welcoming atmosphere. This is even more important for remote testing than for in-person testing. Because of the remote context, you need to reach across the distance to make participants feel welcome, show your appreciation for their taking the time to help you understand their experience, and demonstrate that you are interested in learning from them. If a participant’s Webcam is turned off, ask whether they would be okay with turning it on. If not, let it go.
  • Confirm that participants are prepared for their session. Have they downloaded the software for the session? Are they okay with sharing their screen? Do they know how to do that? Have they read the materials you sent them in advance, including the form granting you permission to record the session? Reconfirm their permission to record, and start the recording once they say it’s okay. If, for any reason, participants say it’s not okay to record their session, ask whether you can record just the audio. If they agree, they should turn off their camera.
  • Be prepared for distractions. For example, participants’ mobile phone might ring or notify them that they have received a text or an alert; their computer could display email alerts or notifications while they are sharing their screen; someone could make a delivery at their front door; or the presence of or interruptions by children or pets could disrupt their session. If possible, ask participants whether they are willing to turn off notifications on their computer and mobile phone, silence incoming calls, and prevent any other interruptions. Plus, if you are sharing your screen, be sure to turn off your own email and other notifications and switch your mobile phone to silent.
  • Be prepared for technical problems. Have a plan for addressing them. Sometimes, participants might lose their connection to the meeting. In advance, tell them that, if there are any technical difficulties, they should log out of the meeting, then log back in. Or you could tell them that, if either of you experience technical difficulties, you will send them a text or email message, informing them of what to do. If there are Wi-Fi or bandwidth issues causing difficulty for participants, they can turn off their Webcam to see whether that helps.
  • Inform participants of the presence of any observers in a session. If you plan to allow observers to ask questions at the end of a session, let the participant know that this may happen. Also, let observers know what you expect of them. For instance, are you muting everyone who observes or asking all observers to mute themselves? If you want observers to make suggestions or give you questions to ask participants during the task part of the session, tell them how they should communicate with you. Should they use the chat feature of the online-meeting software to send you a private message? Or do you want them to text you their questions during the session? If you’re allowing them to ask questions directly of the participant at the end of the session, how should they indicate that they have a question? Will you watch the list of observers to see whether they unmute themselves or instruct observers to raise their hand if they have a question?
  • Conclude your sessions by thanking participants again for their time. Also thank them for the valuable insights you and your team have gained by observing them. Remind them how they’ll receive their stipend. If you’re using a recruiting company that will handle the stipends, let participants know that they’ll be hearing about their stipend from that company. If you’ll be handling the payment of stipends yourself, Amazon gift cards are an easy way of arranging payment. Let participants know they’ll receive an email message from Amazon with a link to their gift card.
  • Ask participants to click the Leave button to exit the session. If you plan to stay in the meeting to debrief your project team or get ready for the next session, let participants know that you plan to stay in the room after they leave. Then you can either stop the recording or, if you want to capture your team’s conversation during the debrief, keep the recording going.”

Understanding Technology’s Impacts on Remote UX Research and Testing

“While using the participant panels that are associated with some remote-testing tools is both efficient and cheap, all too often, such panelists tend to be technology savvy,” answers Gavin. “This concern is particularly relevant for foundational research and formative usability testing. When you’re employing these methods, consider using more traditional recruiting methods—such as those you would use for in-person, lab research.

“Schedule pre-session technical checks with participants: confirm that they’ve downloaded the remote-testing software, then test their broadband speed and their audio and video quality. While you’ll typically focus on the video and screen-sharing quality, poor audio is often more frustrating to observers on your product team. These technical checks could take between 15 and 30 minutes, depending on whether you encounter any problems, but they’re worth doing to ensure that your sessions go well.”

Remote UX Research Resources on UXmatters

“COVID-19 has many in the UX community thinking along the same lines,” responds Pabini. “Since the pandemic began, we’ve published several articles that address conducting remote UX research.

“For his great column Practical Usability, Jim Ross has already contributed several editions about remote UX research, most of them during this first year of the pandemic, including the following:

“And, of course, we’ve published other excellent, pre-COVID-19 columns and articles about remote UX research, such as the following:

“Finally, to learn how to ensure participants’ safety when returning to conducting face-to-face UX research, read Jason Stockwell’s article ‘When and How to Resume Face-to-Face Research After COVID-19.’

“I hope these references help you to make the most of your UX research efforts during the pandemic and in the after times.”

Product Manager at Tom Sawyer Software

Dallas/Fort Worth, Texas, USA

Dr. Janet M. Six helps companies design easier-to-use products within their financial, time, and technical constraints. For her research in information visualization, Janet was awarded the University of Texas at Dallas Jonsson School of Engineering Computer Science Dissertation of the Year Award. She was also awarded the prestigious IEEE Dallas Section 2003 Outstanding Young Engineer Award. Her work has appeared in the Journal of Graph Algorithms and Applications and the Kluwer International Series in Engineering and Computer Science. The proceedings of conferences on Graph Drawing, Information Visualization, and Algorithm Engineering and Experiments have also included the results of her research. Janet is the Managing Editor of UXmatters.
