During the COVID-19 pandemic, most UX researchers have shifted to conducting all of their UX research and usability testing remotely. Last month, in the first part of this two-part Ask UXmatters series, “Tips for Conducting Remote UX Research and Testing, Part 1,” our expert panel described several factors to consider in transitioning to remote UX research and offered some other helpful tips on conducting effective remote research and testing.
This month in Ask UXmatters, our expert panel provides some additional valuable tips for conducting effective remote research and testing. Our experts’ recommendations include recruiting participants who have the right equipment while considering how their technology might affect your findings; suggestions for remote moderation; ways to avoid technology pitfalls; and advice on getting your stakeholders up to speed on your new remote-research techniques. In this column, one of our experts also provides links to some helpful online resources that offer tips specifically about conducting remote research and testing during the pandemic.
In my monthly column Ask UXmatters, our panel of UX experts answers readers’ questions about a broad range of user experience matters. To get answers to your own questions about UX strategy, design, user research, or any other topic of interest to UX professionals in an upcoming edition of Ask UXmatters, please send your questions to: [email protected].
The following experts have contributed answers to this month’s edition of Ask UXmatters:
Michael Faletti—Senior UX Designer and Researcher at Saggezza
Caroline Jarrett—Owner and Director at Effortmark Limited; UXmatters columnist
Cory Lebson—Principal Consultant at Lebsontech; Past President, User Experience Professionals’ Association (UXPA); author of The UX Careers Handbook
Q: Do you have any tips for doing remote UX research or testing?—from a UXmatters reader
Cory offers several tips that support successful remote research and testing:
“Recruit participants who have the appropriate technological infrastructure to handle a remote session—particularly if they must share their screen with you. This requires decent technology and good bandwidth.
“While recruiting participants who have a microphone and a Webcam usually works fine, have a backup plan that includes using a dial-in number in case their technology doesn’t work.
“Assess whether your sample is at all skewed toward more technically savvy people because of your need to recruit participants who can do a remote session. It’s okay if you have to skew in this direction to do your research. However, you should keep this possible bias in mind when you’re analyzing your data and add the necessary caveats to your findings.
“If you must present tasks to your participants verbally and they are at all complex, consider using a shared Google document in which you can post tasks and activities in real time. Perhaps participants could open this document on a mobile device or second computer screen if they have one.
“Allow extra session time to get participants fully set up and connected—especially if you’re dealing with shared mobile screens, for which there often seem to be more screen-sharing hurdles than on desktop or notebook computers.”
Avoiding Technology Pitfalls
“I want to share just a couple of points to focus on when you’re running remote usability-testing sessions,” answers Michael. “A couple of key factors have helped our teams in initially setting up test sessions and ensuring a smooth process for participants. To prepare your team to encounter technology issues, begin with the mindset that the technology running your remote concept test, usability test, or interview is probably going to fail at some point. This way, you’ll come prepared and have backup plans in place for any given situation that arises. One way to help ensure smooth sessions for your participants is to run connectivity tests a few days before the actual sessions.
“Let’s consider some examples of technology-alignment failures that I’ve experienced during remote research sessions, which could involve both your technology and that of your participants. What if the application you’re using to communicate with the participant fails? What if the recording of a session (assuming the participant permits recording) does not capture the sound or video, or fails altogether? These are things you can control, so you should have a backup plan at the ready. WebEx does a great job of recording sessions and organizing its video files and delivers a transcript after the session as well. But I would still suggest having QuickTime running in the background to provide a backup recording.
“Firewalls can cause some technologies not to work together and ultimately derail a session with an otherwise willing participant. Before each remote session, run a connectivity test with the participant. This should take no longer than ten to fifteen minutes. Ask participants to conduct their connectivity test in the same place in which they’ll be participating in the actual session, using the same machine whenever possible. This can help you gauge the quality of their Internet connection, as well as identify any firewall issues. Ensure that you and the participant can hear each other, that the applications are syncing, and that anything else you might need from a technology perspective is working for a given session. In my experience, participants are happy to go through this process because it shows we care about their time, appreciate their expertise, and want to ensure a smooth session.
Communicating Changes in Your Research Techniques to Stakeholders
“When you’re preparing your research plan, take into account that you’ll probably have stakeholders from the business or product management who want to join your remote sessions,” continues Michael. “This can be a great step upward in your company’s overall UX maturity level. But how can you ensure that the sessions stay on track? The solution: always have a plan in place for educating stakeholders about how to behave when observing a research session. Let them know that interrupting a call with additional commentary could derail the session. Having people other than the participant and the moderator talking could potentially confuse the participant and throw off the flow of the conversation. Our UX team touches base with any stakeholders who might join the sessions to explain to them clearly what activities the sessions comprise, the role of the moderator, our overall research goals, and of course, what observers in a session should and should not do.”
Great Online Resources
Caroline recommends some online resources that provide really useful information:
“Although the latter post is a few months old and, for now, we’re less locked down than we were, I think it’s still relevant. The post covers topics that many of us must continue to think about—such as taking care of your own health and your colleagues’ health, as well as respecting the challenges that your participants might be facing.”
Dr. Janet M. Six helps companies design easier-to-use products within their financial, time, and technical constraints. For her research in information visualization, Janet was awarded the University of Texas at Dallas Jonsson School of Engineering Computer Science Dissertation of the Year Award. She was also awarded the prestigious IEEE Dallas Section 2003 Outstanding Young Engineer Award. Her work has appeared in the Journal of Graph Algorithms and Applications and the Kluwer International Series in Engineering and Computer Science. The proceedings of conferences on Graph Drawing, Information Visualization, and Algorithm Engineering and Experiments have also included the results of her research. Janet is the Managing Editor of UXmatters.