Using Preview Releases to Gain Insights on Technical Products

Enterprise UX

Designing experiences for people at work

A column by Jonathan Walter and Katie Groh
August 22, 2022

Have you ever been in a situation where you’ve found it difficult to engage with certain users or the type of feedback you’re receiving isn’t at the level you require? You’re not alone! At Rockwell Automation, we have a very technical group of users—engineering types—from whom we need to gather feedback. These users typically know a great deal about our systems and have very specific feedback for us about our products and the new concepts we’re developing. As one of our customers put it, “I know the product inside and out—most of the time better than you do at Rockwell.”

As you can imagine, the occasional usability study or survey wouldn’t cut it for such technically advanced users, who frequently work with our highly technical, feature-dense, sovereign-posture applications. This compelled one of our company’s leaders to state, “We need more at bats.” So how can we get more regular access to these users and cycle their feedback into the development of a large-scale product? We leverage a Preview Release program. In this column, we’ll walk you through what a Preview Release is and how you could use one—which might not be as difficult as you think.


What Is a Preview Release?

A Preview Release is a program that enables teams to consistently engage customers for feedback on the current software build, as shown in Figure 1. The benefit of a Preview Release is that it lets teams engage with customers early and often, well before a product is ready for production.

Figure 1—Definition of a Preview Release program

A Preview Release draws on several usability and customer-satisfaction methods, which together create a program that successfully engages participants and gathers their feedback to help inform the design of the product going forward. We use a combination of task-based studies, surveys, hands-on labs, voice-of-the-customer (VoC) sessions, and user interviews to ensure that we understand our users.

A Preview Release differs from other mixed-method approaches, as well as from beta testing, because we let our users have early access to the system, within a controlled environment that we monitor carefully. Plus, we gather feedback at a specific cadence.

How Should You Start a Preview Release Process?

A Preview Release process comprises several steps, so it’s important to start with a core group of people who can focus on completing them. Including too many colleagues up front often makes ownership ambiguous, so responsibilities and functions within the group become difficult to assign. We solved this problem by working with the program’s leadership to show the key areas in which people were contributing. We then laid out a business case demonstrating that having someone focus on this program as their full-time job would make communication, customer engagement, and the overall process much smoother and more efficient. Thus, we now have a dedicated product manager and certain UX researchers who create all the documentation and lead the program’s sessions.

Once we identified that core group and its members accepted their roles in the program, we worked with our customer-account and sales teams to determine our customer set. Because our users are very technical, and thus harder to reach, we can’t use a typical research panel; such panels can’t provide users who have the depth of understanding of our products that our research requires. Our partnership with the account and sales teams has been successful because we armed them with specific personas that represent the users and the customers we need. Using these personas, our partners can identify the appropriate individuals within the organizations that they support.

Once we’ve identified those customers and users, we must sign their organizations’ nondisclosure agreements (NDAs) and have them sign ours. Because confidential information flows in both directions, this is important to all parties. We want to ensure that we’re keeping their data confidential and vice versa.

Figure 2 shows the initial steps of the Preview Release process, which involve creating a core group, identifying customers, and having them sign NDAs.

Figure 2—The initial steps of the Preview Release process

The Preview Release Process

You’ve gotten buy-in and have created a dedicated Preview Release Team. Check. You’ve identified your customers with the support of your sales and account teams. Check. You’ve obtained signed NDAs from these customers, and they’ve signed yours. Check. Now, what does this process involve?

Over time, our Preview Release process has matured, and we’ve made some tweaks along the way. At Rockwell, our Preview Release process comprises the following steps, as shown in Figure 3.

  1. Working with the product team
  2. Creating the materials
  3. Reviewing the materials with stakeholders
  4. Scheduling participants
  5. Facilitating sessions and collecting survey feedback
  6. Analyzing the results in Dovetail
  7. Presenting the results
  8. Entering the data into Jira
Figure 3—The steps of our Preview Release process

Step 1: Working with the Product Team

We begin each Preview Release at the end of a Program Increment (PI), a timeboxed development cycle used in large-scale agile software development. As a consequence, we conduct a Preview Release roughly every three months. Working with our product teams, which comprise product managers, UX designers, and developers, we kick off a Preview Release process by endeavoring to understand the key learning objectives from the previous PI. These are usually the top areas of focus for development, and we want to better understand users’ opinions of them so we can evaluate the usability of the product’s features.

Step 2: Creating the Materials

Once we’ve identified the learning objectives, we create our materials for participants. These include a hands-on lab document, a task-based study script, a survey, and if necessary, any interview scripts or focus-group documentation. The hands-on lab document describes, step by step, what the features are and how to use them and provides questions that the researchers want the participants to think about while using the new features. Researchers use the task-based study script during their usability study, following the five to seven key tasks that participants need to complete. It also provides a convenient medium for listing follow-up questions or topics to probe more deeply. The survey includes questions about the product’s value and ease of use and the user’s productivity and leaves plenty of room for comments that participants might want to share.

Step 3: Reviewing the Materials with Stakeholders

Once we’ve created the materials, we work with the product team to make sure that the questions we are asking can provide the answers they need regarding our key learning objectives. At this point, we typically conduct a dry run of our task-based study with an internal participant to make sure that the research runs smoothly when we are working with our customers.

Step 4: Scheduling Participants

Next, we schedule time with each of the participants. Because participants are getting access to our build, which is not generally available, we must make sure that the environment is set up properly so they can use the build during a specific timeframe. We typically use Calendly and allow participants to schedule a four-hour time block during a two-to-three-week timespan. This gives participants plenty of time to go through the hands-on lab, answer the survey questions, and ask us any questions via email, as necessary.
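As a rough illustration of this scheduling window, the sketch below generates candidate four-hour weekday blocks over a two-week span. This is our own simplification for illustration only, not Calendly’s actual scheduling model; the start date, morning/afternoon start hours, and weekday filter are all assumptions.

```python
from datetime import date, datetime, time, timedelta

def candidate_blocks(start: date, weeks: int = 2, block_hours: int = 4):
    """Generate four-hour weekday time blocks, one morning and one
    afternoon per day, over the given span. A simplified stand-in for
    what a scheduling tool such as Calendly manages for you."""
    blocks = []
    for day_offset in range(weeks * 7):
        day = start + timedelta(days=day_offset)
        if day.weekday() >= 5:  # skip Saturday and Sunday
            continue
        for start_hour in (8, 13):  # 8:00-12:00 and 13:00-17:00 blocks
            begin = datetime.combine(day, time(start_hour))
            blocks.append((begin, begin + timedelta(hours=block_hours)))
    return blocks

slots = candidate_blocks(date(2022, 8, 1))  # August 1, 2022 is a Monday
print(len(slots))  # 10 weekdays x 2 blocks = 20
```

In practice, the scheduling tool handles conflicts and participant choices; the point is simply that a bounded window of fixed-length blocks gives every participant the same generous timeframe.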

Step 5: Facilitating Sessions and Collecting Survey Feedback

Once we’ve scheduled all the sessions, we select about ten participants for each timeslot, then walk them through the task-based study for the first hour of their scheduled session, before giving them the rest of the documentation and the hands-on lab. This lets us gather some information from participants without the step-by-step guide biasing them regarding how it should work.

Step 6: Analyzing the Results in Dovetail

Once all participants have completed their surveys, and we’ve completed our task-based studies, we analyze the results. Analysis typically takes a week or two because of the volume of responses we get. We use Dovetail to collect all our recordings, transcripts, notes, and other information. In Dovetail, we can tag our insights, create reports, and share our findings among our teams.
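Dovetail does the heavy lifting of tagging and reporting for us, but the core aggregation is easy to sketch. The example below uses entirely hypothetical response data, question names, and tag labels; it rolls Likert-style survey ratings up into a mean score per question and counts how often each insight tag appears across sessions.

```python
from collections import Counter
from statistics import mean

# Hypothetical responses: each maps survey questions to 1-5 ratings
# and lists the insight tags a researcher applied to that session.
responses = [
    {"scores": {"value": 5, "ease_of_use": 3, "productivity": 4},
     "tags": ["navigation", "performance"]},
    {"scores": {"value": 4, "ease_of_use": 2, "productivity": 4},
     "tags": ["navigation"]},
    {"scores": {"value": 5, "ease_of_use": 4, "productivity": 5},
     "tags": ["performance", "terminology"]},
]

def mean_scores(responses):
    """Average each survey question's ratings across participants."""
    questions = responses[0]["scores"]
    return {q: round(mean(r["scores"][q] for r in responses), 2)
            for q in questions}

def tag_counts(responses):
    """Count how often each insight tag was applied."""
    return Counter(tag for r in responses for tag in r["tags"])

print(mean_scores(responses))
print(tag_counts(responses).most_common())
```

Even this toy version shows why analysis takes a week or two at real volume: every rating, quotation, and tag has to be attached to the right session before the rollup means anything.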

Step 7: Presenting the Results

Because the findings from our Preview Release program are so important to our teams, we typically plan a few different presentations for sharing these results. For teams who are directly responsible for the features we’ve tested, we conduct an in-depth walkthrough of the results in Dovetail. This includes our showing videos of the participants’ tasks, the overall scoring, quotations from participants, and more. We also present our high-level findings to outside stakeholders to help them understand what customers are saying and what things we’re focusing on within the product. Finally, because leadership is also very interested in the results from these Preview Releases, we typically schedule a high-level readout with them as well, presenting the overall scoring, the areas that are working well, and those that need improvement.

Step 8: Entering Data into Jira

Once all of these groups understand the results, we work with the product team to prioritize the findings. Then we enter the prioritized findings into Jira as tickets in the backlog. This enables each team to work on the priorities in the proper order, beginning with the fixes and improvements that would help make the next Preview Release and Program Increment a success.
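The prioritization itself is a judgment call made with the product team, but the ordering idea can be sketched simply. The severity and frequency fields and the finding summaries below are hypothetical, not our actual rubric: findings that block many participants rise to the top of the backlog, and each would then become a Jira ticket.

```python
# Hypothetical findings from a Preview Release; severity is 1 (minor)
# to 3 (blocking), frequency is how many participants hit the issue.
findings = [
    {"summary": "Export dialog mislabels file formats", "severity": 1, "frequency": 2},
    {"summary": "Tag browser freezes on large projects", "severity": 3, "frequency": 7},
    {"summary": "Undo shortcut conflicts with palette", "severity": 2, "frequency": 4},
]

def backlog_order(findings):
    """Sort findings so the highest-impact ones are fixed first;
    each would then be entered into Jira as a backlog ticket."""
    return sorted(findings,
                  key=lambda f: (f["severity"], f["frequency"]),
                  reverse=True)

for rank, f in enumerate(backlog_order(findings), start=1):
    print(f"{rank}. [S{f['severity']}] {f['summary']}")
```

A deterministic ordering like this is only a starting point; the product team still weighs effort and roadmap fit before the tickets are sequenced for the next Program Increment.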

Final Thoughts

Our Preview Release program not only fosters great customer engagement and enables us to improve our products, but also positively impacts our product teams. Product teams find the information that we gather to be extremely valuable, and they are starting to better understand the need for UX research even earlier in the product-development lifecycle. We are also seeing more requests for our regular research programs—a good problem to have! All of this adds up to creating better products, so our users and customers ultimately benefit.

We hope we’ve provided some inspiration that could help you address challenges you may have encountered in your research or design activities. Keep in mind that, if certain more conventional methods aren’t working out for you, you could also create your own approaches. As UX researchers, the most important part of our job is understanding our users. If there are better ways to engage with them outside our typical usability studies and surveys, it’s important to try them out! 

Note—Katie Groh presented on this topic at UXRConf 2022. Check out the presentation video for a more in-depth look at how this method has benefited teams at Rockwell!

Director of User Experience at Rockwell Automation

Cleveland, Ohio, USA

Jonathan Walter

Jon has a degree in Visual Communication Design from the University of Dayton, as well as experience in Web development, interaction design, user interface design, user research, and copywriting. He spent eight years at Progressive Insurance, where his design and development skills helped shape the #1 insurance Web site in the country. Jon’s passion for user experience fueled his desire to make it his full-time profession. He joined Rockwell Automation in 2013, where he designs software products for some of the most challenging environments in the world. Jon became User Experience Team Lead at Rockwell in 2020, balancing design work with managing a cross-functional team of UX professionals, then became a full-time User Experience Manager in 2021. In 2022, Jon was promoted to Director of User Experience at Rockwell.

User Experience Research Practice Lead at Rockwell Automation

Milwaukee, Wisconsin, USA

Katie Groh

Katie leads the software UX Research practice at Rockwell Automation, ensuring that UX research is a key driver in the development of Rockwell’s industrial-automation software. Previously, she was a UX Coordinator at Epic Systems, where she focused on tailoring a Usability Toolkit for their electronic health-record applications. Katie has a Master’s degree in Industrial Engineering (Human Factors and Health Systems) from the University of Wisconsin–Madison. She has a passion for user experience and always looks forward to exchanging new ideas with some of the best UX professionals.
