Five Best Practices for Becoming a Data-Driven Design Organization, Part 2
Published: March 21, 2016
In Part 1 of this three-part series, I shared two best practices for developing a data-driven design organization. I described how initiating a customer journey–mapping process and having ongoing discussions about the customer journey let our organization build an image of the future. I also discussed moderated usability testing of early designs, which let us focus on the short term.
My instincts told me that I had to demonstrate value immediately, but also needed to lay the foundation for the long term. Building credibility required us to be responsive to current goals and requests, while at the same time projecting where we could discover deeper insights in the service of long-term business aspirations. In other words, I needed to be a kind of seer and make educated assumptions about what kinds of questions would surface answers or information that would be valuable to the business. I had to form questions that the business didn’t yet know to ask.
Now, in Part 2, I’ll offer two additional best practices that balance long-term and short-term benefits. I’ll describe two more research activities that provided data to support our design decisions, strengthened the organization’s shared, customer-centric mental model, and established the credibility of our team as experience design professionals.
Best Practice #3: Competitive Benchmarking
Competitive benchmarking lets you evaluate how a product or service stacks up against its current competitors. Depending on the goals for the benchmarking study, the companies you compare can be in either the same or a similar industry. In our case, we focused on current best practices for SaaS / ecommerce sites. Our purpose was twofold:
- To educate the organization on what elements are critical to a successful, competitive ecommerce business—We created a glossary that documented what became a shared vocabulary for the organization.
- To review the experience design and content elements in the current implementation and evaluate how they stacked up against those of other SaaS / ecommerce businesses.
The Challenge or Opportunity
Gathering information from customers let us visualize the customer journey and helped our organization build a shared framework for discussing the current customer experience. Plus, assessing where our offerings stacked up against those of competitors gave us valuable insights that helped us to build our organizational roadmap.
When assessing your competitors, include both businesses that are in your market and others that might be relevant because they are more forward thinking or have a similar service or delivery model.
I initiated and facilitated a best-practice competitive benchmarking analysis. My goal was to create a coherent picture of the current market for customer experience design (CXD) for SaaS delivery of products. Product owners collaboratively identified specific areas of inquiry, with recommendations from our CXD team. We identified similar industries and SaaS businesses to analyze in specific CXD categories. This provided a current baseline of positives and negatives—what to continue doing and what to change or stop doing. As I mentioned earlier, the outcome provided input to our organization’s roadmaps for business priorities.
We engaged an external, expert CX researcher to facilitate this effort. This helped build confidence within the organization that the analysis was unbiased. He took the CXD elements—including navigation, information hierarchy, look and feel, and content—and conducted a heuristic evaluation.
It was important to make sure that the companies we chose as competitive benchmarks were targeting similar customers such as small-to-medium businesses (SMBs) and the low end of the mid-market. Our aim was to evaluate their end-to-end experience as well as possible, including discovery, education, plan comparisons, signup, email messaging, and account management. Elements that we evaluated included calls to action (CTAs), lead channels, messaging, and imagery. We also reviewed the value proposition and mobile experience. We chose these elements collectively, as a gestalt, because they were most significant in contributing to customer conversions.
Once the evaluator was confident that he had identified industry best practices, he conducted an in-depth analysis, explaining how and why each targeted experience did or did not achieve them. The evaluator then proposed recommendations based on his analysis and presented the final findings to the business stakeholders. Halfway through the evaluation, we collected feedback from customers to validate the evaluation.
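A heuristic evaluation like this is often summarized as a weighted scoring matrix: each CXD element gets a rating per site, and weights reflect each element’s assumed contribution to conversions. The sketch below is purely illustrative; the element names, weights, and scores are hypothetical, not figures from our actual study.

```python
# Hypothetical sketch of aggregating heuristic-evaluation ratings into one
# weighted benchmark score per site. All names and numbers are illustrative.

# Assumed weights: each element's relative contribution to conversions.
WEIGHTS = {"navigation": 0.25, "hierarchy": 0.20, "look_and_feel": 0.25, "content": 0.30}

# Evaluator ratings on a 1-5 scale for each CXD element.
scores = {
    "competitor_a": {"navigation": 4, "hierarchy": 3, "look_and_feel": 5, "content": 4},
    "competitor_b": {"navigation": 3, "hierarchy": 4, "look_and_feel": 3, "content": 5},
    "our_site":     {"navigation": 2, "hierarchy": 3, "look_and_feel": 4, "content": 3},
}

def weighted_score(ratings: dict) -> float:
    """Combine per-element ratings into a single weighted benchmark score."""
    return round(sum(WEIGHTS[element] * rating for element, rating in ratings.items()), 2)

# Rank sites from strongest to weakest overall experience.
for site, ratings in sorted(scores.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{site}: {weighted_score(ratings)}")
```

A matrix like this makes the gaps concrete for stakeholders: it shows not just that a competitor’s experience is stronger, but on which elements and by how much, which feeds directly into roadmap prioritization.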
The Results and Business Impact
Key outcomes of this effort were that we were able to gather a wealth of insights—from an end-to-end perspective—generate a roadmap of projects, and create a CXD strategy. We decided to refresh our approach to copy—for both the Web site and email messages—rework the site’s navigation, and improve our ecommerce experience.
The site is also a lead-generation sales channel, and the evaluation helped us to understand how to educate people about the value of the product, as well as what sales CTAs would help increase conversion success. We identified potential improvements to our email messaging and account management and refined and enhanced the quality of the site’s visual design—in ways that are specific to a direct response, ecommerce marketing site.
It was super exciting to be able to leverage these findings in our site redesign. We addressed our messaging and content strategy to better support customer needs, speak from the customers’ point of view, and improve search-engine optimization (SEO). We up-leveled our CXD strategy, using the research findings to redesign workflows, user interactions, and visual identity, improving engagement, usability, and the quality of the customer experience. Plus, business owners reviewed the results that were relevant to their responsibilities and used them to identify roadmap and project priorities that aligned with business goals. Examples of outcomes that we could express as metrics included a 12% reduction in bounce rates and almost a 3X increase in traffic as a result of the improvements to content-driven pages and SEO that I mentioned earlier.
The results of our best-practices benchmarking went far beyond the value of the data that we gathered—even though the data was very important and we leveraged it throughout our organization. Benchmarking was yet another milestone in embedding a CXD strategy into our organization and in defining the role the CX designer played in supporting our business goals. This had a direct impact on our ability to improve the methods of the design, content, business, and development teams. Through increased dialogue among these teams and the presentations my team gave, we were able to socialize the results of the research across CXD teams throughout the company. This effort increased our visibility as key champions of the user experience and built our brand and reputation as respected CXD professionals within the company.
Best Practice #4: Online, Unmoderated Usability Testing
Through our customer-journey, moderated-testing, and benchmarking best practices, we obtained a wealth of important information that we could leverage over several projects. But it was equally critical that we continue to use multiple approaches in gathering customer feedback, as appropriate to the project context. Unmoderated usability testing is a low-cost, yet valuable method of gathering feedback in a short timeframe, and it requires no live facilitation of tasks or questionnaires.
The Challenge or Opportunity
A key soft skill for CXD professionals is adaptability. We need to assess each project or situation contextually. What resources are available? What is the timeline? How can we get the best data to address stakeholder needs and business goals? Sometimes both the amount of information we need and how quickly we need it dictate the right research method to choose. At one point in our Web site–redesign project, I realized we needed customer feedback to help make informed decisions, but we didn’t have much time to obtain it.
There have been incredible advances in the online testing tools that are available. The online, unmoderated approach to usability testing let us automate the collection of qualitative and quantitative feedback on our Web sites and mobile apps. Participants can be in various locations and participate asynchronously from their own computer or device. We usually targeted 50 to 100 participants. Another clear advantage of online, unmoderated testing is that you can complete and report on the research quickly, usually within one week. Since you can record each participant’s responses, all stakeholders can review the data. Basing your discussions on this data reduces churn and facilitates teams’ making data-driven, customer-centric decisions.
The ultimate goal for this inquiry is to get to know your customers. Reaffirming the customer profile for your target audience is critical to building and maintaining confidence in the results of your research. Therefore, your test plan should include a survey that identifies the business characteristics of the target population. This test plan should include the following:
- profile of your target population, including the number of participants
- what you’re evaluating
- what metrics you’re gathering
Many participants can simultaneously participate in the study—from their own computer or device, in their own environment. All participants have the same goals and complete the same set of tasks as the software guides them through their test session. The software tracks and collects data in real time—including success ratio, time on task, clickstreams, and heatmaps—then generates a report on the data that you can share with the entire organization.
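To make concrete what a tool is doing behind the scenes, the rollup of raw session records into per-task metrics can be sketched as follows. This is a minimal, hypothetical example; the field names and data are illustrative and not taken from any specific testing platform.

```python
# Hypothetical sketch of rolling up unmoderated-test session records into
# per-task metrics (success ratio, mean time on task). Field names and
# records are illustrative, not from any particular tool.
from statistics import mean

# One record per participant per task: a completion flag and time in seconds.
sessions = [
    {"task": "find_pricing", "completed": True,  "seconds": 42},
    {"task": "find_pricing", "completed": True,  "seconds": 65},
    {"task": "find_pricing", "completed": False, "seconds": 120},
    {"task": "sign_up",      "completed": True,  "seconds": 90},
    {"task": "sign_up",      "completed": True,  "seconds": 75},
]

def task_metrics(records: list, task: str) -> dict:
    """Success ratio across all attempts and mean time for successful ones."""
    rows = [r for r in records if r["task"] == task]
    done = [r for r in rows if r["completed"]]
    return {
        "success_ratio": round(len(done) / len(rows), 2),
        "mean_seconds": round(mean(r["seconds"] for r in done), 1),
    }

print(task_metrics(sessions, "find_pricing"))  # {'success_ratio': 0.67, 'mean_seconds': 53.5}
```

Because every participant’s record persists, any stakeholder can recompute or drill into these numbers, which is what makes the resulting discussions data driven rather than opinion driven.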
The Results and Business Outcomes
By building on the test results and user profiles from our previous research, we were able to continue fleshing out the bigger picture of both the users and the Web site. The information we gathered from the research contributed to our redesign of the information architecture, the homepage, and the entire site. We were able to craft an experience design and content strategy for the site that aligned with the overall company’s content and brand strategy. This enabled stakeholders to quickly review the data, get aligned, and provide direction on next steps, including
- a new structure for the site’s primary navigation
- a new product-page structure
- removing unused pages
- repurposing other pages by applying current best practices
Building our CX research skills within the organization improved our ability to make design decisions and significantly improve the design of the Web site within a short time period. We are now a data-driven design organization and base our design direction and final decisions on participant feedback.
Over the last few months, business owners across the organization have proactively requested more CX research. The information it provides lets us deliver customer-centric copy and design solutions—both for demand generation and ecommerce. Thus, the results of our research have increased consistency across customer touchpoints throughout the organization. This research also continues to educate our business partners on the value of getting to know our customers—what is important to them and how we can solve their business problems—which, in turn, lets us target our marketing efforts more accurately to our customer population.
Building CX research expertise and demonstrating its value to the business is critical to developing a CXD strategy. The research activities I’ve described in this article not only provide customer-centric data, they also help to build a common vocabulary and mental model across the organization. The other necessary component of a CXD strategy addresses the internal processes of the organization.
Evangelizing CX research and building the organization’s perception of the value of CX research and design expertise is equally important. In the first two parts of this article, I’ve focused on customer-research methods that you can use to gather data that contributes to the success of both short-term and long-term customer experience projects. Over time, the data that you gather provides an archive of information that you can mine for insights.
However, these research methods are only part of the story. They define the what and how of your CXD strategy. Another critical part of creating an effective CXD strategy focuses on
- helping the members of a CXD team to learn why they should advocate for any particular research method
- demonstrating the transformation of the data from your research findings into information and insights, which contributes to teams’ making data-driven design decisions
The better educated and more articulate CXD team members are and the better they become at initiating customer-centric conversations, the greater their influence in building a customer-centric organization.
In Part 3, the final part of this series of articles, I’ll share how my team has enabled what I call right-sized processes and describe the impact they have had on our organization. I’ll also share my perspective on how to build and grow a team of expert, high-functioning CXD professionals.