Now, in Part 2, I’ll offer two additional best practices that balance long-term and short-term benefits. I’ll describe two more research activities that provided data to support our design decisions, increased the organization’s shared, customer-centric mental model, and established the credibility of our team as experience design professionals.
Best Practice #3: Competitive Benchmarking
Competitive benchmarking lets you evaluate how a product or service stacks up against its current competitors. Depending on the goals for the benchmarking study, the companies you compare can be in either the same or a similar industry. In our case, we focused on current best practices for SaaS / ecommerce sites. Our purpose was twofold:
- To educate the organization on what elements are critical to a successful, competitive ecommerce business—We created a glossary that documented what became a shared vocabulary for the organization.
- To review the experience design and content elements in the current implementation and evaluate how they stacked up against those of other SaaS / ecommerce businesses.
The Challenge or Opportunity
Gathering information from customers let us visualize the customer journey and helped our organization build a shared framework for discussing the current customer experience. Plus, assessing where our offerings stacked up against those of competitors gave us valuable insights that helped us to build our organizational roadmap.
When assessing your competitors, include both businesses that are in your market and others that might be relevant because they are more forward thinking or have a similar service or delivery model.
I initiated and facilitated a best-practice competitive benchmarking analysis. My goal was to create a coherent picture of the current market for customer experience design (CXD) for SaaS delivery of products. Product owners collaboratively identified specific areas of inquiry, with recommendations from our CXD team. We identified similar industries and SaaS businesses to analyze in specific CXD categories. This provided a current baseline of positives and negatives—what to continue doing and what to change or stop doing. As I mentioned earlier, the outcome provided input to our organization’s roadmaps for business priorities.
We engaged an external, expert CX researcher to facilitate this effort. This helped build confidence within the organization that the analysis was unbiased. He took the CXD elements—including navigation, information hierarchy, look and feel, and content—and conducted a heuristic evaluation.
It was important to make sure that the companies we chose as competitive benchmarks were targeting similar customers such as small-to-medium businesses (SMBs) and the low end of the mid-market. Our aim was to evaluate their end-to-end experience as well as possible, including discovery, education, plan comparisons, signup, email messaging, and account management. Elements that we evaluated included calls to action (CTAs), lead channels, messaging, and imagery. We also reviewed the value proposition and mobile experience. We chose these elements collectively, as a gestalt, because they were most significant in contributing to customer conversions.
Once the evaluator was confident that he had identified industry best practices, he conducted an in-depth analysis, explaining how and why each targeted experience did or did not achieve them. The evaluator then proposed recommendations based on his analysis and presented the final findings to the business stakeholders. Halfway through the evaluation, we collected feedback from customers to validate the findings.
The Results and Business Impact
Key outcomes of this effort were that we were able to gather a wealth of insights from an end-to-end perspective, generate a roadmap of projects, and create a CXD strategy. We decided to refresh our approach to copy—for both the Web site and email messages—rework the site’s navigation, and improve our ecommerce experience.
The site is also a lead-generation sales channel, and the evaluation helped us to understand how to educate people about the value of the product, as well as what sales CTAs would help increase conversion success. We identified potential improvements to our email messaging and account management and refined and enhanced the quality of the site’s visual design—in ways that are specific to a direct response, ecommerce marketing site.
It was super exciting to be able to leverage these findings in our site redesign. We addressed our messaging and content strategy to better support customer needs, speak from the customers’ point of view, and improve search-engine optimization (SEO). We up-leveled our CXD strategy, using the research findings to redesign workflows, user interactions, and visual identity, improving engagement, usability, and the quality of the customer experience. Plus, business owners reviewed the results that were relevant to their responsibilities and used them to identify roadmap and project priorities that aligned with business goals. Examples of outcomes that we could express as metrics included a 12% reduction in bounce rates and almost a 3X increase in traffic as a result of the improvements to content-driven pages and SEO that I mentioned earlier.
The results of our best-practices benchmarking went far beyond the value of the data that we gathered—even though the data was very important and we leveraged it throughout our organization. Benchmarking was yet another milestone in embedding a CXD strategy into our organization and in defining the role the CX designer played in supporting our business goals. This had a direct impact on our ability to improve the methods of the design, content, business, and development teams. Through increased dialogue among these teams and the presentations my team gave, we were able to socialize the results of the research across CXD teams throughout the company. This effort increased our visibility as key champions of the user experience and built our brand and reputation as respected CXD professionals within the company.