Placing Value on User Assistance

By Mike Hughes

Published: March 24, 2008

“User assistance writers are often the Rodney Dangerfields of the UX world, bemoaning the fact that we don’t get any respect.”

User assistance writers are often the Rodney Dangerfields of the UX world, bemoaning the fact that we don’t get any respect. I think the real problem is that user assistance folks are not particularly good at communicating the ways in which we add value to an enterprise. This column explores two models that show how user assistance adds value and how we can communicate that value to those who pay our salaries—something I would like to encourage other user assistance writers to do.

The Value Triangle

Many service-oriented enterprises define a three-way value relationship that entails the participation of an agent, a client, and a sponsor.

  • The agent is the person or company who performs the service.
  • The client is the person or company who receives and benefits directly from the service.
  • The sponsor is the organization that employs the agent or pays for the service.

Industries that follow this model include education, training, and health care. For example, in the health industry, a nurse is an agent, a patient is a client, and a hospital is a sponsor. User assistance can apply a similar model, but in this case, the stakeholders are the writer, the user, and the business.

Figure 1—The value triangle


As Figure 1 shows, the writer directly adds value to both the business and users. Writers can have a direct impact on business performance by reducing the costs of information development, production, distribution, translation, and reuse. Writers also have a direct impact on users by improving their experience with a product—namely, by providing information that helps users get value from the product. Thus, user assistance plays a key role in adding user value.

“User assistance plays a key role in adding user value.”

But a writer also makes an indirect contribution to a business that we often overlook—or do not articulate well—namely, the value that a better-informed, better-performing user adds to a sponsor’s business case. To our credit, writers have evolved from merely saying “We produce well-written, correctly punctuated documentation” to the stronger value proposition that “We support users’ task-centric information requirements.” In other words, we’ve done a good job of moving our value proposition from documenting products to supporting what users do with those products. However, what we’ve done less well is to grasp and communicate the indirect contribution we make to our sponsors—that is, how a better-informed, better-performing user benefits our employers.

Kirkpatrick’s Four Levels of Evaluation

Another useful model comes from the field of instructional technology: Kirkpatrick’s Four Levels of Evaluation. Although Kirkpatrick originally formulated this model for training, it has direct application to user assistance as well. According to Kirkpatrick, we can assess instruction—and user assistance is essentially instruction—at Four Levels:

  • Level 1: Reaction
  • Level 2: Learning
  • Level 3: Transfer
  • Level 4: Results

Since this model’s origins are in training, I will first describe how the Four Levels apply in the context of a training course on operating a drill press. Let’s assume that excessive scrap rates coming out of a machine shop had made management aware of the need for training.

  • Reaction—How did the students react to the training itself? We usually assess this through a course evaluation sheet—for example, the course met my expectations, the instructor was knowledgeable, and so forth.
  • Learning—Did the students learn anything? We can assess this through testing or lab observations. The instructor can observe the students and certify them using a checklist of targeted drill press competencies.
  • Transfer—Did the student go back to the job and apply the new knowledge or skills correctly and effectively? Shop supervisors can observe their employees after the training to see whether they apply the right techniques.
  • Results—Did the training solve the business problem that triggered it? Did the scrap rates go down?

Now, let’s see how we can apply Kirkpatrick’s model to user assistance:

  • Level 1: Reaction—We see this level in reader response cards or links that ask “Was this information useful?” We also see it in usability tests in which participants rate various aspects of a product or document. Much of the research on typography or layout stops at this level of evaluation, asking “Which document looks more professional?”
  • Level 2: Learning—This translates to: When users read the user assistance, can they understand it and apply it to the task at hand? For example, did the quick start card work in the lab when we specifically asked users to use it?
  • Level 3: Transfer—This is the tough one. Did users improve their performance in real life because of the user assistance? For example, did real patients comply better with their medication protocols when they received redesigned instructions?
  • Level 4: Results—Did we achieve the business goal we intended the document to address? For example, did support calls go down, did medical claims decrease, did user registrations increase on a redesigned Web site, did the percentage of transactions completed go up, and so forth?

I think we need to emphasize Levels 3 and 4 more. For example, I’ve never come across a research study on fonts that tested whether users completed tasks faster or made fewer errors depending on which font was used for the Help text—so why do we fight so passionately about it? I would like to see academic research increase its emphasis on user performance—Level 3.

I would also like to see more discussion about how better-informed, better-performing users make positive impacts on an organization’s business outcomes—Level 4. In the next section of this column, I’ll discuss where I believe user assistance makes substantial contributions in this area.

The Benefits of the Better-Informed, Better-Performing User

“User assistance can have a tremendous impact on user adoption and market penetration.”

User assistance can have a tremendous impact on user adoption and market penetration. Many companies that rely on users installing their products experience significant levels of Returned Material Authorizations (RMAs)—products users return because they do not work. However, in many instances, analysis shows there is nothing wrong with the products. Quite simply, the user was not able to install and configure the product successfully—to the point where it became functional. Writers who work on quick start cards and configuration guides should pay attention to this easily tracked indicator.

Web site conversion rates are another easily tracked indicator of user adoption. Conversion usually means the user has registered to use the site, adopt a feature, or receive communications from the sponsor. Technical communicators can increase user adoption by streamlining registration and providing more effective messaging before and during the registration process.
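As a rough illustration of how a writer might track the two Level 4 indicators above, here is a minimal Python sketch. The function names and every figure in it are hypothetical—invented for illustration, not drawn from the article:

```python
def rma_rate(units_returned: int, units_shipped: int) -> float:
    """Share of shipped units that come back as RMAs."""
    return units_returned / units_shipped

def conversion_rate(conversions: int, visitors: int) -> float:
    """Share of site visitors who complete registration."""
    return conversions / visitors

# Hypothetical comparison of the quarter before and after a
# quick start card redesign (all numbers are made up):
before = rma_rate(480, 12_000)
after = rma_rate(300, 12_500)
print(f"RMA rate: {before:.1%} -> {after:.1%}")

# Hypothetical registration funnel before and after streamlining:
print(f"Conversion: {conversion_rate(30, 1_000):.1%} -> "
      f"{conversion_rate(55, 1_000):.1%}")
```

Tracking even simple ratios like these over release cycles gives a writer Level 4 evidence—did the redesigned card or registration flow actually move a business number?—rather than only Level 1 reaction data.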

There is a tendency among technical communicators to back away from what they perceive to be marketing-oriented communications. I myself have often argued that such communications get in the way of the user experience. But a recent experience broadened my perspective. I was working on a set of product configuration screens, and the product manager wanted a summary page at the end that communicated the value of the security modules the user had activated. I was ready to voice my reluctance when the product manager explained that he was concerned about a specific set of users: product reviewers. Product reviews in trade magazines are quite common and influential in our industry. Well, if market penetration and user acceptance are ways I can add value as a user assistance writer, why not take this class of user into account? Granted, I have to meet that requirement without looking and sounding like a billboard, but that’s why I’m a professional writer, and that’s my challenge to solve.

Another way that a better-informed, better-performing user adds directly to the business bottom line is through reduced support costs. But achieving that forces user assistance writers to focus more on real stumbling blocks to user performance and satisfaction and less on documenting obvious user interactions. User assistance that focuses on expert guidance for tuning parameters and troubleshooting is more likely to add value in this area than user assistance that tells users they need to type their user ID in the user ID field and their password in the password field.

A third way better-informed, better-performing users benefit a business is that such users are more likely to derive greater value from a product and, therefore, have higher satisfaction levels. This translates into extended business opportunities within the installed base and new opportunities as a result of recommendations and reputation in the marketplace.

Conclusion

“Communicate your high-level value contributions to the business.”

When assessing and communicating your value to your company, don’t focus on how you contribute lower-level value—reaction and learning. Seek guidance that helps you understand the business objectives for your product, then look for ways the product’s user assistance can support those objectives. Finally, communicate your high-level value contributions to the business. Given the choice between bragging about having an award-winning Help file or getting a favorable product review in a trade journal, I’d take the positive review any day.

1 Comment

Writers also provide value by test-driving the product before it gets to the user. Writing API docs, I find this is especially important. In my experience, APIs are driven through automated tests, but there’s little or no ad hoc testing. By documenting the API with her own code samples, the writer actually tests the process of using the API. This can find issues such as bad naming—I found the keyword “saperator” once—inaccessible data, or even faulty design. Test-driving the product with an eye for user advocacy definitely adds value to the product and can even save developers time and energy.
