I recently bought a Toyota Prius and was surprised to notice my driving behavior change to a more economical style of driving. Doing some research, I learned that I wasn’t alone in this. Much has been written about “the Prius Effect”—how the Prius and other hybrid vehicles change driving behavior by providing feedback that shows drivers how their actions affect their gas mileage. Some people view this as a positive effect, while others, who are annoyed by slow Prius drivers, view it negatively.
What causes Prius drivers to change their behavior? I believe that it’s the feedback that the Prius’s Multi-Information Display provides to drivers. This display consists of several screens, showing the current gas mileage, the average gas mileage over various periods of time, and whether the gas or electric motor is currently powering the car. In this column, I’ll discuss the Prius’s information displays in terms of the effects they have on drivers, the usefulness of the information that they provide, and the effectiveness of their design.
Conducting traditional synchronous, or moderated, usability testing requires a moderator to communicate with test participants and observe them during a study—either in person or remotely. Unmoderated, automated, or asynchronous usability testing, as the name implies, occurs remotely, without a moderator. The use of a usability testing tool that automatically gathers the participants’ feedback and records their behavior makes this possible. Such tools typically let participants view a Web site they are testing in a browser, with test tasks and related questions in a separate panel on the screen.
Recently, there has been a surge in the number of tools that are available for conducting unmoderated, remote usability testing—and this surge is changing the usability industry. Whether we want to or not, it forces us to take a closer look at the benefits and drawbacks of unmoderated testing and decide whether we should incorporate it into our usability toolbox.
It’s important for Web site visitors to be able to easily and comfortably find what they’re looking for. A better user experience translates into better conversion rates, as well as happier visitors and customers.
You can use A/B testing tools to support these goals. A/B testing involves publishing two or more versions of a page on a Web site and comparing their performance. How you measure performance is up to you. You can prioritize conversions, downloads, or some measure of user experience such as engagement, time on site, or user satisfaction.
In other words, A/B testing doesn’t have to be just about sales and conversions. As long as you have something to measure, A/B testing is a tactic you can use to support your goals for improving a Web site’s user experience.
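To make the mechanics concrete, here is a minimal sketch of the two core pieces of any A/B test: splitting visitors consistently between page versions and comparing whatever metric you choose to measure. The function names and the hashing approach are illustrative assumptions, not part of any particular A/B testing tool.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a page variant.

    Hashing the visitor's ID (rather than choosing randomly) ensures
    the same visitor always sees the same version of the page.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def conversion_rate(conversions: int, visitors: int) -> float:
    """The metric can be anything measurable: sales, downloads,
    time on site, satisfaction scores, and so on."""
    return conversions / visitors if visitors else 0.0

# Simulate 1,000 visitors: traffic splits roughly evenly,
# and each visitor's assignment is stable across repeat visits.
counts = {"A": 0, "B": 0}
for i in range(1000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts)
```

In practice, an A/B testing tool handles the assignment and measurement for you; the point of the sketch is simply that once you can split traffic and measure an outcome per variant, any user experience goal you can quantify can drive the comparison.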