In Part 1 of this series on how to design for mobile touchscreens, I told you all about the history of touchscreens, how capacitive touch works today, and the research I have been conducting to find out how people really interact with their touchscreen phones and tablets.
In Part 2, I discussed the first five of my ten heuristics for designing for touch in the real world, on any device:
1. Device diversity is human diversity.
Design for every user and every type and size of phone.
Design for the variable ways in which people work with their devices, not just one way.
Set aside your biases. Don’t assume everyone has your phone or uses it the same way you do.
2. People touch the center of the screen.
Users prefer to touch the center of the screen and will do so whenever you give them that choice.
Place key actions in the middle half to two-thirds of the screen. Then, place options and buttons that provide paths to secondary screens along the top and bottom of the screen.
3. People look at the center of the screen.
Make sure key content is in the middle of the page, whether for tapping or just viewing.
Sometimes, this means simply allowing content to scroll to the middle. So provide extra space at the bottom of a page to let users scroll the last content to the middle of the viewport.
4. Fingers get in the way.
Make sure people’s fingers and thumbs don’t obscure content, so they can see what they’re tapping.
Make sure selectable items are large enough that they can clearly indicate a successful tap. Place functions and any content that changes state where users can see the result. Position functions so as to invite users to perform the actions you think are important.
Ensure there is plenty of whitespace within the content, as well as reasonable margins, so users can feel confident about scrolling or gesturing in an area that’s separate from any selectable content.
5. People use different devices in different ways.
Support all input types—especially if you are building responsive Web sites or expect to create an app for tablets and mobile phones.
If you can, gather data on how your users work in their actual environment. However, for most users, the patterns I’ve outlined here are pretty safe for you to follow. You can predict the right type size based on usage and device class.
For optimal readability, adjust the sizes of type, icons, text boxes, checkboxes, and buttons to accommodate the typical distance of the screen from the user’s eyes for a particular device class.
Now, in Part 3, the last part in this series, I’ll cover the remaining five heuristics:
Touch is imprecise.
One, two, three for better mobile design.
People tap only what they see.
Phones are not flat.
Work at human scale.
6. Touch Is Imprecise
Accuracy is not hit or miss; it is always relative. That’s okay, because you can easily take this into account during design, as long as you consider the imprecision of touch right from the start. Figure 1 shows an example of touch accuracy for a specific element. The circle represents the R95 circular error probable, or the radius containing 95% of all taps.
When everything is imprecise, stop thinking about errors and focus on tolerances instead. Plan for imprecision and the problems it causes during the design process by providing the largest practical touch targets. I’ll give you some minimum-size references later in this column, but you’ll rarely need them if you always make sure to code the largest possible target. Rather than just coding any word or icon as a link, use the natural boundaries in your design—for example, boxes, buttons, and entire rows—and make these entire areas tappable. Then, if people are a little off, but tap anywhere near the icon or text label, they’ll still hit the target. This is trivially easy to implement. Designers should be sure to specify this in their design deliverables. Developers should assume that the target is the whole area, unless someone specifically tells them not to do so.
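The whole-container approach can be sketched as a hit test. This is an illustrative sketch only; the `Rect` class and `hit_target` helper are hypothetical names, not part of any UI framework, but they show the principle: test the tap against the row’s bounds, never against the icon alone.

```python
# Hypothetical sketch: a tap counts as a hit if it lands anywhere in the
# containing row, not just on the small icon or text label inside it.
from dataclasses import dataclass

@dataclass
class Rect:
    x: float  # left edge, in mm from the screen origin
    y: float  # top edge
    w: float  # width
    h: float  # height

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.w
                and self.y <= py <= self.y + self.h)

def hit_target(tap_x: float, tap_y: float, row: Rect) -> bool:
    """Treat the entire row as the touch target."""
    return row.contains(tap_x, tap_y)

# A 7 mm icon sits inside a full-width, 11 mm-tall row. A tap that misses
# the icon by 15 mm still succeeds, because the whole row is the target.
icon = Rect(x=2, y=20, w=7, h=7)
row = Rect(x=0, y=18, w=60, h=11)
assert not icon.contains(22, 24)     # the tap misses the icon itself
assert hit_target(22, 24, row)       # but still hits the row
```

The same idea applies in any toolkit: attach the tap handler to the row, cell, or button container, not to the glyph inside it.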
Remember, real users work with touch user interfaces in the real world.
Make touch targets as large as possible, making entire containers such as rows, boxes, and buttons targets, not just icons or words.
Don’t just design in the little details or retrofit touch design onto an interface. Make your designs touch centric at the grid and template level, providing enough room and the right kind of interactivity.
7. One, Two, Three for Better Mobile Design
Touch isn’t just inaccurate; it’s inconsistently inaccurate. But I’ve found that the largest variable is not the user’s age, education, familiarity with touchscreens, or anything else designers and developers might normally expect, look for in usability testing, or ask me about. Instead, it is simply the area of the screen the user is trying to tap. Figure 2 shows a chart of touch accuracy for specific parts of the screen that I’ve compiled from all the research I have conducted and analyzed.
Millions of measured touches, on many different touchscreen phones and tablets, and on every major operating system, indicate that people’s touch interactions are faster, more confident, and more accurate as they approach the center of the screen.
Accuracy varies from 7 mm at the center of the screen to 12 mm at the corners. Think about the whole context, though. For lists of options or icons that attract touches along the edges of the screen, make sure rows are at least 9 or 10 mm tall so users can accurately tap anywhere along the row.
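These zone-based minimums can be captured as a simple lookup. This is a sketch under my own shorthand; the zone names and the `meets_minimum` helper are hypothetical, but the millimeter figures follow the accuracy numbers just discussed: 7 mm at the center, up to 12 mm at the corners.

```python
# Sketch of zone-based minimum touch-target sizes, in millimeters.
# Zone names are illustrative shorthand, not a standard.
MIN_TARGET_MM = {
    "center": 7,    # users are most accurate in the middle of the screen
    "edge": 9,      # rows along the edges: make them at least 9-10 mm tall
    "corner": 12,   # corners are the least accurate areas
}

def meets_minimum(zone: str, size_mm: float) -> bool:
    """Check whether a proposed target size is safe for its screen zone."""
    return size_mm >= MIN_TARGET_MM[zone]

assert meets_minimum("center", 7)
assert not meets_minimum("corner", 9)   # 9 mm is too small for a corner target
```

A design-review script or lint rule built on a table like this can flag undersized targets before anyone ever opens a prototype.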
And remember the inaccuracy lesson. Some people will still miss touch targets at these levels of accuracy. If interference between touch targets is likely, design to avoid catastrophic actions. For example, never place email formatting controls right next to the Send button. A send is unrecoverable. Make sure the buttons that trigger unrecoverable actions are not very near to anything else the user routinely uses or might easily hit by accident.
Taking all of this into account, along with the view preferences I discussed previously, we can start to create a whole design system, as discussed in my last column. Place secondary actions along the top and bottom edges of the screen. Tabs along the top or bottom edge of the content area let users switch views or sections. Action buttons let users compose content or search. Hide tertiary functions on menus that users can open by tapping a target in one of the corners of the screen. Figure 3 summarizes this hierarchy of information design.
Always consider how well people can touch specific parts of the screen. Within each zone on the screen, use the type sizes for each device class that I provided in Table 1, in Part 2 of this series, ensuring adequate spacing between tappable items to prevent interference. Rows and grids of content in the middle of the screen can be quite dense, but tabs or action strips at the top or bottom of the screen should contain very few items.
Design by zones, spacing tappable items to prevent interference according to how well people can touch certain parts of the screen.
Touch accuracy differs depending on the area of the screen. Touching the sides of the screen is less accurate than touching the center, so for lists, either avoid placing actions such as delete or select along the left or right side of the screen or make each item in the list tall enough to accommodate edge tap targets.
When you place controls along the top or bottom of the screen, provide as few items as possible and give them adequate space. The default Android action-button spacing is too tight, so loosen it up. Placing more than four items on an iOS tab bar, or five on the largest phones, is just asking for trouble.
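The arithmetic behind that item limit is easy to check. The sketch below assumes a phone screen roughly 65 mm wide (an assumption, not a measurement of any particular model) and divides it among the tab-bar items, comparing each item’s share against the 12 mm edge-and-corner accuracy figure discussed earlier.

```python
# Sketch: how many items can a bottom tab bar hold before each target
# becomes too narrow for edge-zone accuracy? The 65 mm screen width is
# an illustrative assumption.
def tab_width_mm(screen_width_mm: float, item_count: int) -> float:
    """Horizontal room each tab-bar item gets, in millimeters."""
    return screen_width_mm / item_count

def fits(screen_width_mm: float, item_count: int,
         min_target_mm: float = 12) -> bool:
    # 12 mm reflects the corner/edge accuracy figure discussed above.
    return tab_width_mm(screen_width_mm, item_count) >= min_target_mm

assert fits(65, 4)       # four items: about 16 mm each, comfortable
assert not fits(65, 6)   # six items: under 11 mm each, too tight
```

Run the same check against your smallest supported device, not your largest; the narrowest screen is the one that breaks first.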
Remember to plan for interference, and place items that either trigger dangerous, unrecoverable actions or display annoying, hard-to-exit elements far from other items or provide the ability to undo these actions.
8. People Tap Only What They See
Clients still ask me for gesture-only user interfaces or secret interactions that add a sense of delight and discovery, but I am firmly on the side of boring interfaces. Start with what works—simple controls that work in expected ways. The most expected controls are those that are visible and communicate what they will do.
Make sure that selectable items are clearly selectable because, if an item doesn’t look tappable, people won’t know it is.
For inline links within the content on Web pages, underlines are fine. However, for applications—whether mobile or Web interfaces—you should usually bound, or box, tappable items. A bounding box can be integral to an element—for example, placing an icon in a circle, an item in a container, or simply using list or table cells.
Tappable items need not only to afford their action—making it clear what they do—but also to do so consistently. Without a really good reason, don’t underline some items on a page, while making other items buttons and selectable rows. If items do similar things, every one of them should look and act the same way across the application.
I generally design apps to have three basic types of interactive items for particular types of actions:
selectable rows with arrows to the right—These items load other pages or more details.
underlined text—These items load additional, helpful information inline.
buttons—These submit data or change state.
Note—As I described under Heuristic 6, a whole row, paragraph, or button should be tappable, never just the words or label alone.
Of course, there are many other interactive elements such as media controllers and form inputs, but these three cover about 95% of all interactions.
Visual targets—whether they are text, icons, shapes, or any other type of user-interface widget—must do the following:
Attract the user’s eye.
Be represented visually so the user understands that they are tappable elements.
Be legible, so the user understands what action they will perform.
Be large and clear enough so the user can easily and confidently tap them. Follow the size rules for each device class that I provided in Table 1, in Part 2 of this series.
9. Phones Are Not Flat
People shift the way they hold and touch their devices depending on their context. For example, I have observed people changing the way they’re holding their phone when
opening a door
carrying a baby
walking down the street
walking in difficult terrain or stepping off a curb
riding on a train or bus, especially when standing
in a dangerous context—such as when walking in the wind, near water, or near a drop-off
People adapt to their situations, but they do suffer consequences. Figure 4 shows the differences in accuracy for nine touch targets when a user is holding the phone in two hands, on the left, versus one hand while carrying a bag, on the right. As the right side shows, simply carrying a bag in one hand and using the phone with the other can greatly reduce touch accuracy in the most distant corner of the screen, to over 30 mm. That’s a whole lot. Keep this in mind when thinking about the consequences of the user’s missing a target.
Plus, don’t forget that phones are not actually flat. The raised bezels on some phones, or the similar effect that results from the cases people often put on their phones, make it difficult to tap targets at the edge of the screen. Phones whose edges curve away from the flat part of the screen cause much the same effect. Samsung has applied their Edge design to their main product line, so the screen slowly curves away from the user’s eye and the contact plane. This means many users won’t actually be able to reach the very edge of their screen. Whether because of screen curvature, a raised bezel, or a case, if users’ fingers cannot touch the very sides of the screen, edge functions or gestures may fail or be hard to activate.
People interact with their phone in real environments. We can’t think that people are just interacting with a flat, glass screen.
Avoid getting too close to the boundaries of any touch-target size requirements. Think about worst cases as well.
Plan for different contexts, and try out your designs in difficult environments to make sure they work well in the real world.
If you need to place items right against the edges of the screen or use edge gestures, go ahead. But provide some padding, so they’ll work even with cases and curved screens.
10. Work at Human Scale
The best way to handle sizes on mobile devices is to keep your actual hands on your designs. Sketch on paper, using stencils at the scale of actual phones and tablets.
Take your higher-fidelity designs out of OmniGraffle, Visio, Axure, Photoshop, InDesign, or whatever, and get them onto the phone. Whether you are sketching, drawing in high fidelity, or using a prototyping tool, view your designs in a photo gallery or another mobile app. There is no replacement for holding your designs in your hand.
Visit or simulate real environments to see whether user interfaces work in their contexts of use. Stores, hospitals, workshops, and gardens are not at all like your cubicle at work. Go outside and see how your designs work in the sun or while walking. Pass your phone around to other people. Don’t just seek approvals for designs in PowerPoint presentations. Instead, make everyone look at the designs on the phone you’ve also brought along to the meeting.
When it’s time to measure and confirm that text, icon, and other touch-target sizes are correct, don’t just do the math. Instead, measure targets on the actual device to make sure you’ve got them right.
I have to end this discussion by noting a small pitfall: there are scale issues in the way device-independent pixels and viewport scaling work. The makers of mobile-phone operating systems have tried to simplify everything for developers by grouping screen resolutions into a small set of categories. But the variability in the physical sizes of devices and screen pixels means the size you expect is not always the size you get. For example, you could design a button to be 7 mm tall, but find that, once scaled for some phones, it is either a tiny 5 mm or a huge 9 mm in height. And you cannot tell in advance which it will be.
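The math behind this pitfall is easy to sketch. Android’s conversion px = dp × (dpi / 160) is real; the panel densities below are illustrative values, not actual device models. Because the OS buckets true panel densities into a few standard density classes, the same dp value yields different physical sizes on different screens.

```python
# Sketch: why a fixed dp size yields different physical sizes.
# Android's px = dp * (density_dpi / 160) is the real formula; the
# panel densities below are illustrative, not actual devices.
def dp_to_mm(dp: float, physical_ppi: float, density_dpi: float) -> float:
    px = dp * (density_dpi / 160)   # logical dp to physical pixels
    inches = px / physical_ppi      # pixels to inches on this panel
    return inches * 25.4            # inches to millimeters

# Two phones in the same 480 dpi bucket, with different true densities:
# the same 48 dp button comes out at different physical sizes.
small = dp_to_mm(48, physical_ppi=450, density_dpi=480)  # denser panel
large = dp_to_mm(48, physical_ppi=400, density_dpi=480)  # coarser panel
assert small < large

# Designing about 20% larger builds in a margin for undersized scaling.
intended_mm = 7
padded_mm = intended_mm * 1.2
assert padded_mm > intended_mm
```

This is why measuring targets on the actual device matters: the formula tells you only what the OS intends, not what the panel delivers.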
Get off your computer and try out your designs on real devices, as soon and as often as possible.
Get away from using PowerPoint presentations and, instead, display your designs on real devices to get approval.
To account for under-sized scaling, bad environments, and users with poor eyesight, always design everything about 20% larger than you otherwise would.
All of this seems like a lot to keep track of, but it’s not so bad if you just remember the core features of these ten heuristics:
Place content and functions in the middle of the screen.
Design with fingers in mind, so users can see around them.
Design for zones, with bigger touch targets along the edges and at the corners of the screen.
Place touch targets to avoid catastrophes when people miss them.
Be consistent in how you design user interfaces and interactions.
Always respect people and their device choices, patterns of use, and contexts.
Design mobile interfaces at their actual sizes.
Try using your designs on real phones.
Remember to design for hands, fingers, thumbs, and people.
Always back up your design decisions with data. Without data, you’re just another person with an opinion. All of the information I’ve given you in this series is based on extensive observations of users, often on direct measurements, and research that lets me understand the reasons behind these core concepts and the importance of designing with them in mind.
But I don’t know everything there is to know about touch. So keep asking questions, and do your own research when you run into odd behaviors that you cannot explain. We need to know a lot more about context and model why and when people change their grips and how they perform certain actions so we can customize mobile user interfaces even more.
We need to know about how other types of touch devices work. There is some academic research on back-of-phone touch interfaces, but none to speak of about wearables—despite my best efforts to get some makers interested in conducting such research.
No matter what comes next, the usage of touchscreen devices will continue to be ever more pervasive, and there’s still a lot more to learn, so we can improve them.
Bérard, François. “Measuring the Linear and Rotational User Precision in Touch Pointing.” Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces. New York: ACM, 2012. A detailed, well-designed exploration of separating physical pointing limits from other aspects of touch. It turns out that people can control their finger positions to an accuracy of around 0.1 mm. However, except in cases where a user interface provides enhancements—such as the edit magnifier in iOS—this is irrelevant, and we must consider all aspects of perception that result in pointing accuracy.
Bergstrom-Lehtovirta, Joanna, and Antti Oulasvirta. “Modeling the Functional Area of the Thumb on Mobile Touchscreen Surfaces.” Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI ’14). New York: ACM, May 2014.
Bragdon, Andrew, et al. “Gesture Play: Motivating Online Gesture Learning with Fun, Positive Reinforcement and Physical Metaphors.” Proceedings of ITS ’10 ACM International Conference on Interactive Tabletops and Surfaces. Saarbrücken, Germany: ACM, 2010. Bragdon is one of the few researchers who has really carefully, scientifically explored gestures. This research is still foundational, so we can currently draw few conclusions that are applicable to design, but it’s interesting stuff.
Clark, Josh. Designing for Touch. New York: A Book Apart, 2015.
Gilbert, Juan E., Aqueasha M. Martin, Gregory Rogers, Jerome McClendon, and Josh Ekandem. “Hey, That’s Not Who I Voted For! A Study on Touchscreen Ballot Design.” Interactions, November 2012. Retrieved March 28, 2017. This article conflates target and interference, and all tested designs have buttons that are hard against each other—so watch out for that. However, it includes interesting information on indicator and visual-target focus and discusses labels attracting taps and checkmarks to indicate selection.
Henze, Niels, Enrico Rukzio, and Susanne Boll. “100,000,000 Taps: Analysis and Improvement of Touch Performance in the Large.” Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services. New York: ACM, 2011. This is my favorite research on touch that I didn’t do myself. They got several million somewhat-qualified clicks from a game on Android, so the research occurred outside the lab and has insane scale. “Below 15 mm, the error rate dramatically increases and jumps to over 40% for targets smaller than 8 mm.” There are nice charts on error rates by position, showing that the edges of the screen are much worse, pretty symmetrically.
Lee, Seungyon, and Shumin Zhai. “The Performance of Touch Screen Soft Buttons.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York: ACM, 2009. 10-millimeter buttons resulted in 98 to 99.4% accuracy. Reducing button size caused users to switch from thumb to finger. iPhone keyboard buttons resulted in high accuracy (90%), but a typing speed that was a third slower; plus, there was a huge jump in the corrections that were necessary when typing on the narrow keyboard. I think they measured accuracy as the ability to enter the right value, not the user’s ability to hit the right target the first time, and therefore, missed critical interference issues.
Ng, Alexander, Stephen A. Brewster, and John H. Williamson. “Investigating the Effects of Encumbrance on One- and Two-Handed Interactions with Mobile Devices.” Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems (CHI ’14). New York: ACM, May 2014. This is where I got the very interesting information on how much touch is affected by holding a bag in the other hand.
Parhi, Pekka, Amy K. Karlson, and Benjamin B. Bederson. “Target Size Study for One-handed Thumb Use on Small Touchscreen Devices.” Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services. New York: ACM, 2006. Clear research findings showing that interference sizes are smallest at the center of the screen and notably worse at the corners. If you have access to the paper, its Figure 6 provides a simple explanation. “[The] target size of 9.2 mm for discrete tasks and targets of 9.6 mm for serial tasks should be sufficiently large for one-handed thumb use on touchscreen-based handhelds.” They used CEP-R95, and there was no improvement when increasing target size to 11.5 mm. This paper conflates target size and interference for the most part, very often putting test items adjacent to each other. Jakob Nielsen quotes this paper in Mobile Usability as recommending a target size of around 10 mm, which I can agree with.
Park, Yong S., Sung H. Han, Jaehyun Park, and Youngseok Cho. “Touch Key Design for Target Selection on a Mobile Phone.” Proceedings of the 10th International Conference on Human Computer Interaction with Mobile Devices and Services. New York: ACM, 2008. Really interesting research because participants cradled the device and used their thumb to interact and the results plotted the accuracy rate by screen location. Size recommendations resulted from this study as well—specifically, that 10 millimeters is better than 7 millimeters, and 4 millimeters is sort of a disaster. The authors made special note that the bottom of the screen was hard to tap accurately.
Perry, Keith B., and Juan P. Hourcade. “Evaluating One-Handed Thumb Tapping on Mobile Touchscreen Devices.” Proceedings of Graphics Interface 2008. New York: ACM, 2008. Evaluated users both walking and standing, and all held the device with one hand and tapped with the thumb.
Plait, Phil. “Resolving the iPhone Resolution.” Discover, October 17, 2014. Retrieved March 28, 2017. In the guise of adding science to the hype around the original release of the Apple Retina display on iPhone, Phil Plait, in his Bad Astronomy column, published an easy-to-read discussion of the math behind angular resolution and why it matters more than absolute size.
Schildbach, Bastian, and Enrico Rukzio. “Investigating Selection and Reading Performance on a Mobile Phone While Walking.” Proceedings of the 12th International Conference on Human Computer Interaction with Mobile Devices and Services. New York: ACM, 2010. “Whilst performance decreases, cognitive load increases significantly when reading and selecting targets when walking. Furthermore, the results show that the negative effect regarding target selection can be compensated by increasing the target size, but the text reading task did not yield better performance results for a larger text size due to the increased demand for scrolling.” Here are this study’s results: 6.74-millimeter buttons had an error rate of up to 23%; buttons of about 10 millimeters were much better; and 20-millimeter buttons solve almost all problems, even when the user is walking. Small buttons make people slow down due to cognitive loading. While small, 6-point text results in no performance degradation when the user is walking, some users perceive it as being worse. There is a middle ground due to the need to scroll. Text sizes that are too large are also bad.
Sesto, Mary E., Curtis B. Irwin, Karen B. Chen, Amrish O. Chourasia, and Douglas A. Wiegmann. “Effect of Touch Screen Button Size and Spacing on Touch Characteristics of Users With and Without Disabilities.” Human Factors: The Journal of the Human Factors and Ergonomics Society, June 2012, Volume 54, Number 3. The study found that 20-millimeter buttons worked best and larger spacing between them didn’t help, but they measured the force used, not targeting accuracy—apparently because of a special need of the disabled cohort. While these findings are not directly applicable, the 20-millimeter size is reasonable when accounting for inaccuracy due to motor-function issues.
Xu, Wenchang, et al. “Digging Unintentional Displacement for One-handed Thumb Use on Touchscreen-Based Mobile Devices.” Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services. New York: ACM, 2012. Detailed discussion of the inaccuracies that liftoff—removing your finger from the screen—induces on mobile handsets, with some exploration of the biomechanics that are involved and how those inaccuracies change according to the position on the screen. Interesting notes about changes in accuracy over time, which will inform any gesture standards that may eventually emerge.
For all of his 15-year design career, Steven has been documenting design process. He started designing for mobile full time in 2007 when he joined Little Springs Design. Steven’s work includes Designing by Drawing: A Practical Guide to Creating Usable Interactive Design, the O’Reilly book Designing Mobile Interfaces, and an extensive Web site providing mobile design resources to support his book. Steven has led projects on security, account management, content distribution, and communications services for numerous products, in domains ranging from construction supplies to hospital record-keeping. His mobile work has included the design of browsers, ereaders, search, Near Field Communication (NFC), mobile banking, data communications, location services, and operating system overlays. Steven spent eight years with the US mobile operator Sprint and has also worked with AT&T, Qualcomm, Samsung, Skyfire, Bitstream, VivoTech, The Weather Channel, Bank Midwest, IGLTA, Lowe’s, and Hallmark Cards. He is currently User Experience Architect with diesel engine maker Cummins, in addition to running his own interactive design studio at 4ourth Mobile.