In my last column, I wrote about dark patterns, but this time I want to discuss something that is literally rather than metaphorically dark: inverted-polarity display modes, better known as dark mode.
I haven’t addressed this as a stand-alone topic before, even though I’ve been doing dark interface design for years. Through extensive experience designing for dark palettes, I’ve discovered what works and what doesn’t, and I’ve tried to understand why, so I can improve my designs.
But, for many UX designers, dark mode is a new thing because operating systems are now supporting it. In fact, dark mode is now so ubiquitous that it is almost a requirement for many new apps. Plus, it’s even making its way onto the Web. But the usual backlash has started, with some people questioning its value.
So let’s set aside all the rumors, opinions, and hot takes on this design style and, instead, take a look at what it actually means to be in dark mode, why it exists, and what the research on dark mode actually says.
Editor’s note: Since writing this column, Steven has done additional user research and has updated his design guidelines for mobile phones accordingly. Read his latest column on this topic: “Design for Fingers, Touch, and People, Part 1.”
As UX professionals, we all pay a lot of attention to users’ needs. When designing for mobile devices, we’re aware that there are some additional things that we must consider—such as how the context in which users employ their devices changes their interactions or usage patterns. However, some time ago, I noticed a gap in our understanding: How do people actually carry and hold their mobile devices? These devices are not like computers that sit on people’s tables or desks. Instead, people can use mobile devices when they’re standing, walking, riding a bus, or doing just about anything. Users have to hold a device in a way that lets them view its screen, while providing input.
In the past year or so, there have been many discussions about how users hold their mobile devices—most notably Josh Clark’s. But I suspect that some of what we’ve been reading may not be on track. First, we see a lot of assumptions—for example, that all people hold mobile devices with one hand because they’re the right size for that—well, at least the iPhone is. Many of these discussions have assumed that people are all the same and do not adapt to different situations, which is not my experience in any area involving real people—much less with the unexpected ways in which people use mobile devices.
I have a very expansive view of the role of User Experience in developing products. While I’m firmly of the opinion that designers should not code, that’s mostly because very few people can code on many platforms and at many levels. I used to be a Web developer, database administrator (DBA), and system administrator. But I was never great at fulfilling all of these roles—much less all of them at once—while also being a Web designer.
As new technologies arrived, I had to stop and learn them—or learn to collaborate with others who knew them. So, instead of learning more and more technologies, I decided to focus on design and usability.
As UX designers, we should avoid becoming too deeply engaged in any one technology, but we do need to know a little about most technologies. This lets us consider the entire scope of users’ needs and suggest solutions that leverage the whole range of technology options—choosing whatever platforms, technologies, and methods best meet both users’ needs and organizational capabilities.