Deceptive Patterns: Machine Learning, Habitual Users, and Learned Helplessness

August 22, 2022

Both UX designers and artificial-intelligence (AI) systems use their understanding of human behavior and the needs of users to create better user experiences. Designing and implementing a solution to meet an individual’s needs can enable you to offer a great experience to users, which builds trust and loyalty to your products.

However, some designers and AI systems use their knowledge of human behavior against users by employing deceptive patterns to manipulate them or change their behavior.

Deceptive patterns—also known as dark patterns—are user-interface design solutions that intentionally manipulate or mislead users. Regrettably, they have become more popular with businesses in recent years, just as users are becoming more aware of user experiences. As a UX professional, I thought I knew about all the various kinds of deceptive patterns and their effects on users, but I was wrong. More and more deceptive patterns are revealing themselves every day—and some we cannot escape. In fact, people cannot even see some of them. What are these invisible, deceptive patterns?

Deceptive Patterns and Learned Helplessness

In 1967, Martin Seligman and his colleague Steven Maier developed the theory of learned helplessness while studying animal behavior. Their research shed new light on trauma. The theory holds that people who have suffered repeated abuse or bad experiences eventually come to believe they are helpless; when they later find themselves in a similar situation, they fail to act, even when change is possible. The findings from this research and its possible implications piqued my interest and made me think about how this might affect our day-to-day behavior, our choices, and our future.

Learned helplessness can also occur as a result of poor UX design. When people experience repeated failures with a task or skill, they can learn to accept that it is beyond their ability to understand or to perform successfully. Imagine a user interface that repeatedly presents the user with complex, uncontrollable situations. It could leave users feeling not only defeated but helpless.

For example, let’s say a user wanted to delete her account on a Web application. If the account settings were hard to find and, after trying several places where that functionality might reside, she still couldn’t find it, she might simply give up on deleting the account, feeling that she has no way of achieving her goal. Amazon provides a good example of how difficult deleting an account can be. This sort of user experience can produce learned helplessness and demonstrates how deceptive design affects users. As UX professionals, we should avoid subjecting users to such experiences.

The Impact of Deceptive Patterns on Habitual Users

A habitual user is someone who compulsively uses a product every day, depending on the product and the rewards it dispenses. Some games use deceptive patterns that affect users psychologically and cultivate habitual users.

Another good example of a product that cultivates habitual users is Snapchat. The deceptive pattern here is the streak: if a user misses a single day, she loses her reward and must start over from zero. Users keep coming back every day to avoid breaking their streak. Over time, many forget why they started using the app in the first place and worry only about keeping the streak alive. This effect makes users addicted to the app. Some might argue that this is not a deceptive pattern, but it is: a temporal dark pattern. The mechanic might seem harmless or even beneficial at first glance, but once the user sees how long his streak has grown, he concludes that he never wants to miss a day. At this point, the product has become a daily habit, and the user is chasing a far-off finish line.
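To make the mechanic concrete, here is a minimal sketch, in Python, of how a streak-reset rule of this kind might work. The function name and the reset-to-one rule are my assumptions for illustration, not Snapchat’s actual implementation.

```python
from datetime import date, timedelta

def update_streak(streak: int, last_active: date, today: date) -> int:
    """Return the new streak count after a visit on `today`.

    Hypothetical rule: activity on consecutive days extends the
    streak; skipping even one day resets it to 1. That reset is
    the pressure that keeps habitual users coming back.
    """
    if today == last_active:
        return streak               # already counted today
    if today - last_active == timedelta(days=1):
        return streak + 1           # consecutive day: streak grows
    return 1                        # missed a day: start over

# A 100-day streak survives a daily visit but dies after one missed day.
print(update_streak(100, date(2022, 8, 1), date(2022, 8, 2)))  # 101
print(update_streak(100, date(2022, 8, 1), date(2022, 8, 3)))  # 1
```

The asymmetry is the point: a streak takes a hundred days to build and one missed day to destroy, so the cost of stopping always feels larger than the benefit of continuing.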

Machine Learning and Deceptive Design

Machine learning drives another deceptive pattern that has piqued my interest. Developments in artificial intelligence and its applications are sweeping through every field. For example, machine learning powers social media—one of the most ubiquitous industries today—and is gradually producing more deceptive patterns. The goal of these systems is to show users what the algorithm predicts they want—even when they don’t actually need it. Let’s look at some examples.

Deceptive Patterns in Sales

Consider an application that requires paying for a premium plan to enjoy the full experience: the user is asked, again and again, to upgrade. Let’s use Medium as an example.

  • Regular users open the application to find stories or articles to read. They are restricted to reading just three stories a month if they haven’t upgraded their plan.
  • When users find interesting stories and read them, the algorithm keeps feeding them more similar stories. Thus, they quickly exhaust their free stories, and the application recommends upgrading their plan, which they may choose not to do.
  • Once users have exhausted their free trial, they can’t access the content on Medium for some time. But they receive newsletters about stories that might interest them.
  • The application collects data about these users and compares it to the data of users who have already upgraded.
  • It then groups the user with a smaller set of readers who share similar tastes.
  • Finally, it identifies upgraded users with similar tastes and applies the strategies that converted them to the user and others like her.

The aim of the application design is to get users to upgrade their plan.
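The steps above amount to lookalike targeting: group free users with paying users who have similar tastes, then aim upgrade prompts at them. Here is a minimal sketch in Python, using cosine similarity over per-topic reading counts. The function names, threshold, and data are illustrative assumptions, not Medium’s actual system.

```python
import math

def cosine(a, b):
    """Cosine similarity between two taste vectors (0.0 if either is empty)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalike_targets(free_users, paying_users, threshold=0.9):
    """Pick free users whose reading tastes resemble those of users
    who already upgraded -- the 'similar tastes' grouping step."""
    targets = []
    for uid, vec in free_users.items():
        if any(cosine(vec, pvec) >= threshold for pvec in paying_users.values()):
            targets.append(uid)
    return targets

# Toy taste vectors: (tech, design, fiction) story counts.
free = {"ann": (9, 1, 0), "ben": (0, 0, 8)}
paying = {"carl": (10, 2, 0)}
print(lookalike_targets(free, paying))  # ['ann'] -- ben reads fiction, no match
```

Ann reads what Carl, a paying subscriber, reads, so she gets the upgrade push; Ben does not resemble any converted user, so he is left alone, for now.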

Deceptive Patterns in Promotions

I’ve come across yet another deceptive pattern in Google Search. When you’re scanning the search results for interesting stories and scrolling down to view more, telling the difference between organic search results and paid results—that is, the ads among them—can be tricky.

I once spent ten minutes reading organic search results for stories, deliberately skipping the ads. But, unbeknownst to me, the search algorithm had displayed an ad that looked exactly like a search result for a story. Not realizing this, I clicked it, then noticed it was asking me to install an app. I was shocked. The ad had fooled me.

The same goes for Instagram ads. The algorithm shows as many ads as possible, watches for one you click, then updates your feed to show more videos and ads similar to those in which you’ve shown interest. You might continue watching without noticing the difference and, thus, spend more time in the application. This deceptive behavior is invisible to users. Before they know it, the algorithm is feeding them more and more videos based on their inferred interests and changing their behavior by inducing them to keep watching.

This might seem a small matter, but this content replaces what the user actually came to see. Gradually, instead of pursuing their initial interest, users are consuming more and more related content that the algorithm has chosen for them.

Algorithms sometimes even use this behavioral data to measure whether their techniques are succeeding, then learn from the result and drive further behavioral change. An application that steers users’ behavior toward its own goals without their knowledge is employing a deceptive pattern. Through machine learning, the application changes the user’s behavior.
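This feedback loop (recommend, observe a click, reinforce, recommend more of the same) can be sketched deterministically. Everything here, including the topic names, the weight bump, and the click rule, is an illustrative assumption, not any platform’s real ranking code.

```python
def feedback_loop(catalog, interests, rounds=3, per_round=2):
    """Each round: rank topics by the user's current interest weights,
    show the top few, treat engagement with the top item as a click,
    and bump that topic's weight. Round after round, the feed narrows
    toward whatever the user engaged with first."""
    for _ in range(rounds):
        ranked = sorted(catalog, key=lambda t: interests.get(t, 0), reverse=True)
        shown = ranked[:per_round]
        clicked = shown[0]                          # user engages with the top item
        interests[clicked] = interests.get(clicked, 0) + 1  # reinforcement
    return interests

feed = ["cooking", "fitness", "crypto", "travel"]
print(feedback_loop(feed, {"cooking": 1}))  # one early click on cooking
# After three rounds, only cooking's weight has grown: {'cooking': 4}
```

A single early click is enough to dominate the ranking forever, because every round both reflects and amplifies the previous one; that self-reinforcement is what makes the behavioral change invisible to the user.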

How to Avoid Falling Victim to Deceptive Patterns

So how can you be sure that you notice such deceptive patterns and avoid them? How can you reduce the risk of falling prey to learned helplessness? There’s little information on this topic, so I’ll share what has worked for me as a user and how I’ve avoided becoming a victim of deceptive patterns.

  • Identify behaviors that reinforce learned helplessness and avoid becoming a victim of them.
  • Evaluate the behavioral changes that occur when you’re using applications that could make you become a habitual user. Don’t allow yourself to use such applications every day. They might begin to affect your mental health.
  • When using machine learning–powered applications, try to gain as much control over them as possible, using whatever options they provide to limit the algorithm’s control. Some of these algorithms let you have control at first, then begin to feed you content using their recommendation system. Don’t follow the algorithms’ recommendations unless they’re recommending content you really want. 

UX Designer

Lagos, Nigeria

Ijeoma Odiaka

As a designer of user interfaces and experiences, Ijeoma has created applications, Web sites, and mobile apps. She uses all her design knowledge to create and communicate effective design solutions to users and make them more enjoyable to use. She is also a writer and an effective visual communicator. Her other core strengths are research strategy, her empathy, and her ability to collaborate.
