The State of UX Design Education, Part 3: The Future of Design Education

July 5, 2021

This is Part 3 of my three-part series on the state of UX design education. In Part 1, I discussed the role of undergraduate education in User Experience, looking at arts and sciences programs versus design programs. In Part 2, I reviewed graduate degree and certificate programs. Now, in Part 3, I’ll look at the future of User Experience. Based on self-reported data from UX professionals and industry trends, I’ll consider what hard and soft skills will be most in demand. I’ll also provide my professional take on where User Experience could and should grow, both in the near term and the future.


What UX Professionals Are Doing Now

Let’s start with a recap of current practices in User Experience. The 2019 edition of the Nielsen Norman Group’s report “User Experience Careers: What a Career in UX Looks Like Today”—a free PDF download—validated what many of us might assume about one another. We are a community of largely college-educated UX professionals who either specialize in UX research or design or split our time between the two, and who often feel undervalued by the organizations for which we work. This is true across all industries and all continents—although most survey respondents were in North America or Europe. According to this survey, the number-one thing most of us are doing right now is prototyping Web sites, applications, and mobile apps. Many of us wish we were better versed in content strategy. Researchers wish they had skills in visual design, and designers wish they had skills in data analysis.

Only 15% of respondents said they were working on or had worked on designing artificial intelligence (AI) products in the five-year period preceding the survey, while almost 40% of all UX professionals—including both researchers and designers—desired skills in data analytics. The projections in both Paul Petrone’s “The Skills Companies Need Most in 2019—And How to Learn Them” and Bruce M. Anderson’s “The Most In-Demand Hard and Soft Skills of 2020” rank analytical reasoning, artificial intelligence, and User Experience among the top-five skills.

The AIGA (American Institute of Graphic Arts) “Design Point of View Research Initiative” reported similar trends for 2021. One of AIGA’s three top-line findings is: AI is coming faster than expected and has benefited from the COVID-19 crisis. AIGA’s survey respondents indicated that new technologies such as AI and machine learning (49%), augmented reality and virtual reality (38%), collaborative design software (33%), online behavior tracking and modeling (28%), and telepresence and virtual workplaces (25%) are the top emerging trends affecting the design community. Note that, regardless of whether these technologies have AI in their name, AI powers all of the products and solutions the AIGA community mentioned in their report. We have a consensus: UX designers and graphic designers alike believe that the future of design is AI.

Before taking a deep dive into the implications of AI for User Experience, let’s refresh our collective memory on the most-wanted list of soft skills. As in 2019 and 2020, we continue to see demand for creativity, persuasion, collaboration, adaptability, and emotional intelligence in the workplace, according to Jenifer Lambert’s “Top Soft Skills for 2021.” Likely thanks to COVID-19 and the shifting requirements for how and where we’ve worked since early 2020, adaptability and collaboration now come ahead of creativity, emotional intelligence, and persuasion. Of course, for most UX professionals, these are the things we’re already doing.

Respondents to the NN/g survey reported that they found the following skills useful in their own workplace: communication, empathy, active listening, teamwork, collaboration, and problem solving. If we can assume that creativity almost goes without saying for a community of mostly designers, our self-reported competencies map almost exactly to the skills that are in demand across the marketplace. Well done, us!

The Future of User Experience

In her article “Design x Futures = Design Futures?” design strategist and researcher Corina Angheloiu says: “The process of imagining the future is an active, values-laden social practice, which requires a layered approach to surface and challenge dominant patterns in our mental models.”

As UX professionals, we often preach to others about users’ mental models. According to Angheloiu, now is the time to challenge our own mental models of ourselves and our work. With the very definition of User Experience shifting and perhaps narrowing to make room for Customer Experience—a broader discipline that considers interactions that cross screen-based and real-world channels, products, services, and teams in accomplishing a single task—we are simultaneously realizing the long-promised future of the Internet of Things (IoT). There are so many more things to design than Web sites in a world where our watches, cars, doorbells, and thermostats are connected to the Internet. We need to shift our mental models to make room for these new types of interactions. Many digital interactions will be contactless in the future—that is, they will not require using a touchscreen—but instead use our voices, faces, and other biometrics. Plus, all of these interactions will likely be powered by AI and machine-learning algorithms. This means that big data sets—which many of us might not yet understand fully—will govern the products we design in the future. So, if AI plus data analytics is the bandwagon we need to jump on, how should we engage?

Designing apps that are powered by AI, monitoring and assistive devices and wearables, and augmented-reality and virtual-reality (AR/VR) experiences will be new for many of us. But I’m confident that most of us can figure these things out. Since most of us fall into one of two categories, recent graduates and experienced UX professionals, we either have recent academic exposure to such new technologies or enough work experience to adapt our process to them—just as we’ve adapted to other new technologies in the past. This isn’t the first time we’ve had to apply UX methodologies to new media. For instance, seasoned UX professionals can remember the time before the Web, and mid-career professionals like me can remember the time before mobile apps. In fact, I recall the first time a prospective client asked me to design for mobile. They asked me how many mobile sites or apps I had designed and whether they could see some samples. I blurted out something like, “None yet, but the process is the same.” I didn’t get that job, but I landed plenty of others and eventually became an expert in mobile design because I was right: the process is the same no matter what you’re designing. Collectively, we’ve weathered these shifts in our career expectations, and now, according to the NN/g survey, 76% of us are designing for mobile. We’ll manage the next shift as well.

There is an important role for UX designers and researchers to play in designing for AI. The tricky part—the part for which formal education, a community of peers, and mentoring would likely be essential for most designers—is two-fold: understanding the data analysis and analytics behind the AI algorithms and the impacts of designing with big data. The first issue is very tangible, while the second can feel nebulous. However, formal curricula that are dedicated to producing well-rounded humans who are researching and designing on behalf of other humans can and should address designing for both AI and big data.

Formal Education in Data Analysis and Ethics

If you seek them out, you can find many undergraduate, graduate, and post-graduate programs and certificate programs that teach the basics of data analytics and ethics, as well as AI and machine learning—the methods by which machines turn big data sets into patterns. If you’re still in school, yes, you should take at least one of these courses.

If you’re a working UX professional and are interested in a self-paced program, check out Coursera, edX, or Udemy for courses on the fundamentals of AI and machine learning. CodeSpaces has created a list of the “Top 10 Artificial Intelligence Courses, Certifications, & Classes Online [2021],” and another list goes one step further: “The Top 11 Big Data and Data Analytics Certifications.” If you’re interested in a formal, post-baccalaureate degree program, look no further than your local Google search. Social media serves me ads daily for these and other programs that are being offered everywhere—from my state college system to various private colleges’ continuing-education and executive-education departments around the country.

I can’t say which of these programs is the best. You should choose a program based on your own interests and the time and financial investment each would require. But be sure you choose one that explains how big data sets are gathered and analyzed. Because most UX professionals conduct qualitative research, we might not be as familiar with quantitative data techniques, issues such as the dangers of bots filling in survey questions, and methods for data imputation—that is, skipping versus filling in missing data. It’s also important that we understand the differences between custom data, which are commissioned data sets that capture information relevant to a specific product, and synthetic data, which are data based on demographic assumptions that might be faulty or inherently biased, and why the former are better than the latter. It’s also imperative to understand the dangers of data sets that are purchased from tech giants such as Amazon, Google, or Microsoft.
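To make the imputation distinction concrete, here is a minimal sketch, using pandas and an entirely hypothetical set of survey responses, of the two approaches the paragraph above describes: skipping incomplete records versus filling in missing values.

```python
import pandas as pd

# Hypothetical survey responses; None marks a question a respondent skipped.
responses = pd.DataFrame({
    "age": [34, None, 29, 41],
    "satisfaction": [4, 5, None, 3],
})

# Option 1: skip incomplete records entirely (listwise deletion).
skipped = responses.dropna()

# Option 2: fill in missing values, here with each column's median.
filled = responses.fillna(responses.median())

print(len(skipped))                        # only fully answered records remain
print(filled["satisfaction"].tolist())     # every record retained, gaps estimated
```

Each choice shapes the resulting data set differently: deletion can silently underrepresent the people most likely to skip questions, while imputation invents values that were never observed. Knowing which trade-off a data set embodies is exactly the kind of question a UX professional should be able to ask.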

Once you get a handle on the technical aspects of big data, it’s vital that you read at least one chapter of a book or take at least one lesson on data ethics. My deeply held personal and professional opinion: it’s time for the practice of User Experience to take responsibility for the impact of our design work and lead the charge on ethics—including issues relating to both sustainability and diversity, equity, and inclusion (DEI). (For the purposes of this article, I’ll stick to ethics and UX education. Watch for my column on UXmatters, starting later in 2021, for more on DEI and sustainability.) Ethics in AI is an emerging topic of concern, as well as a job role in itself. We should not just wade into this dilemma, as Harvard Business Review refers to it; we should be leading the charge. Writing for HBR, Andrew Burt says, “Every AI principle an organization adopts … should also have clear metrics that can be measured and monitored by engineers, data scientists, and legal personnel.” Just as many UX professionals, according to the NN/g survey, still wish we had coding skills, we should now wish that we knew more about what happens under the hood of AI and big data so we can make smart research, strategy, and design recommendations and be a partner in decision-making about measuring AI’s impact.

This is not just my opinion—colleges across the US are developing programs that focus on AI and ethics at a rapid pace. Selecting the right program among these new programs could be a bit fraught—if you consider the ethical implications of your education in AI and ethics.

Massachusetts Institute of Technology (MIT) was one of the first to announce the founding of its College of Computing, with a mandate to focus on AI and machine learning. Stephen A. Schwarzman, CEO and cofounder of the private-equity firm Blackstone, funded the founding of MIT’s new College of Computing, which will bear his name. Private-equity firms are notorious for dismantling the companies they claim to be saving. Let’s hope that Blackstone does not feed the AI algorithms that come out of MIT back into the company’s already dubious business processes—perhaps for the purpose of perpetuating existing inequities.

More recently, David Greene, the president of Colby College—a small liberal-arts college in rural Maine—announced the founding of an AI institute, whose stated goal is teaching AI in the context of subjects such as “history, gender studies, and biology.” Andrew Davis, president of Davis Selected Advisors, an investment-management company, funded the institute, which will bear his name. In a conversation with Marketplace’s Molly Wood, Greene said this about AI:

“I think that we need to have a whole cohort of students from different backgrounds and experiences who are really leading AI and not being led by it. So one of the beauties of people who are trained in the liberal arts is that they really understand how to come at a problem from multiple, different angles. They understand history in context. They understand how things play out over time, and not just the near-term impact of something, but what happens over a longer period. How do you look at that impact and understand and predict what might happen if you actually make this decision versus that decision? And right now, because things are so narrow, we’re missing much of that. And I think the more that we have people who are coming from liberal arts backgrounds, who are really raising the kind of questions that will ultimately shape AI in more positive ways, the better off we’ll be.”

I hope Colby achieves this goal and that the learnings from this institute won’t be used to power what Cathy O’Neil calls “Weapons of Math Destruction.” After all, Wall Street specifically and financial services generally are among O’Neil’s biggest offenders in her discussion of algorithmic discrimination.

Indiana University is kicking off a pilot program this summer that starts even earlier in the education funnel. The program, AI Goes Rural, targets middle-school students in Indiana. Tina Closser, the program’s science, technology, engineering, and math coordinator, said the program will incorporate ethics into its curriculum: “There’s a lot out there about the technical side of AI and how it works, but we will be talking about what AI does and how it affects [students’] lives.” One of the goals of the program is to create a STEM pipeline to the Department of Defense, which is funding the program via the Naval Surface Warfare Center. As a parent, I wonder whether parents can opt out of this program on behalf of their kids. I also wonder whether the Trolley Problem should be a mandatory thought exercise in every middle school, high school, and college, starting yesterday.

Someone Has to Advocate for Ethics

Someone has to take on the role of advocating for ethical data analytics and AI. I believe that UX professionals are best positioned to fill this role. We should be challenging both people’s mental models of our work and our values. After all, if we are the voice of the user, we should be speaking up for the users we hear from in surveys and user interviews, the users we see during observational studies and usability testing, and the users we cannot see in the big data sets that are driving the algorithms and powering the applications that we’re designing. As David Greene has said, we should be leading AI, not being led by it. 

Principal/Founder, Black Pepper

Brookline, Massachusetts, USA

Sarah Pagliaccio is Founder and UX Guru at Black Pepper, a digital studio that provides customer research, UX design, and usability-consulting services. Sarah designs complex mobile and desktop Web apps for healthcare, financial-services, not-for-profit, and higher-education organizations. Her focus is on UX best practices, repeatable design patterns, and accessible design solutions that are based on data-driven user research. Sarah researches and writes about bias in artificial intelligence (AI)—harnessing big data and machine learning to improve the UX design process—and Shakespeare. Sarah teaches user-centered design and interaction design at the Brandeis University Rabb School of Graduate Professional Studies and Lesley University College of Art + Design.
