
How Science Will Shape Smartware

Smartware

The evolution of computing

January 22, 2018

The push for education in STEM (Science, Technology, Engineering, and Mathematics) by governments and in business is now more than a decade old. Over this period, society has embraced intelligence and welcomed geekdom—at least on the surface. We see this in aspects of popular culture that condemn bullying, celebrate inclusivity, and make it admirable, if not cool, to be smart.

At the same time, scientific research is experiencing a golden age. One reason for this boom is the ubiquity of the Internet, which revolutionized communication and information dissemination in the 1990s and has fostered greater cross-disciplinary collaboration among researchers in disparate fields. New tools and technologies have galvanized the cross-pollination of ideas and revealed the intricacies and secrets of the human animal as never before. Over just the last two decades, we have developed plausible answers to questions that have bedeviled us for all of human history. This is an incredible time for scientific discovery and insight—one that will also have profound impacts on the everyday technologies that will surround us in the 2020s.


In our column on smartware, we have been exploring the impact of emerging technologies and science on product design and user experience. While this particular installment focuses on the scientific aspects, smartware leverages an amalgam of technological advances that are coming of age at approximately the same time—together creating a powerful ecosystem. Prominent among these technologies are artificial intelligence (AI), the Internet of Things (IoT), mixed-reality environments, and additive fabrication, more commonly known as 3D printing. While the near-term future these smartware technologies will usher in won’t necessarily realize the popular-culture promise of sentient robots or AI superintelligence, it will be very different from today’s world, in ways both magical and mysterious. Our growing ability to understand ourselves through scientific advances—what we call understanding us—translates directly into these capabilities.

Now, let’s look at two important fields that are generating precise data that supports our advanced understanding of human capabilities: genomics and neuroscience.

The Genomics Revolution

While the physician and biologist Friedrich Miescher, shown in Figure 1, first isolated DNA in 1869, we didn’t understand its double-helix structure until James D. Watson and Francis H. C. Crick articulated it in 1953, building on the prior work of Rosalind Franklin and others. It wasn’t until the 1970s that Fred Sanger sequenced the first genome, that of a virus. Then, in 2003, the Human Genome Project completed the sequencing of the entire human genome. So, while scientists generally recognize Watson and Crick’s work in the 1950s as the beginning of the field of genomics, mapping the entirety of the human genome has been possible for only about fifteen years. Since then, the field has progressed at a blinding pace.

Figure 1—Friedrich Miescher

Image source: Wikimedia Commons

In addition to figuring out how to map the human genome—and how to do so more quickly and affordably in the future—scientists are learning about the underlying structure and relationships of the genes they are mapping. This manifests in many important ways.

For example, it is now possible to predict people’s likelihood of suffering from a particular condition simply by reading their DNA. Scientists are even developing treatments based on an individual’s unique genome. Services such as Strata Oncology conduct advanced genetic tests for patients, then match them with clinical trials for drugs that are designed, not just for a patient’s condition, but specifically to target the DNA of the tumor tissue. This approach attacks disease at the deepest possible level, by understanding the underlying genomics of both the disease and the particular patient.
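To make the idea concrete, here is a minimal sketch, in Python, of how matching a tumor’s genomic profile to targeted clinical trials might work conceptually. The trial names, eligibility rules, and matching logic are hypothetical illustrations, not Strata Oncology’s actual system.

```python
# Hypothetical sketch: matching a tumor's genomic variants to clinical trials.
# The trials and eligibility rules below are invented for illustration.

TRIALS = [
    {"name": "Trial A", "targets": {"EGFR L858R"}},
    {"name": "Trial B", "targets": {"BRAF V600E", "KRAS G12C"}},
]

def match_trials(tumor_variants, trials=TRIALS):
    """Return the trials whose targeted variants appear in the tumor's profile."""
    return [t["name"] for t in trials if t["targets"] & set(tumor_variants)]

# Example: variants detected by a genetic test of a patient's tumor tissue
print(match_trials({"BRAF V600E", "TP53 R175H"}))  # ['Trial B']
```

Real matching platforms weigh many more factors, such as full eligibility criteria and patient history, but the core idea is the same: the tumor’s DNA, not just the diagnosis, drives the choice of treatment.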

The goal of curing cancer has been an important focus of scientific research—with a history spanning nearly 250 years, from Percivall Pott, who in 1775 demonstrated that environmental factors can cause cancer; to the groundbreaking work, in the 1950s and 1960s, of the father of modern chemotherapy, pediatric pathologist Sidney Farber; to the National Institutes of Health’s (NIH) Cancer Moonshot, which former US President Barack Obama and Vice President Joe Biden championed. It is in the genomic specificity of different types of cancer, and the ways they intersect with our unique DNA, that cures will finally manifest. This approach is revolutionary.

But there’s more: technology for changing parts of a person’s genome already exists. In 2012, scientists Jennifer Doudna and Emmanuelle Charpentier first proposed the use of CRISPR/Cas9, visualized in Figure 2, for programmable gene editing. Teams led by Feng Zhang and George Church separately demonstrated the technique in human cells—although the technology is still very much in its early stages. The scientific community has been slow to deploy the method in humans, in part because of ethical and safety concerns. Nevertheless, just last year, scientists used the technique to correct a mutation responsible for an inherited heart condition in human embryos.
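To see what programmable means here: Cas9 is directed by a guide RNA to a roughly 20-nucleotide DNA target that must sit immediately upstream of a short PAM motif, NGG for the commonly used Cas9 from Streptococcus pyogenes. The following Python sketch simply scans a made-up DNA string for candidate target sites; real guide design also involves scoring off-target risk and much more.

```python
import re

def find_cas9_targets(dna, guide_length=20):
    """Find candidate Cas9 target sites: a 20-nt protospacer followed by an NGG PAM.

    A simplified illustration; real guide-RNA design also considers
    off-target matches elsewhere in the genome, GC content, and more.
    """
    sites = []
    for i in range(len(dna) - guide_length - 2):
        protospacer = dna[i : i + guide_length]
        pam = dna[i + guide_length : i + guide_length + 3]
        if re.fullmatch(r"[ACGT]GG", pam):  # the NGG PAM motif
            sites.append((i, protospacer, pam))
    return sites

# A made-up DNA sequence, for illustration only
sequence = "ATGCGTACGTTAGCACGTGATCCGGATTACAGGTACGATCGGTACCGG"
for position, protospacer, pam in find_cas9_targets(sequence):
    print(position, protospacer, pam)
```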

For now, scientists are being careful to treat this advance as just one step in the scientific process—not the dawn of the designer baby. But the reality is that it is now possible to edit an embryo’s DNA to remove something undesirable and perhaps replace it with something more desirable. Whether in a place like the United States, where wealthy oligarchs can ultimately have whatever their billions of dollars can buy, or in a place like China, where the focus is on progress, with less consideration of ethical concerns, it’s only a matter of time until designer babies are actually possible—at least in some limited way.

Figure 2—CRISPR/Cas9

Image source: Ernesto del Aguila III, National Human Genome Research Institute (NHGRI), CC BY-NC 2.0

Understanding the Human Mind

Of course, genomics is not the only scientific field experiencing remarkable advancement today. Neuroscience has developed into what we might call the new physics—a gravity well drawing in many different scientific fields. Ultimately, it will give us answers to age-old questions about the whys, whats, and wherefores of the human animal. Interest in and study of the brain is as old as recorded human history; in fact, the earliest known reference to the brain dates to 17th-century BC Egypt. We have long understood the brain’s central role in how we function, but extending that knowledge to essential truths about the human condition remained elusive until just recently.

As with genomics, the origins of modern neuroscience trace back to the 1950s. In 1952, the Hodgkin-Huxley model provided a mathematical description of the transmission of electrical signals in a neuron. A decade later, Bernard Katz modeled neurotransmission across synapses. Then, in 1981, the Morris-Lecar model unified earlier independent models into something more complete. This litany of names and dates represents an ever-expanding, but still incomplete, picture of how the mind works. Over the last decade, however, neuroscience has advanced to the point where we understand not only the biological basis of how the brain works, but also its relationship to actual human behavior. This has given us insights into fundamental questions such as the causes of depression and Alzheimer’s disease, how the Internet is reshaping human memory, how an expectant mother’s diet affects her future offspring, and how we might erase or enhance memories.
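For a sense of what such a model looks like, the core of the Hodgkin-Huxley model is a single membrane equation, in which the voltage V across a neuron’s membrane changes as sodium, potassium, and leak currents flow through ion channels. This is the standard textbook form, given here as a summary rather than a derivation from the original paper:

```latex
C_m \frac{dV}{dt} = I_{\mathrm{ext}}
  - \bar{g}_{\mathrm{Na}}\, m^{3} h \,(V - E_{\mathrm{Na}})
  - \bar{g}_{\mathrm{K}}\, n^{4} \,(V - E_{\mathrm{K}})
  - \bar{g}_{L}\,(V - E_{L})
```

Here, C_m is the membrane capacitance, I_ext is an injected current, each E term is an ion’s reversal potential, and the gating variables m, h, and n evolve according to their own first-order kinetics; fitting those kinetics to measurements from the squid giant axon was Hodgkin and Huxley’s Nobel Prize–winning achievement.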

As an example of the sheer volume of data available to neuroscientists today, consider “Self Reflected Under White, Red, and Violet Light,” shown in Figure 3. This award-winning visualization of the human brain, by Greg Dunn, Brian Edwards, and Will Drinker, drew on numerous data sources to depict 500,000 neurons.

Neuroscience is also influencing other fields. The so-called soft sciences—fields such as psychology, sociology, and philosophy—can weave this tsunami of new empirical findings about how the human animal behaves into their work on the behavior of people, societies, and even the entire world.

Figure 3—A visualization of the human brain

Image source: The 2017 Vizzies—The National Science Foundation (NSF) Visual Challenge

Building the Identity Graph

Understanding the human genome and nervous system is already leading to hyper-specific cures for diseases, as well as a more sophisticated understanding of how and why we behave in certain ways—understanding we can leverage for greater well-being. The future will bring these insights out of the realm of our healthcare system and therapists and into the consumer products we use every day. We call this evolution the identity graph. Using essential data about human beings—data that could, with our consent, approach a genomic or neurological level of detail—it will become possible for the smartware environment in which we live to adapt to the very essence of who we are. This encompasses both the makeup of our DNA and who we are specifically, from moment to moment.
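As a thought experiment, the following sketch imagines, in Python, what a tiny fragment of an identity graph might look like as data: stable genomic traits paired with a fast-changing momentary state. Every field, value, and rule here is hypothetical; no such consumer schema exists today.

```python
from dataclasses import dataclass, field

@dataclass
class IdentityGraph:
    """A purely hypothetical identity-graph record for one person."""
    person_id: str
    genomic_traits: dict = field(default_factory=dict)  # slow-changing, from DNA
    current_state: dict = field(default_factory=dict)   # moment-to-moment context

    def recommend(self):
        """Toy adaptation rule for a smartware environment."""
        if (self.genomic_traits.get("caffeine_metabolizer") == "slow"
                and self.current_state.get("hour", 0) >= 15):
            return "Suggest a decaffeinated option"
        return "No adjustment"

me = IdentityGraph(
    person_id="example-001",
    genomic_traits={"caffeine_metabolizer": "slow"},  # e.g., a CYP1A2 variant
    current_state={"hour": 16, "mood": "focused"},
)
print(me.recommend())  # Suggest a decaffeinated option
```

The point is not the toy rule but the combination: a durable genomic layer and a momentary behavioral layer, queried together by the products around us.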

Altruistically, we can see how this might give us holistically better lives—in ways that range from our relationships, to how we spend our time, to finding the right work environment. More practically, it could optimize the ability of for-profit companies to tempt us to spend more of our money on their offerings by tapping into the biological underpinnings that are unique to each and every one of us. By the 2020s or 2030s, the precision, depth, and power of what our identity graph makes possible for each of us will seem like magic in comparison to today’s early algorithmic methods—or the clumsy, one-to-one marketing efforts of the 1990s.

In our next column, we’ll look at how the precise information from our identity graph—along with the powerful capabilities of artificial intelligence, the IoT, mixed-reality environments, and additive fabrication—will enable new interactions and user experiences as smartware evolves. 

Managing Director, SciStories LLC

Co-owner of Genius Games LLC

Boston, Massachusetts, USA

Dirk Knemeyer

As a social futurist, Dirk envisions solutions to system-level problems at the intersection of humanity, technology, and society. He is currently the managing director of SciStories LLC, a design agency working with biotech startups and research scientists. In addition to leading SciStories, Dirk is a co-owner of Genius Games LLC, a publisher of science and history games. He also cohosts and produces Creative Next, a podcast and research project exploring the future of creative work. Dirk has been a design entrepreneur for over 15 years, has raised institutional venture funding, and has enjoyed two successful exits. He earned a Master of Arts from the prestigious Popular Culture program at Bowling Green.

Principal at GoInvo

Boston, Massachusetts, USA

Jonathan Follett

At GoInvo, a healthcare design and innovation firm, Jon leads the company’s emerging technologies practice, working with clients such as Partners HealthCare, the Personal Genome Project, and Walgreens. Articles in The Atlantic, Forbes, The Huffington Post, and WIRED have featured his work. Jon has written or contributed to half a dozen non-fiction books on design, technology, and popular culture. He was the editor of O’Reilly Media’s Designing for Emerging Technologies, which came out in 2014. One of the first UX books of its kind, the work offers a glimpse into what future interactions and user experiences may be for rapidly developing technologies such as genomics, nano printers, and workforce robotics. Jon’s articles on UX and information design have been translated into Russian, Chinese, Spanish, Polish, and Portuguese. Jon has also coauthored a series of alt-culture books on UFOs and millennial madness and coauthored a science-fiction novel for young readers with New York Times bestselling author Matthew Holm, Marvin and the Moths, which Scholastic published in 2016. Jon holds a Bachelor’s degree in Advertising, with an English minor, from Boston University.
