Smartware are computing systems that require little active user input, integrate the digital and physical worlds, and continually learn on their own. In this final edition of our column, we'll consider how smartware's powerful capabilities will enable new interactions and user experiences that, over time, will become seamlessly integrated into our digital lives.
In this column on the future of computing, we'll look at how a handful of advances—including artificial intelligence (AI), the Internet of Things (IoT), sciences of human understanding such as neuroscience and genomics, and emerging delivery platforms such as 3D printers and virtual-reality (VR) headsets—will come together to transform software and hardware into something new that we're calling smartware.
A Tribute to Dead Machines
Humanity and technology are inseparable. Not only is technology present in every facet of civilization, it even predates recorded history. Each time we think we've identified the earliest cave paintings (such as the one by an unknown artist in Figure 1), stone tools, or use of wood for fuel, some archaeologist finds evidence that people started creating or using them even earlier. Indeed, while our own species, Homo sapiens, is only about 300,000 years old, the earliest stone tools are more than 3 million years old! Even before we were what we now call human, we were making technology.
Today, AI and machine learning are coming of age at the same time as a cluster of advances in the sciences, especially neuroscience and genomics, and in other technologies such as the IoT, additive fabrication (3D printing), and VR. Together, these technologies promise to create a radical inflection point on the same scale as personal computers in the 1970s, the Internet in the 1990s, and mobile computing in the 2000s. We call these collective technologies smartware.
AI is currently having a moment. In fact, the field has had many moments since its inception in 1956, with flurries of media excitement tempered only by the sober reality of what is actually possible. We have made massive strides in machine learning (the approach to AI that focuses on writing software that can independently learn and develop long after its human programmers have finished their coding), and those strides are transforming personal computing as we know it.
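To make that idea concrete, here is a minimal sketch, in plain Python with hypothetical example data, of what "learning" means in its simplest form: the program is never told the rule relating inputs to outputs; it infers one by repeatedly nudging two parameters to shrink its prediction error, a procedure known as gradient descent.

```python
# Minimal illustration of machine learning: fitting y = w*x + b by
# gradient descent. The underlying rule (w=2, b=1) is never written
# into the program; it is inferred from the example data alone.

# Hypothetical training data generated by the unknown rule y = 2x + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

w, b = 0.0, 0.0          # the program starts knowing nothing
learning_rate = 0.05

for step in range(2000):
    # Accumulate the gradient of the mean squared error over the data.
    grad_w = grad_b = 0.0
    for x, y in data:
        error = (w * x + b) - y
        grad_w += 2 * error * x / len(data)
        grad_b += 2 * error / len(data)
    # Nudge the parameters downhill; this step is the "learning."
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"learned w={w:.3f}, b={b:.3f}")  # converges toward w=2, b=1
```

Modern machine-learning systems apply essentially this same error-driven loop, scaled up from two parameters to millions or billions.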