Column written for Interactions, volume 14, issue 4. © CACM, 2007. This is the author's version of the work. It is posted here by permission of ACM for your personal use. It may be redistributed for non-commercial use only, provided this paragraph is included.
In a previous column I discussed the re-emergence of command-line languages. Once these were the way we controlled our operating systems and applications. Now they are re-emerging within search engines. They are hidden and not easy to learn about, but I expect them to grow in power and, over time, become the dominant means of interaction.
In this column I talk of a second trend, one that also has much earlier origins: the return to physical controls and devices. In the theoretical fields that underlie our field, this is called embodiment: see Paul Dourish's book "Where the Action Is." But the trend is far more extensive than is covered by research on tangible objects, and somewhat different from the philosophical foundations implied by embodiment, so I use the term "physicality."
Physicality: the return to physical devices, where we control things by physical body movement, by turning, moving, and manipulating appropriate mechanical devices.
We have evolved as physical creatures. We live in a complex, three-dimensional world filled with physical objects. We are analog beings in an artificial world of digital devices, devices that abstract what is powerful and good from the physical world and turn it into information spaces, usually in arbitrary ways. These new approaches put the body back into the picture. They require us to exert control through physical action: through mechanical devices rather than electronic or graphical ones, through the physical rather than the virtual.
At one point, when digital circuits took over the control of such mundane objects as automobile radios, physical controls were removed. Ugh. The most recent advance in automobile radios and other audio equipment is to reintroduce knobs for tuning and loudness control. Even BMW, in its attempt to replace all knobs, buttons, and switches with a single control knob and a complex menu hierarchy, has been forced to bring back physical switches and knobs.
Perhaps the most dramatic example of this trend is the Nintendo Wii game machine, where physical movement is the major method of interacting with its video games. The Wii has completely changed the game world: kudos to Nintendo! Tablet computers are slowly inching toward respectability because the joy of writing and drawing directly on a page or upon images is powerful, especially when coupled with a machine that also allows the more standard mouse-based pointing and typing to work just as before. The result is the best of all worlds.
Physical devices have immediate design virtues, but they require different rules of engagement from those we are used to with the mouse movements and clicks of the traditional keyboard-and-mouse interface. Designers have to learn how to translate mechanical action and directness into control of the task.
As we switch to tangible objects and physical controls, new principles of interaction have to be learned and old ones discarded. With the Wii, developers discovered that the former methods didn't always apply. In traditional game hardware, when players want an action to take place, they push a button. With the Wii, the action depends upon the situation. To release a bowling ball, for example, one releases the button. It makes sense when I write it, but I suspect the bowling game's designers discovered this through trial and error, plus a flash of insight. Not all of the Wii's games have yet grasped the new principles. This will provide fertile ground for researchers in HCI.
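The press-versus-release distinction can be made concrete in code. The sketch below is purely illustrative, not any real Wii API: the class name and event-handler names are my own invention. It shows the principle that the launch of the ball is mapped to the moment the button comes up, mirroring the act of letting go of a real bowling ball, while the press itself merely grips.

```python
# Hypothetical sketch of the Wii bowling principle: the meaningful
# action fires on button *release*, not on button press.
# All names here (BowlingGesture, on_button_down, on_button_up)
# are illustrative assumptions, not a real game API.

class BowlingGesture:
    """The ball is held while the button is down and thrown
    at the moment the button comes up."""

    def __init__(self):
        self.holding = False
        self.thrown = False

    def on_button_down(self):
        # Pressing grips the ball; nothing is launched yet.
        self.holding = True

    def on_button_up(self):
        # The action fires on release, like letting go of a real ball.
        if self.holding:
            self.holding = False
            self.thrown = True

game = BowlingGesture()
game.on_button_down()   # player grips the ball
assert not game.thrown  # pressing alone launches nothing
game.on_button_up()     # player lets go at the end of the swing
assert game.thrown
```

The design choice is the point: the same physical button now carries different meanings depending on the gesture in progress, which is exactly the kind of context-dependent mapping that traditional push-to-act game controls never needed.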
Physical devices, what a breakthrough! But wait a minute, isn't this where the machine age started, with mechanical devices and controls? Yup. Just as command-line interfaces, now available in the quasi-natural-language format used within search engines, are a throwback to earlier times, but with improvements, so too is the return to physical controls a throwback to the earlier mechanical era, with improvements. It's about time we returned to our roots, to something intended for people.
One interesting implication of the movement toward physical interfaces is that the dominant discipline for the technology of human interaction might also move from Computer Science back to Mechanical Engineering (which is really where it started many years ago). New disciplines will have to be learned, for example control theory and mechatronics. Mechatronics is the combination of mechanical engineering mechanisms with electronics (and computer science). Where does one learn mechatronics? CS departments shun the mechanical, so you won't find it taught there. The social sciences shun the engineering, so you won't find it there. Mechanical Engineering departments almost always teach it, but devoid of contact with human beings. Training in mechatronics is common in forward-looking design schools, because they know that the design of future things will include a hefty dose of intelligent mechanics and electronics, and in the best of design schools, the people for whom the designs are intended are always a prime consideration.
Part of the future of design is that of smart, intelligent devices, where almost everything will have a microprocessor built in, plus motors, actuators, and a rich assortment of sensors, transducers, and communication devices. If the future is a return to mechanical systems, mechatronics is one of the key technological underpinnings of their operation. Mechatronics taught with an understanding of how people will interact with the resulting devices. Taught with an understanding of all the critical areas of design: mechanical, computer, and electrical engineering; the social sciences; business; and aesthetics. But where is one to gain skills in all of these areas? Within the university, each component is a separate discipline, sometimes not even on speaking terms with the others, a social separation that unfortunately can persist into the workplace. Not in the arts or sciences, for they are often dismissive of both applications and business. Similarly, business schools lack the emphasis on technology and aesthetics and, in far too many cases, on the social side. Design schools and departments have their own deficiencies, sometimes attempting to cover the entire gamut, but without the depth that comes from within the discipline. Fortunately, many individuals have put together the requisite skills. Time for our educational institutions to catch up.
Physicality: the return to mechanical controls, coupled with intelligent, embedded processors and communication. That is one path back to the future.
Don Norman wears many hats, including co-founder of the Nielsen Norman group, Professor at Northwestern University where he co-directs the Segal Design Institute, and author of The Design of Future Things (Nov 2007). He lives at www.jnd.org.