Don Norman: Designing For People

Nielsen Norman Group

The driverless car revolution must proceed with caution

Article written for and published in the San Diego Union-Tribune, Saturday, March 24, 2018. Slightly modified April 1, 2018 to delete my erroneous statistical argument.

Imperfect automation, continually getting better? Or distracted drivers, continually getting worse? Choose.

Recently, one of Uber's autonomous automobiles was involved in an accident in which a pedestrian was killed. During the three years that my colleagues and I have been doing research on self-driving cars, this is the first such death. Compare this single death with the 120,000 people who have been killed in automobile accidents in the United States over that same period: roughly 100 people each day. However, making a statistical argument from a single incident is foolhardy (I did so in an earlier version of this note, and I was wrong to do so). Nonetheless, the several deaths that have now taken place in highly automated cars serve as a warning. Although the other deaths did not involve self-driving cars, their drivers treated the cars as if they were self-driving. Overtrust is dangerous.

What lesson should we learn? Automobile manufacturers are rushing to add more and more automation to their existing cars, promising to have fully automated vehicles within a few years. They need to slow down.

Why should we have fully automated cars? Because they promise many benefits: fewer deaths, injuries, and accidents; no more drunk or distracted driving; more efficient commuting; and increased mobility for those who cannot or do not wish to drive.

However, we need caution. New technology is always problematic. It can take years, even decades, to make technology safe and reliable amid difficult environmental conditions, unexpected situations, and the ever-unpredictable behavior of pedestrians, bicyclists, motorcyclists, and skateboarders (to name a few). At the Design Lab at the University of California, San Diego, our researchers have observed skateboarders and bicyclists zooming down sidewalks into the streets, and people crossing city streets with eyes firmly fixed on phones or tablets. Driving on major highways is easy compared with driving on urban and city streets.

Today, tests are performed with "safety drivers": people inside the vehicle ready to take over if something goes wrong. This is a false hope. Almost 50 years of research shows that people are not good at monitoring for long hours and then suddenly leaping into action when difficulties arise.

In the Uber accident, the video indicates that the car did not see the woman and her bicycle, that the safety driver was not looking until roughly 2 seconds before the crash, and that the woman walking the bicycle was jaywalking and did not appear to notice the automobile until it was too late, even though it was night and the car's headlights were on. But even had the driver been watching the road, the driver might not have been able to react quickly enough. Studies have shown that it can take safety drivers up to 20 seconds to respond. The car was traveling at 40 mph, roughly 60 feet per second.
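The arithmetic behind those figures is worth making explicit. A minimal back-of-envelope check, using only the speed and response time quoted above:

```python
# Back-of-envelope check of the figures in the article.
# The 40 mph speed and 20-second response time come from the text above.
MPH_TO_FPS = 5280 / 3600          # feet per mile divided by seconds per hour

speed_mph = 40                    # reported speed of the vehicle
speed_fps = speed_mph * MPH_TO_FPS
print(f"{speed_fps:.1f} ft/s")    # prints "58.7 ft/s" -- roughly 60

response_s = 20                   # upper bound on safety-driver response time
distance_ft = speed_fps * response_s
print(f"{distance_ft:.0f} ft")    # prints "1173 ft" covered while responding
```

In other words, during a worst-case 20-second response, a car at that speed travels well over a thousand feet, which is why a safety driver who is not already watching cannot be counted on to prevent a crash.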

The Food and Drug Administration (FDA) requires the medical industry to behave cautiously in its introduction of new devices and medications: new treatments or devices can sometimes do patients more harm than good, so each must undergo clinical trials before it can be released. We need something similar for self-driving autos: a neutral, trusted agency to certify safety before we let them on the roads. This could be a government agency or a high-quality private company such as UL (a global safety consulting and certification company formerly known as Underwriters Laboratories). Insisting on a safety certificate might put a salutary slowdown in today's mad race.

The potential for autonomous vehicles to prevent tremendous numbers of deaths and injuries while increasing our quality of life provides strong support for the eventual introduction of fully automated vehicles. Nonetheless, just as new medicines and medical devices enhance lives yet are introduced cautiously, with carefully controlled tests, we must do the same with our autonomous vehicles. I look forward to the day when my self-driving car will free me from the tedium and danger of driving. But that day is not yet here.

Norman is professor and director of the Design Lab at the University of California San Diego, which, among other things, studies autonomous driving. Norman is a former VP of Apple and author of "The Design of Future Things," which discusses autonomous cars, as well as the best-selling "The Design of Everyday Things." Some of the research of the Design Lab is supported by the automobile industry (Nissan, Toyota, and others currently being discussed).