Are we learning to trust self-drive vehicle technology?
In January 2018, the AAA released the findings of its latest annual survey of driver attitudes in the US. Sixty-three percent of American drivers report feeling afraid to ride in a fully self-driving vehicle. That is still a significant share of the country’s driving population, but it is a notable decrease from the 78% of drivers who reported similar misgivings in the AAA’s previous survey, released in early 2017. The change equates to 20 million more US drivers who would trust riding in a self-drive vehicle.
The study findings indicate that Millennial and male drivers are the most trusting of autonomous technologies, with only half reporting they would be afraid to ride in a self-driving car.
Still, developers of autonomous vehicle technology can’t afford to be complacent. Only 13% of drivers in the AAA survey report that they would feel safer sharing the road with a self-driving vehicle, while nearly half (46%) would actually feel less safe. Others say they are indifferent (37%) or unsure (4%).
My colleague Iyad Rahwan, the AT&T Career Development Professor in the MIT Media Lab, has studied this trust issue. In a recent MIT News interview he says that trust in autonomous vehicles “will determine how widely they are adopted by consumers, and how tolerated they are by everyone else.”
In many ways we are entering new territory, Rahwan points out. Self-drive vehicles are not passive objects; they are proactive, autonomous and adaptive. And they can learn behaviors that differ from the ones they were originally programmed with.
Several difficult challenges will have to be overcome if society is to bridge this trust gap. One is technical: how to build artificial intelligence (AI) systems capable of driving a car safely. There are also legal and regulatory questions to answer. For example, who is liable for different kinds of faults?
A third class of challenges is psychological in nature: people must feel comfortable putting their lives in the hands of AI. The nature of the trips they take also matters. People moved with little fanfare from human-operated elevators to autonomous ones, yet few individuals would fly on an autonomous airplane, even though modern airplanes are effectively drones.
Rahwan sees three important psychological barriers. One is general concern over the ethical dilemmas associated with self-driving: for example, how to weigh passenger safety against the safety of pedestrians. His research suggests that people believe autonomous vehicles should minimize overall harm, yet prefer to buy cars that always prioritize their own interests.
A second issue is that people may overestimate the risk of dying in a crash caused by an autonomous vehicle, even if these cars are, on average, safer than conventional models. In other words, people don’t always reason about risk in an unbiased way. Compare, for example, the headlines generated by an air crash, despite commercial aviation’s safety record (2017 was the seventh straight year in which nobody died in a crash on a United States-certificated scheduled airline operating anywhere in the world), with the public’s acceptance of the more than 40,000 deaths and millions of serious injuries caused by road accidents in the US every year. Furthermore, fear of flying is more prevalent than fear of driving.
A third factor is a lack of transparency about what self-drive vehicles are thinking; it can be difficult for humans to predict the behavior of these machines. A precondition of trust is predictability.
Given these issues, Rahwan believes it is critical that the industry make realistic promises about self-drive technology. Setting very high expectations can be a recipe for disaster.
Makers of automated vehicles must also be careful to build cars that appeal to buyers’ sensibilities.
A survey of more than 22,000 consumers carried out by the management consulting firm Deloitte found that US consumer interest in advanced vehicle automation has increased steadily over recent years. Of the 32 features tested in the study, the top five among US consumers are safety-related. These include technologies that recognize objects on the road and help avoid collisions, and that automatically block the driver from entering dangerous driving situations.
Interestingly, glitzier features that help buyers manage their daily lives were perceived as less useful. Examples include the ability to pay tolls automatically and to control automated systems in the home. Deloitte argues that one of the main reasons these features are less appealing is that many consumers are already comfortable using their smartphones to accomplish such tasks.
Another factor to consider is the presence of automated trucks on our roads. As I argue in my post “A Slow Merge for Truck Automation,” makers of commercial vehicles must navigate a host of speed bumps before self-drive trucks are plying our highways. It may be a long time before we see fully automated, dock-to-dock truck operation. In the meantime, we will see hybrid solutions, such as exit-to-exit (E2E) highway autonomous truck operation complemented by driver-controlled local delivery, along with operational improvements such as platooning and assisted driving.
Today, it is hard to imagine that the idea of self-drive trucks on the road inspires any more confidence in car drivers than self-driving cars do. However, as consumers’ trust in self-drive technology builds and they become more accustomed to automation in their own vehicles, the image of a driverless commercial rig will surely become less alarming.
The technical, legal and safety implications of self-drive technology will be discussed in detail at the 2018 Crossroads conference organized by the MIT Center for Transportation & Logistics. The event will take place on April 17, 2018, at MIT, Cambridge, MA. Register for the conference here.
This article was written by Yossi Sheffi, Elisha Gray II Professor of Engineering Systems at MIT, and Director of the MIT Center for Transportation & Logistics.