Over the past couple of weeks, Tesla has made headlines for its autopilot software, which allows the company's Model S sedan to drive itself under certain conditions.
While fans and futurists have hailed this as the arrival of the fully autonomous car, some reports from owners suggest that the technology is far from perfect. (That's why Tesla says that even in autopilot mode, drivers should keep their hands on the steering wheel.)
By coincidence, late last week, Michael Sivak and Brandon Schoettle from the University of Michigan Transportation Research Institute published a new report entitled "Should We Require Licensing Tests and Graduated Licensing for Self-Driving Vehicles?" In it, they discuss some of the practical and regulatory matters involved in making autonomous cars road-ready.
To do so, they look at the driving skills that autonomous vehicles need to master. Could a graduated licensing system, like the one that human drivers undergo, be applied to autonomous cars?
While Sivak and Schoettle acknowledge the importance of graduated driver licensing (GDL) programs for young motorists, they say that the same process doesn't always make sense for self-driving vehicles. For example, while some GDL programs prevent novices from driving at night, the same approach doesn't work for autonomous cars:
"For self-driving vehicles, experience with daytime driving does not improve nighttime performance. Instead, good nighttime performance requires everything that good daytime performance does, plus sensors that provide the necessary information even at low levels of illumination. Thus, the GDL approach would not be appropriate here either."
For the most part, Sivak and Schoettle argue that the best approach for licensing self-driving vehicles will be to ensure that they work in all conditions. That includes at night and during inclement weather, which remain problem spots for many autonomous systems being tested today.
However, the biggest obstacle that stands between autonomous cars and the open road is, ironically, the computer that handles the driving. That's because, as great as computers are at making calculations, they lack some essential human traits -- for example, pattern recognition, moral reasoning, and aggression:
Pattern recognition is a major shortcoming of most computers, which is why CAPTCHA systems work so well on websites. However, pattern recognition will be a hugely important skill for autonomous vehicles because it will allow them to make sense of unfamiliar terrain. That'll come in especially handy in construction zones and when unusual or unexpected objects land in the roadway, like downed power lines, fallen tree branches, or runaway shopping carts.
Moral reasoning is equally important and raises many uncomfortable issues. For example, if an autonomous car finds itself in a situation where a crash is unavoidable, will it aim for the nearest lamppost, killing the lone driver, or will it drive onto a crowded sidewalk, potentially sparing the driver's life but endangering many more? To handle such situations, a self-driving vehicle will have to be programmed with tradeoffs that many drivers probably wouldn't accept.
But at this point, the most pressing problem could be teaching autonomous cars to drive as aggressively as humans do. Sivak and Schoettle cite a report on one of Google's self-driving vehicles, which had difficulty exiting a roundabout because "it decided the safest thing to do was to keep going around". Human drivers who make rolling stops and exceed posted speed limits pose similar dilemmas for autonomous cars programmed to follow the law. As the researchers note:
" The fact that human drivers are relatively lax about many relevant laws and regulations creates a real quandary: Should manufacturers be allowed to program a vehicle to willfully break applicable laws? If so, which laws and to what extent? A related question—one which is connected to the main issue of this paper—is as follows: If, eventually, there would be testing procedures for self-driving vehicles, would one pass by obeying the laws or by disobeying the laws?"
While Sivak and Schoettle don't offer any hard-and-fast solutions to the problem of "licensing" autonomous vehicles, they do a great job of thinking through the hurdles the cars will have to overcome. You can find an abstract of the report here.