Tesla relies on poorly designed safeguards for its self-driving features that invite misuse, and those safeguards were a “major factor” in a widely reported fatal crash last year, the federal government says.
In a meeting discussing the findings of its investigation into the crash, the National Transportation Safety Board (NTSB) said Tesla’s inadequate driver monitoring enabled Autopilot use far exceeding the system’s operational limitations. One board member further criticized the company for “speaking out of both sides of its mouth” with regard to a lack of warnings in the owner’s manual about improper use.
The crash, in which 40-year-old Joshua Brown died instantly when his Tesla Model S struck the underside of a tractor trailer, sparked a national discussion over the potential misuse of self-driving technologies.
Tesla cooperated with the NTSB during its investigation, which concluded that the system itself shoulders some of the responsibility for the accident. The Autopilot system relies on hardware that is not designed to monitor, assess, and account for cross traffic; as a result, the Model S did not recognize that the truck was turning left across its path and took no evasive action.
Crucially, however, the NTSB doesn’t fault the system’s road monitoring, but rather the failsafes Tesla employs against overdependence.
According to NTSB findings, Brown was not watching a movie, nor was fatigue a contributing factor, contrary to early speculation. He did, however, rely on Autopilot for an extended period, and on a type of road for which it is not designed. Ultimately, the report says, the crash was avoidable.
The system prompted Brown to pay attention to the road on six separate occasions, and each time his touching the steering wheel satisfied the system that he was monitoring. Tesla has since modified its software to automatically disengage after three such instances.
In perhaps the harshest phrasing of the NTSB meeting, one member accused Tesla of “speaking out of both sides of their [sic] mouth,” which contributed to driver overuse of the system. He cited contradictory passages in the owner’s manual and Tesla’s lack of a formal warning about the system’s limits. One passage states that Auto Steer is intended only for use on freeways and highways that feature on- and off-ramps, while another discusses speed limitations when driving on residential or undivided roads.
By putting the use limitations in a warning, he said, the issue would be more black and white. “A warning has a specific meaning,” he said, “that if you do this, you might die.” Without the warning, Tesla’s system “invites misuse,” according to the board.
Finally, the NTSB made a set of recommendations, aimed at all manufacturers, that self-driving systems employ more effective methods of monitoring driver awareness and disengage if the driver is deemed to be paying unsatisfactory attention.
- First, it cited an NHTSA recommendation that in-car entertainment interactions take no longer than two seconds, meaning that prolonged distractions enabled by systems found in Teslas and other cars are unacceptable.
- Second, it recommended that eye monitoring replace steering wheel input monitoring, since drivers like Brown could fool the latter without ever looking up.
- Third, the NTSB suggested that all manufacturers institute a new failsafe that prevents a self-driving system from engaging on any road for which it’s not designed. With GPS and other monitoring systems, it argued, vehicles know if they’re on a suitable road, sometimes before the driver does.
- Finally, the NTSB pointed out that vehicle-to-vehicle communication, in which cars silently broadcast their position and direction to one another, could have prevented this accident.