On May 7, 2016, Joshua Brown was killed when his Tesla Model S collided with an 18-wheeler in Williston, Florida. News of the fatal accident was accompanied by plenty of speculation about what Brown had been doing at the time of the crash. A newly published, 538-page report from the National Transportation Safety Board aims to set the record straight.
The most sobering details come from document 37 of the NTSB publication, titled "Driver Assistance System - Factual Report". According to data recovered from the Model S, Brown set Autopilot to control the car for most of the trip and took his hands entirely off the wheel. In fact, during the final 37 minutes of the 41-minute trip, Brown's hands rested on the steering wheel for just 25 seconds--likely only long enough to clear Autopilot's visual and audio warnings:
"For the vast majority of the trip, the AUTOPILOT HANDS ON STATE remained at HANDS REQUIRED NOT DETECTED. Seven times during the course of the trip, the AUTOPILOT HANDS ON STATE transitioned to VISUAL WARNING. During six of these times, the AUTOPILOT HANDS ON STATE transitioned further to CHIME 1 before briefly transitioning to HANDS REQUIRED DETECTED for 1 to 3 seconds. During the course of the trip, approximately 37 minutes passed during which the Autopilot system was actively controlling the automobile in both lane assist and adaptive cruise control. During this period, the AUTOPILOT HANDS ON STATE was in HANDS REQUIRED DETECTED for 25 seconds. For the remainder of this period, the AUTOPILOT HANDS ON STATE was in HANDS REQUIRED NOT DETECTED, or in one of the visual or aural warning states."
Brown's last recorded action came two minutes before the fatal accident: he set the cruise control to 74 mph, nine miles per hour above the posted 65 mph speed limit. Brown made no attempt to brake or steer to avoid the collision with the big rig, even though the truck should have been visible to him for at least seven seconds.
Who's to blame?
Assigning blame in a case like this is complicated at best.
On a purely legal level, the truck driver would appear to be at fault, as he was crossing the highway at the time. (A medical report also found traces of marijuana in his system, though it's unclear whether THC played a role in the accident.) The driver heads to court tomorrow on charges of violating the right of way.
However, it's clear from the NTSB's findings that Brown relied heavily on Autopilot that day--too heavily by any standard, including Tesla's. Last September, Tesla updated its onboard software to curb such misuse: Autopilot now disables itself when drivers fail to respond to repeated warnings.
And that raises the question of Autopilot's culpability--and by extension, Tesla's. Almost as soon as Autopilot rolled out in the fall of 2015, Tesla owners began posting videos of themselves pushing the software to its limits. If Tesla hadn't previously suspected that drivers might misuse Autopilot, it should have gotten the picture very quickly. And yet safeguards didn't appear for almost a year. (That's to say nothing of Autopilot's failure to spot a semi crossing the road ahead of the car and take evasive action.)
That could be a problem for Tesla, especially now that the automaker is being sued over its "half-baked" Autopilot software--and now that the National Highway Traffic Safety Administration has determined that an automaker's self-driving software can be considered a "driver" for legal purposes. So far, however, Brown's family members haven't indicated any plans to sue Tesla themselves.
If you have time, you can read through the NTSB report in its entirety on the NTSB's website.