The 'Autopilot' system used in Tesla automobiles was partly to blame for a fatal May 2016 crash in Florida where the electric car slammed into a truck, a US safety report concluded today.
The driver’s “overreliance” on the Tesla system, designed as a semi-autonomous driving system to be used with a human operator, permitted “prolonged disengagement” that led to the collision with the freight trailer, the National Transportation Safety Board report said.
The system also allowed the driver to engage Autopilot on a road for which it was not intended, NTSB officials said.
“Tesla allowed the driver to use the system outside the environment for which it was designed,” said NTSB chair Robert Sumwalt.
“And the system gave too much leeway to the driver to divert his attention to something other than driving. The result was a collision that frankly should never have happened.”
The report looked at factors behind the accident, in which 40-year-old Tesla enthusiast Joshua Brown died after failing to respond to seven warnings from the Tesla system to return to active driving.
Since the accident, Tesla has updated its Autopilot system to shut off if a driver fails to respond after three warnings. NTSB staff said during a hearing that this change would have disabled Autopilot in the Florida crash.
However, NTSB staff pointed to other defects in Autopilot that have not been addressed, such as the fact that Brown was driving on a road on which Autopilot was not intended to be used.
The Tesla system, despite being able to take readings of speed limits and other key factors, does not shut off Autopilot on such roads.
“It is not geofenced not to happen in certain locations,” said NTSB analyst Deborah Bruce.
At the time of the crash, Brown was travelling at 74 miles (119 km) per hour in an area where the speed limit was 65 mph when he collided with a semitrailer making a left turn across his path. The Tesla went under the trailer and smashed into a utility pole, according to investigators.
NTSB staff did not reach conclusions as to the reason for the Tesla driver’s inattention, or for why the truck driver did not slow down.
The Florida Highway Patrol found trace amounts of marijuana in the truck driver’s blood, but the NTSB was unable to determine if the drug played a role in the accident.
The report has been closely watched because Tesla is at the forefront of automated driving, a technology with the potential to enhance overall mobility but also fraught with potential safety problems.
Tesla said in a statement after the NTSB report that its Autopilot system “significantly increases safety, as NHTSA has found that it reduces accident rates by 40 per cent.”
The automaker added that “we appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology.”
The company said it would “continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”
In January, an investigation by the US Transportation Department said it found no “defect” in the Tesla Autopilot system.