NTSB Faults Tesla in Self-Driving Vehicle Crash

The National Transportation Safety Board (NTSB) recently concluded its investigation into a fatal crash involving Tesla’s Autopilot system. The agency found that the technology, which can autonomously steer and control the vehicle, “played a major role” in the accident.

As previously discussed on this San Diego Injury Blog, the Tesla vehicle was operating in Autopilot mode when it slammed into a tractor-trailer, killing 40-year-old Florida man Joshua Brown. His Model S went underneath the trailer of a truck that had turned left in front of the vehicle. According to Tesla, “neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

Autopilot is also suspected of contributing to several other serious car accidents around the world. In China, a Model S inexplicably slammed into a road sweeper without braking and killed its 23-year-old driver. In Germany, a Tesla vehicle smashed into a construction barrier while traveling at a high rate of speed, seriously injuring the driver.

While Tesla has publicly emphasized that Autopilot “is new technology and still in a public beta phase,” the feature is currently available on 25,000 Tesla Motors Model S cars. Critics also contend that the carmaker is overselling the capabilities of the self-driving feature.

Investigation into Florida Tesla Crash

The NTSB agreed with the National Highway Traffic Safety Administration’s conclusion that the self-driving technology performed as intended and did not suffer any malfunctions that contributed to the crash. However, its report further found that the system’s design is flawed because it allows drivers to rely on it too heavily.

“How many more lives must be lost and crashes happen before Tesla Chairman Elon Musk will take responsibility and act to protect our safety?” asked John M. Simpson, Consumer Watchdog Privacy Project Director. “The problem is that Tesla encourages people to believe Autopilot can do more than it really can,” Simpson added. “The name itself is a huge problem.”

The National Highway Traffic Safety Administration conducted its own investigation into the Florida crash. In California, the Department of Motor Vehicles is working to amend its autonomous vehicle regulations to prohibit the use of terms like “autopilot” to describe vehicles that still require significant human involvement to operate safely. Under the draft regulations:

A vehicle cannot be advertised as autonomous in California unless it meets the definition of “autonomous” specified in Vehicle Code §38750 and the autonomous vehicle regulations. The terms “self-driving”, “automated”, “autopilot”, and other statements that lead a reasonable person to believe a vehicle is autonomous constitute advertising regulated by the truth-in-advertising provisions in the Vehicle Code.

We will be closely tracking these regulations and will post updates as they become available.

According to NTSB Chairman Robert Sumwalt:

“Tesla’s system worked as designed, but it was designed to perform limited tasks in a limited range of environments. Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention to something other than driving. The result was a collision that, frankly, should have never happened.”

The fatal crash highlights that while many auto manufacturers are working to incorporate self-driving aids into new vehicles, the technology is still in its infancy. As a result, there are risks to drivers as well as everyone else who shares the road with these vehicles.

If you or someone you love has suffered serious injury in a California motor vehicle crash, don’t hesitate to contact a San Diego personal injury lawyer at the Law Offices of Robert Vaage for a free consultation.