NTSB Finds Tesla Autopilot Partly to Blame for Fatal Crash

The U.S. National Transportation Safety Board (NTSB) has concluded that the crash that killed the driver of a 2015 Tesla Model S electric sedan in Florida last year was at least partly due to the limitations of "system safeguards" on the vehicle's Autopilot semiautonomous feature.

According to Reuters, NTSB chairman Robert Sumwalt said: "Tesla allowed the driver to use the system outside of the environment for which it was designed and the system gave far too much leeway to the driver to divert his attention."

Autopilot is designed to control the steering and speed of a vehicle on a highway with exit and entrance ramps, well-defined medians and clear lane markings. Because it's not intended as a fully self-driving system, it repeatedly issues visual and audible warnings reminding the driver to pay attention and keep his or her hands on the steering wheel.

But in January, both the NTSB and the National Highway Traffic Safety Administration (NHTSA) determined that Joshua Brown, the driver of the Model S, had set the vehicle's cruise control at 74 mph (above the road's 65-mph limit), was not driving on a controlled-access highway and had ignored the system's warnings to remain alert.

So when a semitruck turned left across the path of Brown's vehicle, the Autopilot system failed to respond because it's not designed to detect crossing traffic, and the driver did not apply the brakes or otherwise take control. As a result, the Model S crashed into the side of the truck, killing Brown instantly.

At the time, NHTSA concluded that the vehicle had no defects and that Autopilot had performed as designed. And NTSB attributed the crash to driver error.

Now, however, NTSB says that Autopilot's "operational design" was at least a contributing factor to the crash because, as configured at the time, it allowed drivers to keep their hands off the steering wheel and otherwise let their attention wander from the road for extended periods. In other words, drivers could override or ignore the system's warnings, putting themselves at risk of a collision.

NTSB has devised a number of recommendations for automakers developing partially autonomous vehicles. These include going beyond simple alerts to ensure driver engagement, blocking the use of a self-driving system beyond the limits of its design, and making sure these systems are only used on specific types of roads.

Tesla responded that it would evaluate the agency's recommendations and "will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."

Tesla has continuously updated Autopilot since its introduction. For example, the latest version doesn't just give warnings; it will shut off completely if the driver doesn't take control of the wheel.

Although the Tesla Autopilot crash prompted continued NTSB scrutiny, the agency stressed that its recommendations apply to other automakers as well. It specifically mentioned Audi, BMW, Infiniti, Mercedes-Benz and Volvo, suggesting that their semiautonomous systems should also receive upgraded warnings and features that prevent drivers from using them improperly.
