NTSB Puts Partial Blame on Tesla and Autopilot in Fatal Model S Crash

Federal crash investigators say Tesla's failure to put limits on its Autopilot system, such as restricting its use to Interstate-like highways, is partly to blame for a crash in Florida last year that killed the driver of a Model S sedan when the car plowed into a truck.

Data released by Tesla showed that Autopilot, a suite of driver-assist features that can help steer and brake, was engaged for 37 minutes before the crash, and that the driver had his hands on the wheel for only 25 seconds of that time.

The National Transportation Safety Board (NTSB), which released its findings at a Tuesday hearing, called on Tesla and other automakers to limit how they implement driver-assist features, which could slow development of some self-driving technology but also increase safety as new features are fine-tuned.

“Tesla allowed the driver to use the system outside of the environment for which it was designed, and the system gave far too much leeway to the driver to divert his attention to something other than driving,” NTSB Chairman Robert Sumwalt said. “The result was a collision that should not have happened.”

Safety and consumer groups are likely to join the NTSB’s push for self-driving car tech that stresses safety, adding pressure on the industry to slow down and possibly rethink how it continues its march to completely autonomous vehicles.

The board also took the opportunity Tuesday to list its concerns about the current course of autonomous car technology, including the need for better sensors that can tell when a driver is not paying attention to the road.

The Tesla driver's inattention to the road and a truck driver who partially blocked the roadway were other probable causes of the crash, the safety board said. Those details were in documents the NTSB released in June.

But Tesla should have put more restrictions on the way its Autopilot driver-assist technology could be used, the NTSB said Tuesday. The system was supposed to be limited to highways with exit ramps, and programming using GPS could have locked out the system on the road near Williston, Fla., where the crash happened, the safety board concluded in its findings.
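As a rough illustration of the kind of GPS-based lockout the board described, here is a minimal sketch in Python. The map lookup, road classification, and all names are hypothetical, not Tesla's actual software:

```python
# Hypothetical sketch: restrict a driver-assist system to limited-access
# highways using GPS position and map metadata. Names, data structures,
# and the map lookup are illustrative, not Tesla's implementation.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    name: str
    limited_access: bool  # True only for divided highways with ramps

def lookup_road(lat: float, lon: float) -> RoadSegment:
    """Map-match a GPS fix to a road segment (stubbed for illustration)."""
    # A real system would query an onboard map database here.
    return RoadSegment(name="US-27A", limited_access=False)

def autopilot_permitted(lat: float, lon: float) -> bool:
    """Allow engagement only on roads the system was designed for."""
    return lookup_road(lat, lon).limited_access

if __name__ == "__main__":
    if not autopilot_permitted(29.37, -82.45):  # near Williston, Fla.
        print("Autopilot unavailable: not a limited-access highway")
```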

Tesla said in a statement Tuesday that previous analysis by the National Highway Traffic Safety Administration (NHTSA) found that Autopilot significantly improves the safety of its vehicles.

"We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology," the company statement said. "We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”

Tesla's Autopilot technology steers, brakes, and performs emergency crash-avoidance maneuvers. It was designed to work on Interstate-type highways at speeds of up to 90 mph.

The cruise control in Joshua Brown's Model S was set at 74 mph in a 65-mph zone. But Florida's Highway 27-A, where he was driving, isn't a true limited-access highway, because it has numerous intersections, driveways, and turn lanes.

Before the crash, Brown received seven visual warnings (a steering wheel symbol lighting up on the dash) and six audible chimes alerting him to put his hands on the steering wheel.

The NTSB acknowledged that Tesla had warnings in its owner's manuals, software updates, and customer agreement about the limitations of Autopilot technology. But some of the warnings were confusing and subject to interpretation, it said.

Automakers have a responsibility to make the consumer aware of technology limitations, said David Friedman, director of cars and product policy and analysis for Consumers Union, the policy and mobilization division of Consumer Reports. There needs to be a safe fallback approach when drivers overestimate their capabilities, he said.

"The NTSB got to the core safety challenge of partially automated, Level 2 vehicles," Friedman said. "These cars may seem like they can drive themselves, but they can't."

It’s a tempting technology for drivers, said Jake Fisher, director of auto testing at Consumer Reports.

“Tesla recommends using its Autopilot system only in certain situations, but it’s not hard to understand why drivers would use it wherever they could,” he said. “Tesla should only allow the system to be used in the situations that it is designed for.”

The safety board noted several areas where the technology could be improved, including systems that can better sense whether a driver is paying attention to the road.

Tesla's Autopilot relies on steering wheel sensors to tell whether a driver is touching the wheel from time to time. That system is too easy to fool, the safety board said. General Motors and other automakers are experimenting with facial-recognition cameras to better monitor driver behavior and attention.
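To illustrate why torque-based hands-on detection is a weak proxy for attention, here is a hedged sketch of a warning-escalation loop of the general sort described above. The thresholds, function names, and timing are invented for illustration:

```python
# Illustrative sketch of a hands-off-wheel warning escalation loop.
# Thresholds and interfaces are invented; they do not reflect Tesla's code.

import time

HANDS_OFF_VISUAL_S = 15   # seconds before a visual warning
HANDS_OFF_CHIME_S = 30    # seconds before an audible chime

def steering_torque() -> float:
    """Stub: torque (in Nm) the driver is applying to the wheel."""
    return 0.0  # a light hand resting on the wheel can also read near zero

def monitor_driver() -> None:
    hands_off_since = time.monotonic()
    while True:
        # Any brief tug on the wheel resets the timer -- which is exactly
        # why torque sensing measures intermittent force, not attention.
        if abs(steering_torque()) > 0.1:
            hands_off_since = time.monotonic()
        elapsed = time.monotonic() - hands_off_since
        if elapsed > HANDS_OFF_CHIME_S:
            print("CHIME: put your hands on the wheel")
        elif elapsed > HANDS_OFF_VISUAL_S:
            print("Visual: steering wheel icon lit")
        time.sleep(1.0)
```

A camera-based gaze monitor, by contrast, can estimate where the driver is actually looking rather than inferring attention from occasional wheel inputs.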

Vehicle-to-vehicle communications technology also could have prevented Brown’s crash, the safety board said.

The board reiterated a previous call for federal regulators to issue rules requiring all new vehicles to be equipped with the radio transmitters needed to automatically signal their speed and direction to surrounding vehicles. Such a system could be an important backup to a vehicle's own sensors.

NHTSA has proposed such a regulation, pending review.
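The broadcast the board envisions resembles the periodic "basic safety message" used in vehicle-to-vehicle research. Below is a simplified sketch of what such a message might contain; the field names and JSON encoding are illustrative stand-ins for the actual binary message standard:

```python
# Hedged sketch of a vehicle-to-vehicle "basic safety message": each
# vehicle periodically broadcasts its position, speed, and heading so
# nearby cars can anticipate conflicts. Field names are illustrative.

from dataclasses import dataclass
import json
import time

@dataclass
class BasicSafetyMessage:
    vehicle_id: str
    lat: float
    lon: float
    speed_mps: float     # meters per second
    heading_deg: float   # 0 = north, clockwise

    def encode(self) -> bytes:
        return json.dumps(self.__dict__).encode()

def broadcast_loop(radio_send) -> None:
    """Broadcast at 10 Hz, a rate commonly cited for V2V safety messages."""
    while True:
        msg = BasicSafetyMessage("veh-001", 29.37, -82.45, 33.0, 90.0)
        radio_send(msg.encode())
        time.sleep(0.1)
```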

As a NHTSA report did earlier this year, the NTSB said the Tesla's forward-collision warning (FCW) and automatic emergency braking (AEB) systems were designed only to detect vehicles directly ahead, not cross-traffic or other objects blocking the roadway.

Most vehicles from other manufacturers that have FCW and AEB share that same limitation, the safety board said, and manufacturers should do more to inform consumers about it.
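One hypothetical way to picture that limitation: braking logic tuned for same-direction traffic can deliberately filter out targets with large crossing motion, as in this invented sketch (all classes and thresholds are illustrative, not any manufacturer's actual algorithm):

```python
# Illustrative sketch of why a crossing vehicle can slip past FCW/AEB
# logic tuned for rear-end scenarios. Classes and thresholds are invented.

from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float
    closing_speed_mps: float
    lateral_speed_mps: float   # motion across the lane

def should_brake(t: RadarTarget) -> bool:
    # Tuned for in-path, same-direction traffic: a target moving quickly
    # across the lane is filtered out to avoid false alarms on adjacent
    # traffic and roadside objects.
    crossing = abs(t.lateral_speed_mps) > 2.0
    in_path_threat = t.closing_speed_mps > 0 and t.range_m < 60
    return in_path_threat and not crossing

# A tractor-trailer crossing the highway presents mostly lateral motion,
# so this filter rejects it even though it blocks the lane.
truck = RadarTarget(range_m=50, closing_speed_mps=30, lateral_speed_mps=6)
print(should_brake(truck))  # False
```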

A final area of concern for the NTSB was the state of event-data recorders, which store information about a crash.

While advanced-technology cars carry more and more sensors, cameras, and computer equipment that record what's happening on the road, the data aren't being collected in a way that's useful for crash investigations.

The limited set of variables that cars record today isn't enough to understand how self-driving cars interact with other traffic, NTSB officials said.

The NTSB said it had to rely on Tesla's cooperation to retrieve the data from the car in the Florida crash, and also to interpret it. Even though Tesla cooperated fully with the safety board, federal investigators will need the ability to evaluate crash data independently, the board said.


