
NTSB slams Tesla, Apple and regulators over a fatal Autopilot crash


The nation’s top safety investigator slammed Tesla on Tuesday for failing to take adequate measures to prevent “foreseeable abuse” of its Autopilot driver-assistance technology, in a hearing into the fatal 2018 crash of a Tesla Model X SUV in Mountain View, Calif.

The National Transportation Safety Board said 38-year-old Walter Huang, an Apple software engineer, had Autopilot engaged in his 2018 Tesla Model X and was playing a video game on his iPhone when the car crashed into a defective safety barrier on U.S. Highway 101.

For the record:

6:15 p.m. Feb. 25, 2020: An earlier version of this article misstated the date of Huang’s crash.

The board also blamed the highway safety arm of the U.S. Department of Transportation for failing to properly regulate rapidly evolving robot-car technology.


“Government regulators have provided scant oversight” of Autopilot and the self-driving systems of other manufacturers, NTSB Chairman Robert Sumwalt said at a safety board meeting in Washington, D.C. The board adopted a long list of measures meant to reduce such crashes as “partially automated driving” technologies become more common in new vehicles.


Sumwalt pointed out that in 2017 the NTSB recommended that automakers design driver-assist systems to prevent driver inattention and misuse. Automakers including Volkswagen, Nissan and BMW reported on their efforts to meet the recommendations, but Tesla never responded.

“Sadly, one manufacturer has ignored us, and that manufacturer is Tesla,” Sumwalt said Tuesday. “We’ve heard nothing; we’re still waiting.”


Tesla couldn’t be reached for comment Tuesday.

Even Huang’s employer, Apple, came in for scathing criticism. Sumwalt noted that the Apple iPhone’s “do not disturb while driving” feature is optional, not a default setting. He said companies including Apple need to do more to encourage their employees not to use smartphones while driving.

Sumwalt noted that the federal Occupational Safety and Health Administration recommends that companies forbid employees from using personal devices for work, including email, while driving, whether the trip is for company business or not. Like most companies, Apple does not follow that recommendation, he said. Although a video game was playing on Huang’s phone, data records show he may have been texting before that.

“Apple has yet to recognize their own responsibility as an employer,” Sumwalt said. “They have failed to say [to their] 135,000 employees that we care about you, and we don’t want you to go out and kill yourself or others on the roadway. Apple has failed in that respect.”


Apple has not yet responded to a request for comment.

Sumwalt made clear the Mountain View crash was not an isolated incident, but illustrative of the safety issues involved as humans and robot systems increasingly share the driving, not just in Teslas but in vehicles from all manufacturers. “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars,” he said.

Autopilot is an automated driver-assist feature, sold as part of a $7,000 package Tesla calls “Full Self Driving Capability,” that can speed up, brake and change lanes automatically, although the driver is supposed to keep paying attention.

Other car companies make similar systems, though none is as technologically aggressive as Tesla’s, and none is marketed as “self-driving,” a term that even at Tesla remains aspirational. No manufacturer sells a fully self-driving car to individual customers today, and most industry experts say it will be years before one does. Tesla has said the upfront $7,000 buys current features plus self-driving features to be added over time.

According to NTSB investigators, Huang was driving his 2018 Tesla Model X on Autopilot when it sped up from 62 mph to 71 mph and plowed into a damaged safety barrier at the end of a concrete wall that divides a left-hand exit ramp from Highway 101, known locally as the Bayshore Freeway.

Huang had dropped off his youngest child at day care and was taking his regular commuting route to Apple offices in Sunnyvale. The investigation showed Autopilot engaged for nearly 19 minutes before the crash and that Huang’s hands were off the steering wheel in the last six seconds.



The impact twisted the car counterclockwise and into a freeway commuter lane to the right of the concrete wall, where two other cars collided with the Tesla. The front end of the Model X was sheared off, and the car’s battery burst into flames. Three men pulled Huang from the wreckage, and he was taken by ambulance to a hospital, where he was pronounced dead.

Two major factors contributed to the severity of the crash. One, with Autopilot in control, the Model X drove straight down the middle of a “gore lane,” a white-striped zone where cars aren’t supposed to go, and crashed head-on into a flexible steel “smart cushion” intended to soften the impact of a crash.

Two, the cushion, known as a crash attenuator, was already severely damaged. A Toyota Prius had crashed into it 11 days earlier, compressing it and leaving less protection against the 3-foot-tall concrete median wall behind it. Caltrans did not repair the device until three days after Huang’s death.

The NTSB said the California Highway Patrol failed to report that the safety barrier Huang’s Tesla hit had been damaged by a crash 11 days earlier. If the barrier “had been in repair the driver likely would have survived,” the board said.

Among the NTSB crash-investigation findings:

  • The vision processing on Huang’s car could not maintain an appropriate line of travel and steered the car into a steel safety barrier and a concrete wall.
  • The car’s collision avoidance system did not detect the crash barrier — and it wasn’t even designed to do so.
  • The car’s forward collision warning system did not provide an alert, and the automatic braking system did not activate.
  • Tesla didn’t provide a sufficient means of monitoring the driver’s lack of attention.

The NTSB aimed much of its criticism at the nation’s top highway safety regulator, the National Highway Traffic Safety Administration, which is an arm of the Transportation Department. The NHTSA — which has enforcement power and can recall cars with defective automotive technology — has failed to implement crash prevention recommendations issued by the NTSB, and instead “relies on waiting for problems to occur rather than addressing safety issues proactively,” the board said.

The NHTSA’s Office of Defects Investigation “failed to thoroughly investigate Tesla’s Autopilot design regarding the degree to which drivers are currently misusing the system,” the NTSB said.

Board member Jennifer Homendy said she thinks the NHTSA cares more about business than about safety. “Let me be clear. NHTSA’s mission is not to sell cars,” she said.

A spokesman for the safety regulator said “NHTSA is aware of NTSB’s report and will carefully review it.” He also pointed to documents in which the NHTSA offers “guidance” to developers of driver-assist systems.

Relatives of Huang were in the audience at the meeting. Sumwalt addressed them directly, saying: “Our goal is to learn from what happened so others don’t have to go through what you’re going through.”

The safety board is highly selective, choosing crashes to investigate that can advance the understanding of safety issues; there are millions of highway crashes in the U.S. each year. The board is currently investigating 17 crashes, three of them involving Tesla’s Autopilot technology. The NHTSA said it is probing at least 14 Autopilot-related crashes.


The ramifications of Tuesday’s conclusions are yet to be determined. The NTSB is an independent federal agency, best known for its probes into airline disasters. It lacks enforcement power but its recommendations are considered thorough and are taken seriously by policymakers.

In a preliminary report on a January 2018 crash in which a Tesla Model S on Autopilot rear-ended a firetruck on the 405 Freeway, the board blamed the driver’s inattention, misuse of and over-reliance on Autopilot, and the design of Autopilot itself, which the NTSB said permits the driver to disengage from the driving task. No one was injured in that crash, which the NTSB continues to investigate.
