How does Tesla’s autopilot feature react to crossing traffic? Feds want to know
Federal investigators looking into Tesla Motors Inc.’s autopilot system after a fatal crash in Florida are zeroing in on the limitations of the feature and how it reacts when obstacles cross its path.
The National Highway Traffic Safety Administration on Tuesday posted a nine-page letter seeking information from the electric car maker about autopilot — which Tesla has said is an “assist feature,” not a substitute for an alert driver — and about why the feature failed to detect a tractor-trailer that crossed in front of a Model S sedan May 7 in Florida.
Much of the letter seeks information on how the system works at intersections with crossing traffic, but it also asks Tesla to describe how the system detects “compromised or degraded” signals from cameras and other sensors and how such problems are communicated to drivers.
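The question about degraded signals touches a general problem in driver-assist design: how the software decides a sensor can no longer be trusted, and how it tells the driver. Below is a minimal sketch, in Python, of one common approach, a plausibility check on the incoming signal. The function name and thresholds are illustrative assumptions, not details from NHTSA’s letter or Tesla’s code.

```python
import statistics

def camera_signal_degraded(pixel_brightness: list[float],
                           washout_level: float = 240.0,
                           min_contrast: float = 10.0) -> bool:
    """Flag a camera frame as degraded (hypothetical heuristic).

    A frame that is very bright or nearly uniform carries little usable
    contrast, so the system should warn the driver instead of silently
    continuing to trust the sensor.
    """
    mean_brightness = statistics.mean(pixel_brightness)
    contrast = statistics.pstdev(pixel_brightness)
    return mean_brightness >= washout_level or contrast < min_contrast

# A washed-out frame: brightness values pinned near the sensor's maximum.
frame = [250.0, 252.0, 248.0, 251.0]
if camera_signal_degraded(frame):
    print("ALERT: camera input degraded; driver must take over")
```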
The crash in Williston, Fla., killed the car’s driver, former Navy SEAL Joshua Brown, 40, of Canton, Ohio. Tesla, which collects data from its cars wirelessly, says the cameras on Brown’s Model S sedan failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky, and the car didn’t automatically brake.
The safety agency also asked Tesla for its reconstruction of the Brown crash, and for details of all known crashes, consumer complaints and lawsuits filed or settled because the autopilot system didn’t brake as expected.
NHTSA said Tesla must comply with its request by Aug. 26 or face penalties of up to $21,000 per day, to a maximum of $105 million.
NHTSA hasn’t determined whether a safety defect exists with autopilot, and the information request is a routine step in an investigation, agency spokesman Bryan Thomas said.
Tesla’s autopilot system uses cameras, radar and computers to detect objects and automatically brakes if the car is about to hit something. It also can steer the car to keep it centered in its lane. The company says that before autopilot can be used, drivers must acknowledge that it’s an “assist feature” that requires both hands on the wheel at all times. Drivers also must be prepared to take over at any time, Tesla has said.
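For readers unfamiliar with such systems, here is a minimal Python sketch of the kind of decision logic a camera-plus-radar braking feature could use. Everything in it (the names, the thresholds, and the rule that both sensors must agree) is an illustrative assumption, not Tesla’s actual implementation; it is meant only to show why a single blinded sensor can matter.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One obstacle report from a single sensor (hypothetical format)."""
    distance_m: float  # distance to the detected object, in meters
    confirmed: bool    # whether this sensor is confident in the detection

def should_auto_brake(camera: Detection, radar: Detection,
                      speed_mps: float, max_decel_mps2: float = 6.0) -> bool:
    """Decide whether to brake automatically.

    This sketch requires camera and radar to agree before braking, a
    common way to suppress false alarms. The trade-off is that one
    blinded sensor (say, a camera washed out by a bright sky) can veto
    braking even when the other sensor sees the obstacle.
    """
    # Kinematic stopping distance v^2 / (2a), plus a fixed safety margin.
    stopping_distance_m = speed_mps ** 2 / (2 * max_decel_mps2) + 5.0
    both_agree = camera.confirmed and radar.confirmed
    nearest_m = min(camera.distance_m, radar.distance_m)
    return both_agree and nearest_m <= stopping_distance_m

# At about 65 mph (29 m/s), an unconfirmed camera detection blocks braking
# even though the radar reports an obstacle well inside stopping distance.
camera = Detection(distance_m=60.0, confirmed=False)  # washed-out frame
radar = Detection(distance_m=60.0, confirmed=True)
print(should_auto_brake(camera, radar, speed_mps=29.0))  # prints: False
```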
Tesla released autopilot last fall. Some safety advocates have questioned whether the company — which says the system is still in “beta” phase, a computer industry term for software testing by customers — and NHTSA allowed the public access to the system too soon.
“No safety-significant system should ever use consumers as test drivers on the highways,” said Clarence Ditlow, head of the nonprofit Center for Auto Safety. He said NHTSA lacks the electronic engineers and laboratories needed to keep up with advanced technology.
Tesla says that autopilot has been safely used in more than 100 million miles of driving by customers and that data show drivers who use autopilot are safer than those who don’t.
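Tesla’s claim rests on simple rate arithmetic, which can be roughly sanity-checked. The sketch below uses the mileage reported here plus NHTSA’s published 2014 U.S. average of about 1.08 fatalities per 100 million vehicle miles, an outside figure not cited in this article; with only one known fatality, the autopilot number is a small-sample upper bound, not a settled rate.

```python
# Back-of-the-envelope comparison of fatality rates per 100 million miles.
# AUTOPILOT_MILES comes from Tesla's "more than 100 million miles" statement;
# the 1.08 U.S. average is NHTSA's published 2014 figure (an assumption
# brought in from outside this article).
AUTOPILOT_MILES = 100e6
AUTOPILOT_FATALITIES = 1

autopilot_rate = AUTOPILOT_FATALITIES / (AUTOPILOT_MILES / 100e6)
us_average_rate = 1.08

print(f"Autopilot upper bound: {autopilot_rate:.2f} fatalities / 100M miles")
print(f"U.S. average (2014):   {us_average_rate:.2f} fatalities / 100M miles")
```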
NHTSA’s Thomas said he wouldn’t comment on specifics of the investigation. The agency does not currently have legal authority to prevent automakers from rolling out features if they meet basic federal motor vehicle safety standards. It is in the process of developing standards for self-driving cars.
The investigation, opened June 28, could have broad implications for the auto industry and its path toward self-driving cars. If the probe finds defects in Tesla’s system, the agency could seek a recall. Other automakers have developed or are developing similar systems that may need to be changed as a result, and the probe could also affect self-driving car regulations due to be unveiled this summer.
In the letter, NHTSA also asked Tesla for details on any modifications it has made to the autopilot system.