Tesla’s ‘autopilot mode’ puts it at risk for liability in crashes

By rolling out self-driving technology to consumers more aggressively than its competitors, Tesla Motors secured a spot at the forefront of an emerging industry.

But that strategy could expose the company to a risk it has sought to avoid: liability in crashes.

Tesla in 2015 activated its autopilot mode, which automates steering, braking and lane changes. Tesla asserts the technology doesn’t shift blame for accidents from the driver to the company.

But Google, Zoox and other firms developing autonomous driving software say it’s dangerous to expect people in the driver’s seat to remain responsible for intervening. Drivers get lulled into acting like passengers after a few minutes of the car doing most of the work, the companies say, so relying on them to brake suddenly when their cars fail to spot a hazard isn’t a safe bet.

Such a concern could undermine Tesla, whose autopilot feature is central to a fatal-accident investigation launched last week by federal regulators.

The National Highway Traffic Safety Administration is examining the role played by autopilot technology in a Florida collision between a Tesla Model S and a big rig. Tesla said autopilot sensors failed to detect the white truck turning in front of the Model S against a bright May sky; the crash killed the car’s driver, 40-year-old Joshua Brown.

Were the victim’s family to sue Tesla over an accident caused – or not avoided – by autopilot, one of several arguments they might make is that Tesla acted negligently by not doing what a reasonable manufacturer would do, said Stephen Nichols, an attorney in the Los Angeles office of law firm Polsinelli. The fact that others have developed similar technology, but have chosen not to release it or have branded it in ways that don’t suggest automation, could leave Tesla vulnerable.

“You could say, ‘Tesla, you’re not doing what these other companies are doing, so you’re being unreasonable,’” Nichols said.

Cases about defective product design typically hinge on whether a company sufficiently vetted its wares – in this situation, programming code that interacts with a number of components throughout the car.

If the accident happened because the software was inadequate (it couldn’t spot the white vehicle against a light backdrop) and proper testing would have found the flaw, Tesla could be on the hook, said Jon Tisdale, a general partner in Gilbert, Kelly, Crowley & Jennett’s Los Angeles office.

The competitive landscape bolsters his contention.

“There’s going to be the argument made that they are rushing to market to corner it before other manufacturers release the product, and that Tesla cut the testing short – ‘they didn’t do it right,’” said Tisdale, who mostly defends product liability cases.

Tesla’s billionaire founder Elon Musk has said that autopilot mode is a voluntary feature, that drivers are warned of the risks and that testing it with the public makes it safer than if the company were to do it solely internally. And he’s made clear since its release that drivers don’t abdicate responsibility.

“The onus is on the pilot to make sure the autopilot is doing the right thing,” he said in a televised interview in 2013. “We’re not yet at the stage where you can go to sleep and wake up at your destination. We would have called it autonomous … if that were the case.”

Consumer advocates say Tesla and other companies that insist on consumer culpability when a machine is in charge don’t understand what’s happening on the roads.

“On the one hand, they’re saying trust us, we can drive better than you would, but on the other hand, they are saying if something goes wrong, don’t ask us to stand behind our product,” said Rosemary Shahan, president of the Consumers for Auto Reliability and Safety lobbying group. “But if it’s controlled by an algorithm, why should you be liable?”

Google, which aims to produce a car with no way for a human to take control, is one of the few companies to take a different stance from the outset. The company says it would be responsible for accidents caused by its software (though how traffic tickets would be handled is still unsettled).

Zoox, a Silicon Valley start-up that recently raised $200 million at a $1-billion valuation from investors, declined to comment about how it views the liability question. But the company also isn’t planning to release technology that would require human intervention.

Shahan said holding companies accountable through lawsuits and regulation might stifle innovation, but it’s a worthwhile tradeoff to get them to take more precautions.

“It’s hard enough to not nod off when you are in control, let alone when you’re in autopilot,” she said. “We shouldn’t trade one set of human error for another.”

Brown’s family has said through attorneys that they hope lessons from his crash “will trigger further innovation which enhances the safety of everyone on the roadways.”

A decision on whether to file a lawsuit isn’t likely until the federal inquiry is completed, and the family’s focus remains on mourning, the attorneys said.

[email protected]

Twitter: @peard33
