Data collected in “black boxes” may be the tool Tesla Motors can tap into to deflect Autopilot liability, say lawyers familiar with such cases.
Tesla has been under pressure to defend itself since the May 7 fatal crash in Florida involving the Autopilot system. Because the company wirelessly collects data from its Model S and Model X vehicles using Autopilot, it could be in a strong position to publicly counter, and possibly legally defend, claims about the safety of its Autopilot driving-assist software.
Event data recorders (EDRs), often called "black boxes," capture data such as speed, seat-belt usage, and pedal position in the seconds before and after a crash.
EDR data are now commonly cited in courtrooms, as the technology has become standard in new cars over the past decade.
“I’ve had EDRs that have helped me in a case, and EDRs that have basically told me, don’t take that case,” said Don Slavik, a plaintiffs’ attorney who handles automotive product liability cases.
Tesla’s wireless data collection appears to be more extensive than that of many onboard EDRs. Following a July 1 crash in Pennsylvania, Tesla was able to see whether Autopilot was engaged, whether the driver’s hands were detected on the steering wheel, and the amount of force applied to the accelerator, the company said.
A review of court dockets nationwide by Reuters did not reveal any claims filed against Tesla over crashes while Autopilot was engaged. In the event of a lawsuit, though, the company’s information could “be very helpful if it can be validated and verified and has sufficient clarity,” said Slavik.
Tesla spokesperson Khobi Brooklyn declined to comment on the potential use of such data in litigation. The company states in user manuals that it reserves the right to use collected data for its defense in a lawsuit.
Tesla CEO Elon Musk has been tweeting comments to stir public support for the safety of Tesla vehicles equipped with Autopilot. Musk cited such data in a July 14 Twitter post, saying on-board vehicle logs showed that Autopilot would have prevented the Pennsylvania crash but that it was turned off at the time.
The courtroom, however, is different from the court of public opinion, said Bryant Walker Smith, an assistant professor of law and engineering at the University of South Carolina, who studies self-driving vehicles.
“Summaries and spin will be much less credible than analysis supported by raw data that others can evaluate,” he said.