Following the early-May death of a Model S driver in Florida who was using the car's semi-autonomous Autopilot function, the National Highway Traffic Safety Administration (NHTSA) is opening a preliminary evaluation.
Today Tesla issued a blog post saying it had delivered information to NHTSA after the car careened into a tractor trailer that had crossed in front of it, and that it learned yesterday that federal regulators were opening a preliminary evaluation.
“It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations,” said Tesla.
The driver was Joshua D. Brown, 40, of Canton, Ohio. His obituary says he served 11 years as a member of the Navy SEALs, then started a successful technology company, Nexu Innovations Inc. He is survived by his parents, Warren and Sueanne Brown, of Stow, Ohio; his sister, Amanda Lee (Jeremy Lee); six nieces and nephews; and numerous aunts, uncles and cousins.
An inquiry to NHTSA received a reply from Communications Director Bryan Thomas saying “NHTSA’s Office of Defects Investigation is opening a Preliminary Evaluation of the design and performance of automated driving systems in the Tesla Model S.”
“NHTSA recently learned of a fatal highway crash involving a 2015 Tesla Model S, which, according to the manufacturer, was operating with the vehicle’s ‘Autopilot’ automated driving systems activated. The incident, which occurred on May 7 in Williston, Florida, was reported to NHTSA by Tesla,” said Thomas. “NHTSA deployed its Special Crash Investigations Team to investigate the vehicle and crash scene, and is in communication with the Florida Highway Patrol. Preliminary reports indicate the vehicle crash occurred when a tractor-trailer made a left turn in front of the Tesla at an intersection on a non-controlled access highway. The driver of the Tesla died due to injuries sustained in the crash.”
Tesla, in its own words, set out its account of what happened:
Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.
Thomas gave a status report of where things stand with the federal regulators.
“NHTSA’s Office of Defects Investigation will examine the design and performance of the automated driving systems in use at the time of the crash. During the Preliminary Evaluation, NHTSA will gather additional data regarding this incident and other information regarding the automated driving system,” said Thomas. “The opening of the Preliminary Evaluation should not be construed as a finding that the Office of Defects Investigation believes there is either a presence or absence of a defect in the subject vehicles.”
Tesla's blog post opens by noting that Autopilot's miles driven per fatal crash exceed the overall average.
“This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles,” it said. “Worldwide, there is a fatality approximately every 60 million miles.”
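Tesla's comparison can be made concrete by converting each of the quoted figures to a common fatality rate per 100 million miles. A minimal sketch, using only the mileage numbers quoted above (the labels and layout are illustrative, not Tesla's):

```python
# Convert "miles per fatality" figures from Tesla's blog post into
# fatality rates per 100 million miles, so the three numbers compare directly.
MILES_PER_FATALITY = {
    "Tesla Autopilot": 130e6,  # "just over 130 million miles"
    "US average": 94e6,        # "a fatality every 94 million miles"
    "Worldwide": 60e6,         # "approximately every 60 million miles"
}

for label, miles in MILES_PER_FATALITY.items():
    rate = 100e6 / miles  # fatalities per 100 million miles
    print(f"{label}: {rate:.2f} fatalities per 100M miles")
```

On these figures, Autopilot's rate (about 0.77 per 100M miles) is lower than the US average (about 1.06), though statisticians would caution that a single fatality over 130 million miles is a small sample from which to draw rate comparisons.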
In other quarters, the Autopilot function has stirred mixed impressions, with some, including a Volvo employee, saying Tesla was irresponsible for releasing a beta version of technology on which lives depend.
“It gives you the impression that it’s doing more than it is,” said Trent Victor, senior technical leader of crash avoidance at Volvo in an interview with The Verge, “[Tesla’s Autopilot] is more of an unsupervised wannabe.”
And whether or not that was the case with Brown, what is certain is that he posted videos of himself riding in Autopilot mode.
"The car's doing it all itself," he said in one (below), smiling with his hands off the wheel. In another video he credited the system with saving him from an accident.
The concern is that while the vehicle may follow the roadway, numerous YouTube videos of near misses and other anecdotes have shown this is not a hands-free system.
And to be sure, Tesla has repeatedly said it is not that. The driver is supposed to remain in control with hands on the wheel.
A video compiled from various YouTube clips shows instances where the system did not meet drivers' expectations, as well as moments where near misses were avoided thanks to Autopilot.
Where things may become tricky is that Autopilot is good enough to lull drivers into hands-free driving. Indeed, it can drive hands free in many situations, which may create a false sense of security or a lessening of vigilance.
Tesla addressed some of this in its blog post.
It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.
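The hands-on enforcement Tesla describes, with periodic checks, visual and audible alerts, and finally a gradual slow-down, amounts to an escalating state machine. A minimal sketch of that logic follows; the function name, states, and time thresholds are all hypothetical, since Tesla's actual implementation and timings are not public:

```python
from enum import Enum, auto

class AlertState(Enum):
    NOMINAL = auto()        # hands detected recently; no action
    VISUAL_ALERT = auto()   # hands-off detected: show on-screen warning
    AUDIBLE_ALERT = auto()  # still hands-off: add an audible chime
    SLOWING = auto()        # still hands-off: gradually reduce speed

# Hypothetical thresholds, in seconds of continuous hands-off time.
VISUAL_AFTER, AUDIBLE_AFTER, SLOW_AFTER = 15, 30, 45

def escalation_state(hands_off_seconds: float) -> AlertState:
    """Map continuous hands-off time to an alert level (illustrative only)."""
    if hands_off_seconds >= SLOW_AFTER:
        return AlertState.SLOWING
    if hands_off_seconds >= AUDIBLE_AFTER:
        return AlertState.AUDIBLE_ALERT
    if hands_off_seconds >= VISUAL_AFTER:
        return AlertState.VISUAL_ALERT
    return AlertState.NOMINAL
```

In this sketch, detecting hands on the wheel at any point would reset the timer back to zero and return the system to the nominal state, matching Tesla's statement that the car slows "until hands-on is detected again."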
Tesla followed up with more:
We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.
The company expressed condolences in its blog post for the customer who died.