People die while driving Tesla automobiles, even when they are using Autopilot, the company’s enhanced Level 2 driver-assist system. And every time it happens, there is a great hue and cry about how dangerous Autopilot is, completely ignoring that hundreds if not thousands of highway deaths occurred in ordinary cars during the same period of time.
The same thing happens when an electric car battery catches fire. Suddenly, there are mobs with torches and pitchforks ready to storm the gates demanding a ban on electric cars, totally ignoring that there are hundreds of gasoline fires every day.
And while every highway fatality is regrettable, statistics still show that driving with Autopilot engaged results in a lower incidence of fatal accidents than driving without it. So perhaps it is time to admit that Autopilot is a work in progress that is not yet perfected but that can help people to drive more safely (if used properly). At the same time, there are those who think Tesla has pushed the technology too far, too fast. The National Transportation Safety Board agrees with that take.
New NTSB Report
The National Transportation Safety Board (NTSB) has completed its investigation into two recent crashes that involved the use of Autopilot — one near San Francisco and the other near Delray Beach in Florida. According to ABC News, in its findings, it blames the drivers, Autopilot, and the National Highway Traffic Safety Administration in equal measure. The drivers were obviously not paying attention to the road, so they get a share of the blame, but the NTSB report said the design of the Autopilot system contributed to the crashes because it allowed the drivers to operate their vehicles without due care, so Tesla gets a slice of the blame as well.
Then the NTSB took NHTSA to task for failing to follow up on safety recommendations it made previously. Specifically, NHTSA did not make sure automakers put safeguards in place to limit the use of electronic driving systems like Autopilot to areas where they are designed to work. In particular, the NTSB says Autopilot isn’t designed for roads with cross traffic, yet Tesla allows drivers to use it under those circumstances anyway. At a hearing last month in California, NTSB members expressed frustration that safety recommendations from previous Tesla Autopilot crashes had been ignored by Tesla and that NHTSA has not taken action on its own recommendations.
That’s where Tesla supporters and Tesla detractors part company. NTSB says Tesla shouldn’t allow Autopilot to be engaged on roads and in traffic conditions it wasn’t designed for. Tesla disagrees, saying drivers are ultimately responsible and are given plenty of warnings to encourage them to use the technology safely.
So we are back to the same “he said, she said” that has been at the heart of the Autopilot dialogue since it first became available. Elon and his acolytes argue people should be expected to act rationally. Others believe sensible regulations should be promulgated and enforced because lots of times people do not act rationally. My old Irish grandfather always said the most dangerous part of an automobile is the nut behind the wheel.
It is an argument that is unlikely to reach a consensus any time soon.
Delray Beach Details
In the Delray Beach accident, data from the car — a Tesla Model 3 — showed it was traveling at 69 mph at 6:17 am. A tractor trailer crossed the road and the Tesla slammed into it, shearing off the roof and killing the driver. Neither Autopilot nor the driver applied the brakes or attempted to avoid the collision. The truck driver was cited for unsafe operation.
Here’s more from ABC: The driver turned on the car’s adaptive cruise control system 12.3 seconds before impact. Autosteer was turned on 2.4 seconds later. No pressure was detected on the steering wheel in the 7.7 seconds before the crash. Tesla told the NTSB the driver wasn’t warned about not having his hands on the wheel “because the approximate 8-second duration was too short to trigger a warning under the circumstances.”
Tesla told the NTSB that forward collision warning and automatic emergency braking systems on the Model 3 in the Delray Beach crash weren’t designed to activate for crossing traffic or to prevent crashes at high speeds. “According to Tesla, the Autopilot vision system did not consistently detect and track the truck as an object or threat as it crossed the path of the car,” the report said.
“The Delray Beach investigation marks the third fatal vehicle crash we have investigated where a driver’s overreliance on Tesla’s Autopilot and the operational design of Tesla’s Autopilot have led to tragic consequences,” NTSB Chairman Robert Sumwalt said in a statement.
Edward Markey, a US Senator from Massachusetts, is one person who finds Tesla’s Autopilot not up to snuff, and has been roundly criticized for his position (ahem). On his website, he says, “Autopilot is a flawed system, but I believe its dangers can be overcome. I have been proud to work with Tesla on advancing cleaner, more sustainable transportation technologies. But these achievements should not come at the expense of safety.
“That’s why I’m calling on Tesla to use its resources and expertise to better protect drivers, passengers, pedestrians, and all other users of the road. I urge Tesla to adopt my commonsense recommendations for fixing Autopilot, which include rebranding and remarketing the system to reduce misuse, as well as building backup driver monitoring tools that will make sure no one falls asleep at the wheel. Tesla can and must do more to guarantee the safety of its technology.”
And so the debate goes on. Should federal and state regulators promulgate rules for where and when certain driver-assist systems can be used, or should manufacturers be allowed to make those decisions? It’s a question with no fully satisfactory answer. But with NTSB calling out NHTSA publicly on the subject and at least one US Senator weighing in, the possibility of new regulations seems more likely now than before this latest NTSB report.