Tesla Autopilot — man, that’s a topic for a John Oliver skit, isn’t it?
We all knew that, sooner or later, someone would die while driving a Tesla with Autopilot on. The technology isn’t a forcefield that can protect drivers from every dangerous scenario. Even if it were “perfect” at seeing and sensing everything around it, there would be cases where it couldn’t prevent an accident, or a death. (Because physics … + the genuine fragility of human life.)
Nonetheless, if you focus on the individual in a tragic death, it is hard not to be affected. The first (so far, only) death in an accident in which Tesla Autopilot was in use put ~100 spotlights on the new tech and the Silicon Valley company behind it. People were personally hit by the sad news and reacted emotionally, as we knew they would. Still, I was a bit surprised by how far some of the reactions went.
As we noted in our first article about the matter, there aren’t enough accidents (especially deadly accidents) and miles logged with Autopilot in use to draw statistically significant conclusions about the superiority (or not) of Autopilot driving versus normal driving — which is why Tesla says Autopilot is in “beta.”
But we do have some numbers to work with. The deadly accident came after 130 million miles of Autopilot driving, and Tesla noted in its initial press release about the accident that a fatality occurs, on average, every 94 million miles in the United States and every 60 million miles worldwide. Again, that doesn’t yet provide us with statistically significant evidence that Autopilot is safer, but it is a strong sign leaning in Autopilot’s favor.
Nonetheless, words can be more powerful than numbers, and many media outlets, commenters, industry insiders, and surely people shorting the stock created a somewhat hysterical atmosphere around the technology following Joshua Brown’s death. Somehow, it didn’t matter that Autopilot drivers had gone 36 million miles beyond the average before a death occurred, or even 70 million miles compared to the worldwide average. Either the math was too hard, the numbers were not seen, or people just let emotions override rational evaluation.
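For anyone who found the math too hard, the comparison above really is just two subtractions. A quick sketch (Python, purely illustrative, using the figures from Tesla’s press release):

```python
# Miles driven per fatality, per Tesla's press release figures
autopilot_miles = 130_000_000   # Autopilot miles before the first fatality
us_avg_miles = 94_000_000       # average US miles per traffic fatality
world_avg_miles = 60_000_000    # average worldwide miles per traffic fatality

# How far Autopilot went beyond each average before a death occurred
print(autopilot_miles - us_avg_miles)     # → 36000000 (36 million miles)
print(autopilot_miles - world_avg_miles)  # → 70000000 (70 million miles)
```

Not statistical proof, as noted above — a single fatality is one data point — but the direction of the comparison is plain.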
When “Concern” Turns Into Lack Of Concern
Even Consumer Reports (which many people trust for objective, numbers-based advice) oddly recommended that Tesla pull the technology. (To be honest, I think Consumer Reports took a bit of flak from Tesla haters when it gave the Tesla Model S P85D a rating that essentially broke its rating system, and I think it has been working too hard to present a “we don’t love Tesla too much” image in response. I’m not going to say that is why it made such a strong anti-Autopilot recommendation, but I believe it fed into that.)
Luckily (for many, many reasons), Elon Musk likes math, and he routinely bucks the corporate CEO trend: he doesn’t bow to media or investor pressure in ways that would compromise the benefits of Tesla products or his overall goal of helping humanity as much as possible … when he’s not playing video games (though I actually have an article planned on how video games are feeding into Tesla’s EV battery leadership).
Regulators have already determined that Autopilot doesn’t present a safety risk, but hey, why listen to them when the media is on a fear-hyping rampage? Honestly, the regulators are probably well aware that Autopilot is a huge step forward, is safer to use than not use, and is leading us to much safer technology.
There have been numerous stories of accidents being prevented by Tesla Autopilot, and several occasions on which drivers think that Autopilot saved their lives, as well as the lives of others. If you obsessively follow the Tesla Motors Club forums (like I do), you have seen plenty of such anecdotes. Tesla surely receives others that it doesn’t report. Here’s one that it did share following the media’s Autopilot shenanigans:
Autopilot prevents serious injury or death of a pedestrian in NY (owner anecdote confirmed by vehicle logs) pic.twitter.com/NceuqckqCK
— Elon Musk (@elonmusk) July 21, 2016
More recently, we have the interesting case of Joshua Neally. He thinks Autopilot might have saved his life — not by preventing a specific accident, but by getting him to the emergency room following a pulmonary embolism. If he had been in a normal car, hadn’t gotten treatment quickly enough, and had died, his death wouldn’t have registered in traffic fatality statistics, but it is potentially another death that Autopilot prevented.
However, it doesn’t really matter to some people how many deaths Autopilot prevents — it is hard to count things that don’t happen, while it is easy to obsess over things that do happen. Even if Tesla Autopilot has saved 10 lives, the fact that it was on while 1 person died has spooked some people enough (I’m not quite sure how, to be honest) that they have called for Tesla to eliminate the option.
In short, I would say that narrow-minded, subjective concern has led people to a lack of concern for the people Autopilot has helped, and the lives it could save in the future. It’s nonsensical.
Tesla Owners Say “No Way!”
It’s worth pointing out that Tesla owners are on Elon’s & Autopilot’s side. (I wonder why….) Reportedly, in a survey of owners, a whopping 0% wanted Tesla to drop Autopilot.
Tesla customers are v smart & don't want media speaking on their behalf abt Autopilot. Recent poll: 0.0% want it disabled — not 0.1%, 0.0%.
— Elon Musk (@elonmusk) July 17, 2016
But the matter is sort of pointless and ridiculous anyway, imho. Autopilot is an option! No one is forced to use it, even if they buy a Tesla. Buyers actually have to pay more ($2,500–3,000 more) to get Autopilot, and then they choose every time they drive whether to turn it on or leave it off.
Why should Tesla get rid of a feature that is optional, that strong evidence suggests is safer, and that several owners have credited with saving their lives … let alone making their commutes much less stressful?
“Autopilot” vs “Self-Driving Car”
Note that it is always the responsibility of the driver to stay alert and avoid potential accidents. Elon has stated several times now that Tesla used the term “Autopilot” in reference to airplane autopilot … which I think we all know is support tech for a human pilot who still has to remain alert and is in charge of the plane and the safety of the passengers.
Somehow, people have taken offense at the term, as they see it as leading people to believe that the car can drive itself while you sleep or watch movies or something. Again, remember that the Tesla owner poll had 0% of respondents wanting Autopilot removed. We have certainly seen some people use Autopilot in an inappropriate fashion … because it’s so good. However, Tesla tells owners many times in many ways that they must remain alert when using Autopilot, the tech is not perfect, and they are responsible for remaining safe.
Tesla didn’t call its suite of Autopilot features “self-driving car tech” or “driverless tech that will protect you from The Bermuda Triangle if you happen to be driving in the Atlantic Ocean.”
As with concerns over the use of the term “beta,” it’s a bit ironic that people are concerned about Tesla terminology that was specifically used to indicate to buyers that the tech didn’t turn on a forcefield or make it unnecessary for drivers to pay attention and be ready to take over at any time.
The 2nd Death
What’s the point of all this now? Well, the point is that a second death with Autopilot turned on will eventually occur. It is inevitable. When that happens, I hope this sprinkles enough turmeric into the media meal that it prevents some people from overreacting, letting emotions override logic, and forgetting or ignoring that the numbers are on Autopilot’s side.
Let’s not try to slow adoption of a technology that makes our roads safer, eh?