I came across a video on YouTube recently that has been making its way around the Tesla critics I follow on Twitter. For the Tesla fans wondering why I do this, keep in mind that as a writer for CleanTechnica, it’s my job to keep my finger on the pulse of clean energy industries, and I wouldn’t be able to provide honest writing if I always took one side and wasn’t following the different perspectives out there. The truth is, even among “TSLAQ,” there are some good critics who have useful information at times.
In other words, you don’t have to agree with people on everything to learn something from them, and it’s important that we take the good from things like this video while also learning from what’s not so good. After I’ve gone through the video, I hope we can have a wider discussion about both the things that are right and the things that are wrong with this video.
Below is the video. It’s almost an hour long, but if you don’t feel like spending that much time on it, feel free to see my synopsis and discussion further down.
A Quick Summary
For those who don’t want to watch an hour-long video, I’m going to hit the major points here and then discuss the good and the bad information in the video.
The video starts by giving the definition of autonowashing, a concern I’ve raised in the past. Autonowashing was a term coined by Liza Dixon, and it refers to the overstatement of autonomous capabilities by a manufacturer or its fans. This is a problem because people who misunderstand a system’s capabilities gain too much trust in the system, and that sometimes leads to tragedy. Nobody wants that.
The video then goes through a montage of examples of Elon Musk and Tesla overstating the capabilities of Autopilot, and shows people abusing the system. The rest of the video basically builds on this idea: overstatement of capability leads to abuse, and then to tragedy.
He then gives viewers background on Tesla, “Elon Time,” what autonowashing is in more depth, ways that Tesla does this, and how it’s hurting people.
First off, the video did praise Tesla a lot and give credit where credit was due. Tesla popularized electric cars, keeping them from getting killed off. Tesla made EVs cool, not just something an environmentalist would find desirable (like the second-generation Prius everyone flew the bird at).
Raising the autonowashing concern was important and necessary. Miscalibrated trust (in this case, too much trust in a system that isn't ready for it) matters because it can encourage people to do dumb things with Autopilot. As fans, we should always be reminding people how to be safe so they don't bring public distrust and regulators down on Tesla.
Questioning Tesla’s framing of data is also a good thing to do. While he’s pretty harsh on that point, it is true that Autopilot is mostly used on highways, roads that are 2–3 times safer than driving overall. Other concerns, like the inclusion of international data, are valid. He goes into much more detail, and I’d like to see Tesla defend its data against these criticisms so we can evaluate them further.
These important points would be a lot stronger if other baggage weren’t running people off, especially Tesla fans, who could possibly do the most good with these facts and questions.
The biggest problem I have with the video is that it basically runs on the “if it saves one life” fallacy. Sure, doing something different in society might save a small number of lives, but at what cost? I know it sounds cold and heartless to put a price on human life, but we do that all the time. We put a price on our own life’s future worth when we buy life insurance, even though we know we’re priceless to our loved ones. When a road is built, it’s known that a certain number of people will die on that stretch of road over time, but governments have to balance cost and roadway efficiency against lives, or face an angry public that doesn’t want to be inconvenienced (or even killed in a freak occurrence).
Put more succinctly, it’s impossible to get almost anything in life to zero deaths. Ladders, fast food, and even the wrong color changes in television programs can get people killed. While we should try to reduce deaths where we reasonably can, nobody would be willing to live in a world where we pursued absolute safety at the expense of every other consideration.
The rest of the problems I had with this video were basically all a matter of misplaced criticism.
One of the things that bothered me about the video was how much its maker focused on Elon Musk personally, even taking down a meme of Elon mid-video to show how hurt he feels. The truth is that Tesla’s issues can’t all be laid at Elon Musk’s feet, and a lot of misuse is possible with driver-assist systems from other manufacturers, too. Tesla Autopilot is not unique in that regard. Making this all about Elon Musk weakens the video’s good arguments by making the whole thing look like an ad hominem attack on personality.
It’s fine to argue over whether Elon Musk was really a founder of Tesla if that’s what you want to do, but throwing that into the video seemed not only off topic, it made me think he has it out for Elon Musk and couldn’t resist.
While he did point out that “Elon Time” is aspirational and runs up against great challenges that make timelines hard to predict, he seems to use this only to make Elon Musk look irresponsible, or to suggest that he’s intentionally using it to autonowash.
Another problem I came across was that he first criticized Tesla’s lack of a driver monitoring system, and then criticized its current implementation of driver monitoring. As I pointed out a few paragraphs ago, other manufacturers ship systems with little or no driver monitoring. Rather than asking whether Elon Musk is killing people, maybe we should be asking whether the industry at large is.
Instead of focusing on Elon Musk and Tesla, it would have made for a much stronger and more helpful video to show how Tesla’s system sits at the center of a wider world of ADAS misuse.
Featured image by Tesla.