Tesla has produced well over a million cars. Some of those cars get into accidents. As it turns out, a much smaller percentage of them get into accidents each year than cars in the overall US fleet do. Though, without a doubt, the attention Tesla accidents get from national and international press is orders of magnitude higher than the attention given to accidents from any other brand vehicle. I’ll come back to the misinformation swirling around the latest accident in a moment. First, though, I want to try to explain something that somehow goes over the heads of so many Tesla critics.
The daily calls to ban Tesla Autopilot are mind-bogglingly stupid. Tesla Autopilot is essentially just an advanced version of cruise control. Old-school cruise control let you set a speed and drive that speed without putting your foot on the pedal. Cool stuff. As tech in this realm improved and evolved, we also got adaptive cruise control — which would slow down the car if cars in front of it slowed down. Cool beans. Then leaders in the market started implementing systems that would also keep the car in the lane. (Though, many early versions ended up ping-ponging the car from one lane marking to the other and back so much that the systems were more annoying than useful.)
What does Tesla Autopilot do now? For the hundreds of thousands of owners who bought an advanced Autopilot package, you get a superb version of everything above (no ping-ponging) as well as the ability to initiate automatic lane changes and have the car automatically go from on-ramp to off-ramp on the Interstate — with your supervision. Tesla Autopilot is just another step forward in this long evolution of driver-assist technology, and I have never met a Tesla driver who thought otherwise.
The current advanced-Autopilot package is called “Full Self-Driving” because firmware updates are expected to eventually create a system that can fully drive the car without human intervention from any Point A to any Point B. No new hardware is supposed to be needed, just firmware updates. I’m quite certain everyone who owns a Tesla knows that the car cannot drive itself from Point A to Point B without human supervision — and even everyone who doesn’t have a Tesla should know that too, imho. Also, there is no button to turn on “Full Self-Driving” (FSD) in the car. I have the FSD package, but if I want to use any of the driver-assist (advanced cruise control) features, I have to initiate “Autopilot” — that is the name of the in-car application.
So, let me just reiterate this: Tesla Autopilot is simply an advanced version of old-school cruise control — and I think that everyone driving a Tesla is aware of this.
Somehow, despite this fact, some people think Tesla shouldn’t be allowed to include automated driving features in its cars. (Only other brands should be allowed to include such features — inferior versions of them — in their cars.) Unfortunately, it’s not just wackadoodles making this argument. Actually serious (and verified!) business and transportation journalists are!
Whether NHTSA gets tough on Tesla's marketing and use of automated systems, which NTSB has said is dangerous, remains one of the biggest questions hanging over Bidens DOT https://t.co/uxc7VW3ld2
— Sam Mintz (@samjmintz) April 18, 2021
Goodness gracious. This brings such a bad look to such an important profession. Democracy dies without good journalism, and the field — one of the bedrocks of US democracy — has been under enormous attack in recent years. It’s a special kind of shame, then, to see people in the field hurting the profession’s reputation by making illogical arguments.
Aside from everything I wrote above, the big hubbub about this tragic, deadly accident in Spring, Texas, seems to be a bit misinformed. Apparently, this accident happened on an unmarked residential street. You cannot engage Autopilot on an unmarked residential street. The car will chime very loudly at you and tell you that you can’t turn on Autopilot. I just confirmed this again myself on a residential road on the way to the grocery store — the car showed that it could identify where the center of the road was, but it would not turn on Autopilot because of the lack of lane markings. Tesla owner Alan Dail explains it well:
And here’s a video demonstration of the point:
No lane lines, no Autopilot.
It’s that simple.
— Whole Mars Catalog (@WholeMarsBlog) April 18, 2021
Edit/update: Yes, there are instances where you can get Autopilot to engage if it is tricked into thinking something is a lane marking. That’s even less likely at night, when visibility is worse, but it is technically possible. If someone were really gung-ho about tricking the car, for example, they could draw lane markings on a street with chalk. Again, this seems highly unlikely in this scenario unless the people involved were dedicated to tricking the car in multiple ways.
Great video of FSD when you unbuckle your seatbelt!
Spoiler: It immediately sounds alarms and pulls the car over straight away!!!!!
Thanks Brandon. https://t.co/HItuQ5ubOQ
— TeslaFruit (@TeslaFruit) April 19, 2021
So, yes, it’s tragic that two men died in a Tesla on a residential street after a seemingly horrible accident, but it had no connection to Autopilot or the FSD Beta that very few people have access to. More likely than not, the driver accelerated to a dangerous speed and drove off the road into a tree. As sad as that is, it happens with all kinds of vehicles every day in rather large volumes — and drivers of performance cars, in particular, have a higher tendency to do this.
Why wasn’t someone in the driver’s seat? I don’t know, but they could have been thrown out of the seat if they didn’t have a seatbelt on, or perhaps they moved from one seat to another for some reason.
Why are some people with blue checkmarks on Twitter calling for Tesla Autopilot and/or FSD to be banned in response to this news? Perhaps because they’ve been obsessively commenting on Tesla in this way for years and they can’t get over the fact that they are wrong about this topic? Or perhaps they’ve honestly never learned what Autopilot is and what it isn’t, what it can do and what it can’t, and how Tesla communicates all of that to Tesla drivers. Perhaps they are obsessed with the idea that Tesla should fail, or with the idea that it should be held to different standards simply because its tech is popular.
Tesla offers the most advanced version of cruise control on the market. The term “Autopilot” is used because, just like in a plane using autopilot, the human in control gets some assistance from the automation systems onboard — and Elon likes things that fly and used to fly a lot himself. “Full Self-Driving” is a package you can add to your Tesla for $10,000. I don’t think people pay $10,000 for something without finding out what it does and doesn’t do. It is well known — especially by people paying $10,000 for the product — that the firmware is supposed to be improved routinely until your car can drive itself without supervision. People who paid thousands of dollars for FSD are eagerly waiting for the firmware update that makes that happen, and that also allows you to send your car out as a robotaxi. If you bought FSD, every time there’s an over-the-air update, you read the update notes to see how your car has improved. You don’t just assume that your car can suddenly drive itself because Elon Musk tweeted something about tech that’s in development. And, no, no one is assuming they turned on robotaxi-level Full Self-Driving and then accelerated quickly into a tree.
I wish I could say this is the last time I’m going to respond to stupid statements from relatively high-profile people about Tesla driver-assist features and why they should supposedly be banned, or renamed Boring Pilot or something. Unfortunately, there’s a clear and strong push by certain people to try to get the Biden administration to cripple Tesla cars and bring them down to the level of inferior products that are less safe. So, I feel compelled to respond rather than enjoyably writing about more fun topics.
By the way, CNN or CNBC or MSNBC or Fox News, if you’re going to cover this or related stories, I’d be happy to come on to talk about them. I’ve been covering Tesla professionally since 2012, co-owned a 2015 Tesla Model S with Tesla’s first-generation Autopilot, and now own a Tesla Model 3 with the Full Self-Driving package.
According to the media, For most cars, it’s your job to drive responsibly.
Unless you buy a Tesla -then you need to find ways to trick all of the safety features. If Tesla can’t stop you, it’s their fault.
— 🐶Earl of FrunkPuppy🐶 (@28delayslater) April 19, 2021