Hey, NTSB, Where’s The Evidence Tesla’s FSD Beta Is Dangerous?



Last week, it was revealed that the National Transportation Safety Board (NTSB) was asking the National Highway Traffic Safety Administration (NHTSA) to crack down on ADAS systems, specifically calling Tesla out for the supposed dangers of the FSD Beta program. The letter (like many other Tesla FSD criticisms) is bogus.

I was going to get to this faster, but I felt that readers first needed a primer on an underlying problem in the automotive safety advocacy community. I call it The Church of Automotive Safety. Many safety advocates demand that we accept what they say on faith alone, when there's really no scientific basis for their claims. When we've taken their word for things, we've ended up with needless deaths, as detailed in that series.

The most recent epistle sent from one government agency to another is yet another example of this. It did give good examples of genuinely bad handling of automated vehicle testing: one from a company operating in Las Vegas, and one from Uber. In Las Vegas, Keolis North America's automated people mover was supposed to have an attendant ready to take control at all times, but the attendant didn't actually have access to those controls. In Uber's case, an inattentive test driver actually caused the death of a woman.

When the epistle got to Tesla, though, the NTSB applied a very different standard:

“Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability. By releasing the system, Tesla is testing on public roads a highly automated AV technology but with limited oversight or reporting requirements. Although Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to motorists and other road users.” (emphasis added)

While other companies testing AVs get called out for actual accidents that occurred, Tesla is being judged on a "potential risk," and on that basis, the letter argues, NHTSA needs to crack down.

To be fair, there have been some deaths and accidents involving people using Autopilot, but in every case the system was being severely misused. Some of the deceased had even left the driver's seat completely, despite knowing that the system wasn't meant to be an autonomous vehicle.

Tesla has been a lot more careful with the testing of its FSD Beta, though. The experimental software was initially released only to a small group of testers with good driving records, with no more than 1,000 people using it. Recently, Elon Musk revealed that some of those testers lost access to the Beta for not paying attention.

On top of that, the company has even been using the car's interior camera to monitor driver attention, which is something safety advocates have been pushing Tesla to do.

Perhaps the most important thing to consider is that, despite much handwringing from The Church of Automotive Safety (I was stupid and did this myself a bit), there have been no accidents that we're aware of involving the FSD Beta. None. Not only does Elon Musk say there haven't been any accidents among Beta testers, but there are also no media reports of such an accident, and you know the media would be all over it if there were one.

There’s one other criticism I’ve seen raised in the last couple of weeks that I’d like to address before moving on: the idea that Tesla is being dishonest when it tells customers that FSD will eventually be autonomous while telling regulators that the FSD Beta will always be Level 2.

Notice the difference there: "Beta." Tesla never told anyone that the FSD Beta is actually a full self-driving autonomous system. It's a BETA. It's being tested. It requires somebody to watch over it, because it's an incomplete software package that everyone knows isn't ready for prime time. Of course the Beta will always be Level 2! Nobody at Tesla ever said otherwise.

We can argue all day about whether Tesla will ever achieve Full Self Driving, but that's a different argument altogether, and it's dishonest to conflate statements about the final product with statements made about the unfinished software.

Is It Wise To Regulate Based Only On "Potential Risks"?

The short answer: No.

As I’ve pointed out repeatedly before, the “Safety First” myth is hot garbage. Absolute safety is never possible, and we can actually make ourselves less safe when we try to achieve 100% safety. The only thing we can say for certain (at least with current technology) is that 100% of people will die at some point. Nobody gets out alive. That doesn’t mean we should be careless and take no precautions (Tesla is taking precautions), but we can’t get rid of all risk without everyone starving or living in a stagnant, repressive society.

Safety comes third. Getting things done and making money (or having fun) comes first. We willingly engage in risky behavior like driving, mountain biking, and even walking. Risk is a normal part of life, and we shouldn’t pretend otherwise.

When we pretend that we’ve eliminated all risks, people stop taking personal responsibility, and common sense goes out the window. Even when you have a “walk” sign lit in front of you, it’s still a good idea to quickly look both ways before stepping into the street. Following the rules (don’t walk when it says Don’t Walk) helps with safety, but following the rules alone doesn’t guarantee it.

No matter how many rules NTSB gets NHTSA to apply to autonomous vehicle testing, common sense will always be required on the part of test drivers. Even if every test driver must go through a class, if the test driver completely abandons their duty (and, say, watches reality TV like the Uber driver), then bad things will happen. No amount of training can keep someone from doing something stupid — that’s always going to be a personal responsibility.

Tesla’s FSD Beta program is working, and the proof is in the pudding. Test drivers are monitored in some fashion using the interior camera, and inattentive drivers are removed. The remaining drivers have been in zero accidents while using the software, so Tesla must be doing something right. To come down on them now would make no sense.

Featured image: Autopilot marketing imagery by Tesla.



Jennifer Sensiba

Jennifer Sensiba is a long time efficient vehicle enthusiast, writer, and photographer. She grew up around a transmission shop, and has been experimenting with vehicle efficiency since she was 16 and drove a Pontiac Fiero.
