Recently, Tesla released a beta version of its Full Self-Driving (FSD) software to some owners, and told us that the limited release would slowly expand to include more drivers. Like everyone else, I’m excited. Sure, we didn’t get flying cars by 2015, but it’s still neat to see robotic cars doing most of the driving by themselves. On the other hand, this is a time for caution. If just a few people misuse the experimental software they’ve been entrusted with, it could set things back in big ways.
Some of what I’m going to say here should go without saying, but when we consider that batteries come with warnings against eating them and sometimes even curling irons have a “don’t use on eyes” warning, we can’t be too cautious. There are always people who think they’ll have better luck than the rest of us, who will treat the system like it’s the final product or otherwise abuse it. Most of us need a bit of help learning to do things the right way anyway. The wider we can share safety information, the fewer problems we will see.
Even smart people and good drivers can benefit from reviewing this information. We need to keep in mind that, until now, the people testing self-driving systems have all undergone some kind of training. Even if you’re an excellent driver, follow all of Tesla’s safety warnings, and are committed to safety, it never hurts to review some of the pointers I’ve managed to dig up from those trainings and other sources.
Basic Safety Considerations (Don’t Abuse FSD)
“Beta” Means It’s Experimental
While many people know what “beta” means in the software world, we can’t assume everyone knows what this important warning means. Beta software is software that’s nearing completion, but isn’t quite ready for public release. It’s better than early-stage “alpha” software that’s likely riddled with bugs, but it’s still not ready for “primetime.” What this means, ultimately, is that the software is still experimental. The only reason developers release “betas” is to ask for your help testing the software in ways they didn’t expect or couldn’t manage on their own, so they can find the bugs and fix them.
Even if you travel hundreds of miles with zero errors, don’t assume that it’s all good. They wouldn’t call it a “beta” if they were confident that you could risk your life on it.
Complacency Can Kill (or Send You To Prison)
Nothing illustrates this better than Uber’s deadly 2018 crash. A professional test driver, presumably with training and definitely paid to watch the road, got comfortable with the Uber test vehicle’s capabilities. Rather than looking out for danger, she decided to watch reality TV. Getting lulled into a false sense of security can happen to anyone, no matter how experienced. And like Uber’s test driver, you can be charged with crimes if the system malfunctions and hurts someone while you’re behind the wheel. It’s your responsibility.
No matter how well you think Tesla’s software performs, don’t make the mistake of trusting it. You have to ALWAYS have your hands on the wheel and be paying attention AT ALL TIMES.
Don’t Use the FSD Beta If You Can’t Drive
While we all eventually hope that self-driving vehicles can carry intoxicated or exhausted drivers safely home, we just aren’t there yet. If you’re drunk, get a ride home from someone who isn’t. If you’re too tired to drive, take a nap before you go. While a well-functioning FSD system would help, a “beta” FSD could leave you in a bad situation by doing “the wrong thing at the worst time.” Should that happen (hint: it will), you’ll be facing a challenge that a sober and well-rested driver would struggle to correct. Going into that fight with one hand tied behind your back is a terrible idea.
While I know it’s a touchy subject, age and disability are also factors. If you’re one of the many elderly drivers out there, make an honest appraisal of your abilities. The same applies to anybody with disabilities who drives. If you’re barely able to drive safely as it is, you probably shouldn’t be taking on the additional challenges of being an experimental test driver. I know FSD gives many people hope of keeping their independence and mobility, but the “beta” version of FSD isn’t ready to help you do that yet. Give the technology some more time to mature, and let people who can do it safely help out for now.
In this section, I’m going to introduce drivers to some basic risk mitigation concepts and pointers from test driver safety training that I’ve been able to gather. This won’t tell you everything you could possibly need to know, but should help you get on the right track.
Prepare for High Risk/Low Frequency Events
A Tesla FSD system doing the “wrong thing at the worst time” is a great example of what safety consultant Gordon Graham calls a “High Risk/Low Frequency” event.
The human brain is great at doing things we’ve done before, and even better at doing things we do all the time, because we put our experiences in our “mental hard drive” to refer to later and get it right again. What we suck at are things we almost never have to do, and thus have no experience with. When it’s something that doesn’t matter much, it’s not a big deal to make a mistake. When the consequences are high, and we don’t know what we’re doing, we’re facing a serious problem. When doing something like beta testing a self-driving car, there’s no time to think it through.
The only way to get these “HR/LF/NDT” (High Risk/Low Frequency/No Discretionary Time) situations right is to plan ahead and give your “mental hard drive” something to call on. That’s called TRAINING.
New Drivers Shouldn’t Beta Test
If you’re a new driver, or have new drivers in your household, there won’t be much experience to draw on if something goes wrong. Only experienced drivers, with years behind the wheel and many miles driven, should even consider activating the FSD beta. The “Miracle on the Hudson,” Captain Chesley “Sully” Sullenberger’s 2009 emergency water landing of US Airways Flight 1549, drives home the importance of experience. After the incident, the pilot said:
“One way of looking at this might be that, for 42 years, I’ve been making small regular deposits in this bank of experience: education and training, and on January 15, the balance was sufficient so that I could make a very large withdrawal.” —Captain Chesley “Sully” Sullenberger
If you listen to recordings of the incident, you’ll notice the pilot sounds remarkably calm throughout the ordeal. I don’t know about you, but I’d be totally freaking out if I had to land a plane on water like that. Fortunately, I don’t fly planes for a living, and I’m glad there’s always someone with experience in the cockpit when I’m on a plane.
That’s why we need experienced people to do things that could get dangerous. If you aren’t an experienced driver, you definitely shouldn’t be supervising a beta version of FSD software.
Visualization (Scenario) Training
The cheapest and easiest way to train is to think about what could go wrong and how you’d react. This helps with “HR/LF/NDT” situations by letting you “think it through” before an incident occurs. It also allows you to mentally walk through things that are impossible to practice safely, like the example scenario below.
For example, what would you do if your Tesla running the FSD beta changed lanes into opposing traffic?
For me, I know that paying attention and taking over quickly will be key. It wouldn’t take long for oncoming cars to meet the front of mine, so time spent on the wrong side of the double yellow lines needs to be minimized. I also know that if I jerk the wheel too hard getting back into my lane, I risk overcorrecting (and losing control of the vehicle), so any quick moves back to safety need to be done smoothly.
I’m sure you can think of many other things that could go wrong at the worst time, so be sure to spend some time thinking about them. Feel free to discuss them in the comments.
Adopt A “Forward Leaning” Attitude
While you don’t have to literally lean forward, your mind needs to be focused on the future while doing something as dangerous as testing FSD. You should constantly be on the lookout for things that are likely to challenge the software and lead to a mistake. Narrow bridges, road damage, construction, aggressive drivers nearby, and complex intersections should be thought about long before you are in the thick of them.
By anticipating challenges ahead while you drive, and focusing on them, you’ll be much better prepared to take over in the event of a failure. Perhaps more importantly, thinking ahead helps you stay alert and focused, even if you can’t anticipate every failure.
Consider Advanced Training
If you can, consider taking advanced driving courses. By learning your vehicle’s limits, and more generally how a vehicle behaves at its limits, you’ll be better prepared for problems with FSD. It’s entirely possible that a wrong move on the system’s part at the worst time could put the vehicle out of control, or put you in a situation many drivers wouldn’t be able to recover from. By getting training on how to handle those situations, you’ll be much better prepared.
Do you think I’ve been helpful in your understanding of Tesla, clean energy, etc? Feel free to use my Tesla referral code to get yourself (and me) some small perks and discounts on their cars and solar products. You can also follow me on Twitter to see my latest articles and other random things.