We seem to have a never-ending loop of stories about Tesla Autopilot and Tesla's budding "Full Self-Driving" package, which really falls under the broader Autopilot umbrella. Perhaps this article fits into that loop, but I think it covers a part of the story that is mostly ignored, and I think it offers important context from a sociological or sociopolitical point of view.
PSA: Don’t Be Stupid
First of all, I’ll just get out of the way what I think are two obvious but important points: 1) Yes, anyone using the system should be responsible and use it as directed, adequately supervising the car and remembering that you are fully responsible for it. 2) I don’t see or hear many stories of people not doing so, aside from certain YouTubers doing YouTuber things — but you can’t take “idiot” out of “human,” and young males are especially prone to doing crazy, high-risk things for a simple laugh.
At the end of 2020, I think no one needs proof of the fact that logical thought and common sense are in much shorter supply than we’d like. And I think that is one reason why many people are scared of Autopilot upon learning a little bit about it — they think many drivers will abuse the system.
Abusing the system can be a genuine problem at this point, because people can install "cheat devices" and stop paying attention. That's a really stupid thing to do, though, and I hope and assume it's very uncommon. Also, note that some people apparently even do this with much more basic cruise control. So, like I said, you can't expect that 100% of people will avoid doing something idiotic. But that's not a reason to make much more advanced, better tech unavailable to consumers. You can also do idiotic things with blenders, power tools, and cell phones. Rather than outlawing them, you make it illegal (if it wasn't already) for humans to do dangerous things with them.
But Is The Tech Too Clever By Half?
I think the bigger concern many who fear and want to limit Autopilot/FSD have is the concern that the tech isn’t perfect but people get lulled into thinking it is — thus, making it dangerous. Jennifer Sensiba rightly highlighted in a recent article that the IIHS has determined that driver attention has a tendency to slip as drivers get used to good, “Level 2” semi-autonomous driving tech. Years ago, I saw a similar conclusion from NASA — that even highly trained and motivated engineers could not keep their attention focused on something for very long if there was nothing for them to really do, nothing to correct or change. The mind has a tendency to wander, much more than we’d like to think. We are not good at carefully monitoring something for long if it mostly doesn’t need monitoring. Waymo also noticed this and decided to make its self-driving tech super duper safe in limited environments.
Having had the first version of Tesla Autopilot (for which this might have been the case) and the current version (but not the FSD beta), I actually don’t think this is something to be concerned about with the packages Tesla currently offers — for reasons that I’ll try my best to explain for those who don’t use the system every day. But before getting into that, I would like to highlight an older automobile safety advancement and how reactions to it bring some light to what we are seeing today with Autopilot/FSD backlash.
Are Seatbelts Really Safer? Or Do They Increase Risk?
Everyone knows that seatbelts improve safety and should be worn at all times. Or, wait, do they?
When I was younger, I remember hearing that it was actually safer in the back seat to not wear a seatbelt. Getting thrown into the back of the seat in front of you was supposedly less dangerous than having a firm seatbelt do damage to your waist and perhaps beyond. Whether or not that was ever true with very old-school seatbelt designs, it definitely isn't true today, but many people hold this misinformation in their heads. Other people hold other misinformation and faulty logic in their heads.
A 2017 study reported that 4 out of 5 people admitted to not wearing seatbelts in the back seat on short trips or in ride-hailing vehicles (e.g., Uber and Lyft vehicles). According to a 2015 study, about 11.5% of Americans said they didn’t wear seatbelts, while approximately 48% of automobile deaths were deaths of people who were not wearing seatbelts. The Wired article noting that adds the following context:
“Why? ‘Some people dislike the government telling them what to do,’ says Russ Rader, who heads up communications for the Insurance Institute for Highway Safety. Refusing to buckle up can be a political statement. Some, like your horrible high school boyfriend, complain the things are uncomfortable, or that other methods keep them safe, or that they don’t want to get trapped with one on.
“Some people can’t click it for medical reasons. Others just forget, or have pseudo-rational rules about when belts are appropriate—not on short trips, not when it’s sunny out, not when they’re passengers, and so on. (Never mind that more than half of fatal crashes occur within five miles of home, and that unrestrained passengers not only die in crashes but become flesh-and-bone projectiles, taking buckled-in drivers out with them.)”
More reasons people don’t wear seatbelts can be found here and here. The specifics are interesting, but not especially relevant to Tesla Autopilot. What’s relevant is that 1) some people are always going to have an aversion to new tech, and 2) they will find various ways to rationalize that aversion. Even in the case of seatbelts — which aren’t all that new and have an enormous amount of data backing up the point that they improve safety — a sizable portion of people think it’s less safe for them to put on a seatbelt in the back seat than to not do so.
We’ve shown before that people driving with increasingly advanced levels of Tesla Autopilot are much less likely to get into an accident (see the chart below). There are indeed some significant issues with the data comparisons there — the Tesla drivers should be compared to drivers in the same automobile classes, with the same age of vehicles, and on the exact same types of roads. However, just looking at the figures for the different levels of Tesla Autopilot software, the more advanced versions are correlated with a lower chance of accident. This is nowhere near the statistical level of proof seatbelt-related data offer, but it is definitely leaning in the right direction.
But Doesn’t Tesla Autopilot Make You Ignore The Road? No — It Makes You More Attentive.
Coming back to the biggest concern people have about Tesla Autopilot, introduced in the “But Is The Tech Too Clever By Half?” section above, let me explain a bit more about Tesla’s system today and why I think it improves safety rather than lulling the driver to sleep.
Actually, anyone who has spent much time driving knows that you can zone out while driving, and a large portion of drivers have fallen asleep briefly while driving at some point (or many times). From my experience, Tesla Autopilot does the opposite. I know, that’s confusing when you think about it on a basic level, but I’m not joking or playing word games here. There are three reasons for this:
1. If you have Autopilot engaged, Tesla quite frequently makes you move the steering wheel. I keep my hands on the wheel, but apparently so softly that I still get these requests to put some pressure on it.
You can easily zone out while driving without Autopilot, but you won't get any such reminders. If you have Autopilot on and zone out, the system will jog your attention before long. If you're really out of it and don't notice the visual prompt to move the wheel a bit (a note on the screen and a pulsing blue light), after a moment, the car will beep very loudly at you. These prompts are, in effect, frequent reminders to pay attention to the road.
(Admittedly, in the 2015 version of Autopilot I once had, such prompts were much less common. While the current system is a bit more annoying, I do think it adds considerably to driver and passenger safety.)
2. The system is not perfect. What this means is that you know you have to watch out for errors, and you also learn to watch out for a handful of other things, which again increases your attention on the road and on driving.
For example, if another car crosses my lane close enough to my driving path, the car will beep like crazy at me. It happens even when there’s no real risk of driving into the car crossing my lane — when it’s plenty far away for me to maintain my speed. After having my heart jump a few times from the loud warning, I’ve learned to pay extra close attention to any cars going across my lane even rather far away from me. The system makes me more attentive to cars crossing my path. So, yes, it is making me a more hawk-eyed, attentive driver rather than lulling me to sleep.
Another example is that the car doesn’t avoid potholes (yet). Knowing this, while I drive with Autopilot on, I actually have to focus more. This sounds illogical, but stick with me a moment. If I’m driving by myself, I’m likely to avoid potholes almost without thinking about it. It’s a natural instinct to just drive around potholes. With Autopilot on, I know that I need to pay extra close attention to upcoming potholes (or anything else that may be on the street) because I need to disengage Autopilot early enough and then drive around the potholes if I don’t want the extra bumps (I don’t) and don’t want to jerk the car to get around them.
There are other places, too, where I know the system is not totally smooth or natural, making me pay close attention in order to disengage and switch to normal driving when that's better. I've determined it's pretty much like a game. I now have to choose where to engage and where to disengage in order to achieve the smoothest and most enjoyable drive, which makes me more attentive to the road.
3. Tesla Autopilot gives me much more ability to scan the road and surrounding area. There is no doubt about it — Tesla Autopilot is exceptionally good at keeping you in the lane and changing lanes when directed to do so. Not having to worry about drifting into another lane, or steering into a car, the driver’s mind and eyes are given much more freedom to look around more and watch for unexpected, extreme risks. You can get a better view of the entire field, to put it in soccer/football terms. Staring a bit less at the lane markings in front of you doesn’t mean not paying attention; it means being able to pay attention to more things, more of the roadway, more potential cross-traffic coming up or fast-approaching cars from the rear.
Staring at the road in order to stay in the lane has a kind of numbing effect, part of the reason people have a tendency to fall asleep at the wheel. Not having to do that can improve alertness and reduce fatigue, two things that help a human drive more safely.
Yes, there are things to watch out for ("phantom braking," tricky intersections, confusing lane markings), but until the FSD system is basically perfect, that's part of the deal. Having all of these attention-improving and safety-enhancing benefits is a clear step forward, though, and they seem to decrease risk by somewhere between 2 and 10 times. I personally feel like Autopilot makes me a better and safer driver, and I have heard or read the same from many others. Of course, part of the equation is that we aim to be better, safer drivers. If our goal were to be thrill-seeking risk takers pushing the limits of technology to the brink, that would be a problem, but the solution for such people or behavior is laws against recklessness, not stifling helpful technology.
There’s actually more to write about all of this, but I’ll punt additional points to future articles.
If you’d like to buy a Tesla and get some free Supercharging, feel free to use my referral code: https://ts.la/zachary63404. You can also get $100 off of a Tesla solar product using that referral code. No pressure.