As a Tesla driver who has used Autopilot plenty (for years) and has had FSD Beta for a few weeks, let me try to answer this question first. I’ll offer my 2 cents on what I think is going on here. Then you can read Jennifer’s piece on various data, studies, and logic that explain why Level 2 driver-assist systems can lead to drowsiness and distraction.
First of all, a big part of the reason those of us using these systems don’t tend to get drowsy or distracted is that the experience is somewhat like a stimulating video game rather than a mundane driving chore. We are watching for where the system works well, where and why it doesn’t, what has changed with each software update, and how far we are from robotaxi-level driving — all interesting stuff that you might say adds some “gamification” to driving and makes us focus more.
Secondly, the system isn’t close to robotaxi-level autonomy. Perhaps some of the videos you see on Twitter or YouTube make you think it is, but I think anyone who has used these systems — especially FSD Beta — knows that the tech in our cars is not close to robotaxi capable. There are so many things we know the car can’t do well that we’re constantly on the lookout for them, either to 1) handle them well ourselves or 2) see if the system has learned a bit and can suddenly do them as well as a human or better. For example, just using non-FSD Autopilot, I know that the system doesn’t avoid potholes. So, to avoid potholes, I have to watch the road closely and disengage Autopilot before going over one — actually, I have to do so early enough that I can go around it without jerking the car (which isn’t really a good alternative to going over a pothole in many cases). Avoiding potholes when fully driving the car myself is almost mindless — it’s second nature. With Autopilot on, however, it becomes a kind of game: I have to pay more attention in order to act appropriately in time, and I also have to do more physically — disengage (I prefer to do so by pushing the stalk on the right of the steering wheel up), take back full control of the steering and acceleration, and steer around the pothole.
There are various other little things like this. Also, on the plus side, because I don’t have to focus nonstop on keeping the car in the lane, I do get less fatigued (everyone who has used Autopilot for a bit should notice this) and I have more freedom to scan the roadway around me, scan the crossroads, etc. — all of which is often more interesting and stimulating than trying to constantly make sure my car stays right between two white lines. (Yes, the latter is easy, but it’s also more tiring than many of us realize and assume.)
Just on the topic of FSD Beta, the system can be such a nutjob that it’s truly several levels above normal driving in terms of stimulation. You have to be super focused and alert, and you know it. Frankly, whereas basic Autopilot reduces fatigue a lot, I have no doubt that FSD Beta adds significantly to it.
I am clearly a bit biased, though, since I use the system. I think I also have the experience to answer the questions well, but certainly not for all situations and cases. Some people may use Autopilot for long empty stretches of roadway in which there is nothing to really do and they become drowsy and distracted. I could imagine that happening. To provide some other thoughts and information, let’s roll into Jennifer’s piece.
Oh yeah, I’ll also note that years ago I saw a reference to NASA research, conducted several decades back, finding that even a determined, well-trained, intelligent engineer couldn’t stay focused on something for long if there was nothing to do or fix. (Anyone who has seriously tried meditation knows this as well.) That has always been a warning against an autonomous driving system that is very good but not good enough to become a robotaxi. The point that is missed at this stage, in my opinion, is that someone using Autopilot/FSD Beta doesn’t have nothing to do, as I explained above — except perhaps in scenarios or places where this problem does arise, like driving for a long time on an Interstate highway where no real interaction or correction is needed for very long stretches. I’ve yet to experience such a drive, but I presume some might be possible.
Okay, that’s really the end of my comments. —Zach Shahan
In past articles, I’ve written about what I call The Level 2 Attention Problem. The Insurance Institute for Highway Safety (IIHS), like others who have conducted previous studies, found that people get distracted when a Level 2 system handles the steering and throttle, and get more distracted as they get comfortable with the system. NASA has also studied human-machine interactions over decades, and recently found the same thing again. The last link also references a University of Utah study that put random people behind the wheel of Teslas running Autopilot, and many of them fell asleep. Worse, they weren’t even aware that they were sleeping behind the wheel.
I’ve personally experienced this when using different L2 systems, including almost 5,000 miles on a Kia L2 system that functions very similarly to Autopilot. After a while, it becomes easy to “zone out” while the car does the steering and throttle control, and I found myself trying to fall asleep a number of times. What I realized pretty quickly was that taking the steering task over and leaving adaptive cruise control on was usually enough for my brain to wake back up and feel alert again.
Some People Don’t Believe Me, Or The Studies
Every time I’ve brought this up, readers have commented that they’ve never experienced this in their Teslas. Supposedly, using Autopilot makes them feel more awake and aware than usual, and they feel that they’re a much safer driver with the system than without, even on long trips. Even people I respect tell me things like this:
I’m a much better driver with LV2. Way less fatigue. EV charging stops help too.
— 🐶Earl of 💎 🐾 💰 (@28delayslater) October 27, 2021
I don’t think Earl is a bad guy who would intentionally deceive people about his experience with this, and many other Tesla owners I respect are saying the same things. But, at the same time, it flies in the face of what I’ve experienced and what the studies have shown.
How Can This Be?
The way I figure it, there are three possibilities:
- Tesla fanatics are lying about this.
- Tesla fanatics are actually getting drowsy, but don’t notice that it’s happening.
- Tesla fanatics are somehow different from a random sample of the population.
- [Editor’s note: Or, option #4, what I wrote at the top.]
Things For And Against The Lying Hypothesis
Occam’s Razor would tell us that this is the explanation to focus on, because it’s the simplest one. Megafans tend to really, really care about the company, and I’ve seen Tesla fans defend things that were obviously wrong, like building a fossil fuel power plant, because they really admire and look up to Elon Musk. If they’re willing to betray their beliefs about the environment to defend Elon Musk, then they could probably do anything, including tell lies, to defend him and the company.
There’s also the financial motivation to lie to defend Tesla. Many superfans own a lot of Tesla stock. Some own Tesla stock exclusively, and tell me that they think diversifying their portfolio is dumb. After all, Tesla is great and has a bright future, they say. I personally think staking one’s whole future on the performance of one company is a bad idea, but that’s not my call to make for others.
I’m just saying that if someone’s whole future and retirement is tied up in Tesla, they’d have a hard time not taking the company’s side at every opportunity.
BUT, I do have people I respect telling me that they’re experiencing greater levels of awareness and less distraction when using Autopilot, so, for the sake of argument and out of respect for these people, I’m going to discount this possibility for now.
For & Against The Unaware Slumber Hypothesis
The University of Utah study shows that people, even in a Tesla, fall asleep and don’t even know that it happened. The university said it halted the study but then figured out how to make it safer. Researchers and lab assistants say they don’t think the changes were enough, but it’s possible that the original study was flawed, and the subsequent study’s results aren’t available for comparison.
So, for the sake of argument and because there’s some reasonable room for doubt here, let’s shelve this one.
Editor’s note: It’s actually a ridiculously high number/percentage of drivers who report falling asleep once in a while during driving. We’ve written about that here. It’s a problem that goes well beyond Autopilot, and it may be amplified by Autopilot in easy driving situations like I noted higher up, but confirming that would require good scientific research that controlled for several key factors/potential influences.
Possibility #3: Tesla Fans Really Are Experiencing Something Different
One other possibility here that I think merits further investigation is the possibility that Tesla fans are experiencing something different than the randomly-chosen people who have been in these studies.
And really, this is a reasonable possibility. One of the big problems with polls and studies over the years has been the issue of non-random sampling. If you really want to know what the general population thinks or experiences, a truly random sample will give you a very good idea of that without talking to millions or billions of people. That’s why polls are generally pretty close rather than wildly off.
But then again, we’ve seen that polls can be catastrophically wrong at times. Why? Because they didn’t get the random sample they were hoping for. For example, a polling outfit can call random home phone lines, but this excludes younger people who don’t have a landline, so the poll tells us more about older people than it does about the general adult population. Other factors like time of day, the use of mail, and even the wording of questions can all skew the results of a poll or study.
My Theory — You Guys Aren’t A Random Sample
I’d argue that Tesla fanatics aren’t a random sample. People who really, really care about the company and the price of its stock are obviously going to be in a very different state of mind than the average “normie.” For the average normie hired by researchers to drive a Tesla on a boring highway in Utah, there’s just not much excitement or investment there. If they fall asleep or get distracted, and the system keeps them from becoming aware of it, they don’t care that much. People who are deeply mentally and financially invested in the company would usually (but not always) be more careful about what they’re doing.
In this way, the Safety Score might really be filtering people in a different way than Tesla intended. Instead of finding the safest drivers, they’re finding people who are the most motivated to participate in the FSD Beta program.
Me? I’m not interested enough in painting Elon’s fence to put in the time or effort to pass some safety score. I’m personally not that interested in owning a vehicle that drives itself unless there’s a bed, a bathroom, and a kitchen in the back. So, I don’t do what it takes to get a Model 3 and ace the Safety Score.
To determine whether any of this is true would require further study, but I think there’s a solid possibility that you folks who insist that Autopilot makes you more awake and aware might actually be telling me the truth.
Editor’s note: Jennifer did not see any of the notes I added above before writing this (of course). I’m curious to see how she responds, but in any case, she brings up some important and potentially relevant thoughts.