There’s a new study from the Massachusetts Institute of Technology (MIT) that looked at how Tesla owners interact with Autopilot. The study, “A model for naturalistic glance behavior around Tesla Autopilot disengagement,” has already been the source of many headlines claiming it found that all Tesla drivers become inattentive when using Autopilot — leading readers to think that every single Tesla owner using Autopilot becomes inattentive.
Eyes On The Road With Autopilot On & Off
The goal of the study was to model the glance patterns observed around driver-initiated, non-critical disengagements of Tesla’s Autopilot during highway driving. The study analyzed glance data from 290 human-initiated Autopilot disengagements, modeling glance duration and transitions with Bayesian Generalized Linear Mixed Models. The report summarized the results:
“The model replicates the observed glance pattern across drivers. The model’s components show that off-road glances were longer with AP active than without and that their frequency characteristics changed. Driving-related off-road glances were less frequent with AP active than in manual driving, while non-driving related glances to the down/center-stack areas were the most frequent and the longest (22% of the glances exceeded 2 s). Little difference was found in on-road glance duration.”
The report concluded that behaviors change before and after Autopilot disengagement.
“Before disengagement, drivers looked less on-road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead.”
The report also concluded that these changes in glance duration and patterns suggest that there is lower visual attention to the forward road when Autopilot is engaged compared to after disengagement to manual driving.
“This shift in visual attention may require better driver management systems to meet manufacturer recommended use.
“However, it is unknown what is a sufficient level of attention with automation to meet or exceed the safety threshold of a conventional vehicle. Looking off-road does not, however, automatically imply distraction or inattention because driver behavior cannot be assessed in isolation of the driving situation.”
What The Study Found
In a nutshell, the study found that Autopilot makes Tesla owners feel safe enough to glance away from the road longer than normal. A driver may feel safe enough to look at the passenger next to them while the car is in motion. The question is: are they doing this carefully and safely? The study doesn’t answer that question one way or the other.
Autopilot, it seems, gives Tesla owners a sense of security that a normal driver wouldn’t feel while driving. That is neither good nor bad in itself, but it is something both Tesla and its drivers need to take into account. Tesla’s focus is on safety first, and the company, which collects driving data to develop FSD, is probably aware of this and using it to create a safer system.
A solution to the potential inattentiveness of Tesla owners using Autopilot is a nag system, which Tesla has in place. However, Tesla can’t stop drivers from buying weights to trick the system. Nor can it stop companies from selling such devices. Perhaps this is where regulation would be best applied: stronger fines or penalties for drivers found using such hacks. There are many videos on how to trick other systems such as cruise control as well, so this could certainly be a much broader issue to remedy.
In December 2020, a new bill in Arizona proposed fining drivers $250 for tricking their cars into driving hands-free. The Drive reported on this and pointed out that drivers have been trying for years to disable various warnings and lockouts that are meant to keep their attention on the road. The bill, however, died in committee and was never enacted into law.
If it had become law, Arizona would have taken a major step toward placing the responsibility for paying attention to the road even more directly onto its drivers, which is where it should be.
What I Would Like To See A Study On
I would like to see a study on how Autopilot has saved lives. The data exists, and there are several videos of Tesla owners sharing accounts of how the software saved them from accidents. CleanTechnica has covered some of these, and it seems that nearly every day another life is saved by Autopilot.
Yet, none of the major outlets that are quick to run with “Driverless Tesla Crashes” or similar headlines seem to have any interest in the lives saved by the use of Autopilot. A study comparing accident data — how many accidents happen with and without Autopilot engaged — and looking into how Autopilot saves lives could be useful.
Yes, there’s a problem. Tesla and any other maker of any type of ADAS will run into it, and the problem is multifaceted. On the one hand, people are easily distracted, so Tesla implemented a solution to that: a nag. On the other hand, people are selfish and don’t think safety features apply to them, so they find ways to trick or even disable those features. This isn’t the fault of Tesla or of any company that creates a technology people find ways to abuse. The problem is that people will abuse it.
“Well, maybe Tesla shouldn’t make it then!”
Saying we shouldn’t create better systems because people will abuse them doesn’t help. The solution is to figure out a way to prevent or discourage people from abusing the system. However, regulation would need to be involved, and that would mean cutting through a lot of red tape in every state. Even then, there will always be people who break the law, abuse systems designed to help protect us, and do overly dangerous things.