Tesla Adds Driver Monitoring To Interior Camera
As part of the latest software update, Tesla added driver monitoring capability to its Autopilot driver assist feature. From what we’ve seen, it appears to be a fairly robust system. This comes after a largely dishonest campaign of misinformation and a notable case of abuse.
Well well well. Look what we have here. pic.twitter.com/BakZlRfDFz
— Whole Mars Catalog (@WholeMarsBlog) May 27, 2021
“The cabin camera above your rearview mirror can now detect and alert driver inattentiveness while Autopilot is engaged. Camera data does not leave the car itself, which means the system cannot save or transmit information unless data sharing is enabled,” the update’s information screen says.
This Seems Aimed at a Rising Tempo of Safety Concerns and Misinformation
Tesla has taken a lot of heat from media and safety advocates over the last year on this front. Late last year, the pace of criticism picked up, with writers calling for the U.S. government to “do something” about Tesla’s driver assist features. Autopilot was already drawing a trickle of criticism, but the release of the FSD Beta got people more riled up.
The big concern? Studies have shown that driver attention tends to slip once a driver becomes accustomed to a system like Autopilot and begins to trust it too much. For this reason, most other manufacturers add some sort of eye-monitoring system to vehicles with autosteer functionality so it can alert drivers who aren’t paying attention, and possibly take the feature away from them so they can’t abuse it.
At the same time, though, I and others argued that the threat was largely hypothetical. Despite a great many miles driven, there have been very few wrecks attributable to inattentiveness, and most of those were the result of people very intentionally abusing the system, not just drifting off into lala land. The concerns raised seem legitimate, but we haven’t really seen them translate into accidents.
Later, the NTSB sent a letter calling for NHTSA to regulate driver assist features more tightly, citing concerns over Tesla’s implementation of Autopilot.
The last few months have been a lot wilder.
One of the big events that kicked this into overdrive was the crash of a Tesla Model S near Houston, Texas. The investigation ultimately found that there was indeed a driver in the vehicle’s driver seat, that no driver assist features were in use, and that the occupants probably weren’t even wearing seat belts. Unfortunately, it took weeks for that information to fully come out, and an early misstatement by a local constable led media to run with the idea that there was no driver behind the wheel.
While nearly everyone ran with the idea that there was no driver behind the wheel (because that’s what a public official initially said), the wildest example was Consumer Reports’ “testing” of Autopilot. They took a vehicle to a closed track and showed that the feature can be abused by buckling the seat belt with nobody in the seat, sitting on top of the buckled belt, and then hanging a weight from the steering wheel. Those steps bypassed two of Tesla’s safety lockouts designed to prevent exactly this kind of abuse.
Worse, a moron in San Francisco got arrested for abusing Autopilot after he had been doing it for months. Despite the arrest, he went right back to it after getting out on bail, showing off to a local news crew that he was riding in the back with no driver again, and nothing is stopping him from continuing to do this until his court date in July.
With all of this pressure and bad publicity, Tesla must’ve decided they needed to implement yet another safety feature to keep people from abusing Autopilot.
From What We Know So Far, The System Seems Pretty Good
Tesla’s camera monitoring system seems to have been in some sort of testing for months. A hacker who goes by the name “green” on social media has poked at the feature in the past, pulling data from the car’s computer about what the camera sees and superimposing that data over the video feed.
The information from the Tesla’s computer updates very rapidly, making it hard to see what’s going on. For that reason, it’s a good idea to pause the video to see what the computer concludes in any given frame. On YouTube, you can step through a paused video frame-by-frame using the comma and period keys on a keyboard.
What we see is a computer trying to make qualitative judgments about what it sees, and it can only express them as probabilities. A high probability of “Driver Eyes Nominal” is probably what allows Autopilot (and likely FSD Beta) to continue operating. If that statistic drops, and other statistics like “Phone Use” or “Driver Eyes Closed” are instead high, the system will probably issue a warning or shut off the same way it does when you ignore the “wheel nag.”
It’s also worth noting that you can’t easily fool it with photos. Putting up a picture of Elon Musk leaves it estimating the chance of good attention at only about 1.7%, which won’t allow Autopilot to continue.
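To make that logic concrete, here’s a rough Python sketch of how probability-based gating like this could work. To be clear, this is my own illustration, not Tesla’s code: the label names come from green’s overlay video, but every threshold, timing value, and function name below is an assumption.

```python
# Hypothetical sketch of probability-based attention gating.
# Labels ("Driver Eyes Nominal", "Phone Use", "Driver Eyes Closed") come from
# green's overlay video; the thresholds, grace period, and names are all
# invented for illustration and are NOT Tesla's actual implementation.
from dataclasses import dataclass

ATTENTIVE_THRESHOLD = 0.80   # assumed minimum P("Driver Eyes Nominal")
DISTRACTED_THRESHOLD = 0.60  # assumed maximum tolerated "bad" probability
GRACE_FRAMES = 90            # assumed grace period (~3 seconds at 30 fps)

@dataclass
class FrameEstimate:
    eyes_nominal: float  # P("Driver Eyes Nominal") for this frame
    phone_use: float     # P("Phone Use")
    eyes_closed: float   # P("Driver Eyes Closed")

def update_gate(frame: FrameEstimate, bad_frames: int) -> tuple[str, int]:
    """Decide what to do for one camera frame, escalating like the 'wheel nag'."""
    distracted = (
        frame.eyes_nominal < ATTENTIVE_THRESHOLD
        or max(frame.phone_use, frame.eyes_closed) > DISTRACTED_THRESHOLD
    )
    if not distracted:
        return ("ok", 0)              # attention looks fine; reset the counter
    bad_frames += 1
    if bad_frames < GRACE_FRAMES:
        return ("warn", bad_frames)   # flash an alert first
    return ("disengage", bad_frames)  # persistent inattention: shut Autopilot off

# A photo of Elon Musk reportedly scores only ~1.7% on attention, so it
# would trip the gate immediately and start escalating:
photo_spoof = FrameEstimate(eyes_nominal=0.017, phone_use=0.0, eyes_closed=0.0)
print(update_gate(photo_spoof, bad_frames=0))  # ('warn', 1)
```

The real system could differ in every detail; all we know publicly is that it gates Autopilot on probabilities like these.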
Good statistics like these and good real-world performance are two separate things, though. We will have to wait and see how well the system works for drivers before we decide whether it’s a good system. We also don’t know if there are still ways to fool it, perhaps in much the same way Homer Simpson tried to fool a court with fake eyes painted on his glasses.
My guess is that the system will be difficult but not impossible to fool.
This Was Done Voluntarily
While critics are already saying this was done too late, it’s important to note that the government didn’t have to clamp down for Tesla to do this. Instead of calling on Tesla to implement a driver monitoring system (DMS), I saw many people calling for government agencies to force the company to do it. The idea of voluntarily adopting these safeguards, or of taking alternative measures to address the risks of abuse and autonowashing, didn’t seem to be anywhere on their mental map.
In reality, this shouldn’t even be necessary. A system that requires at least two intentional actions (seat belt, wheel weight) to abuse really puts the responsibility on the abuser. Sadly, people will always find ways to surprise us with their stupidity, and Tesla had to do this to stop them.
This Ain’t Ovah
We shouldn’t assume for a second that this system will put an end to all abuse. Even if a picture of a person or images of fake eyes on glasses don’t defeat the system, stupid people will always find a way to mess with safety lockouts.
We need to keep taking action against abuse. Abusers should be prosecuted, not just left out on the streets when they continue to refuse to follow the law. People need to remind each other that driver assist features like Autopilot aren’t “self driving” or autonomous, and we need to speak out against companies that autonowash.