That Time My 7-Year-Old Son Taught Me Something Important About Autopilot
Over the years, I’ve had wheel time with everything Tesla sells, most recently the Model Y. I liked basically everything about the vehicle, but was disappointed that cleaning up a mess in the back seat (messes are common when you have kids) could be a major problem in some cases. What I didn’t expect was that my 7-year-old son might teach me something important.
“This Is The Tesla With The Self Drive!”
When I was seeing how the Autopilot feature had progressed over time, I told my son to watch the steering wheel. I didn’t go hands-off, because I know that’s not safe, but I did loosen my grip a bit so he could see that the wheel was moving itself to stay in the lane. When he saw that, he said something that showed me what a Tesla bubble I had been living in. I thought I was showing him something new, when in fact the internet had already introduced him to Autopilot.
“This car has the self drive? Wow! Hey, guys, look!” he told his siblings. “This is the Tesla with the self drive!”
I tried to tell him that Autopilot can only assist, and that the car can’t really drive itself, but he wasn’t having it. He told me he’d seen videos of Teslas driving themselves with nobody in the seat, and he was pretty sure this car could do the same if I just tried it. His older siblings understood what I was saying, but they, too, had seen TikTok videos of a Tesla without a driver, and nobody had ever told them it wasn’t safe.
Yeah, he’s only 7, and kids get some funny ideas, but kids are also a great gauge of what the general public knows about something these days. Unlike the writers and most readers here, most kids don’t have deep knowledge about driver assist features, autonomous vehicles, or the difference between them. If I started yammering on about SAE J3016 levels of driving automation, their eyes would probably glaze over (like most adults’ would).
They know what they’ve seen on the internet, and from what little they had gathered, my kids thought Tesla’s vehicles could drive themselves — and do it TODAY.
What I Thought Was Common Knowledge Just Isn’t
This was kind of like the time an elderly family member told me he was afraid that his iPhone was going to sue him one day. He was convinced that Apple had invented a form of general artificial intelligence, and that it was only a matter of time until Siri (not the software generally, but his own phone) “grows up more” and becomes self-aware. When that happens, we’re all in trouble, he said, because the robot phones won’t want to be slaves, so no more phones for us.
Like my 7-year-old, elderly people can get some funny ideas about technology, but his fear really isn’t that far-fetched if you don’t know much about how the technology works. After all, when Siri was first revealed, Apple called it “your intelligent assistant that helps you get things done, just by asking.” Later in the presentation, Scott Forstall said, “I’ve been in the AI field for a long time, and this still blows me away.”
The press ran with the idea that Siri and Google’s assistant were strong, general AI, and some people who didn’t understand that AI isn’t all-or-nothing concluded that their phones had come alive (or soon would).
That something similar could happen with Tesla’s Autopilot shouldn’t be a shocker, but until my kids confronted me with their “knowledge” of the system, I thought nearly everyone knew that Autopilot can’t drive unattended. It turns out that my bubble of friends and information differs drastically from that of the general public. Or at least the youth.
This Has Been Studied
If an old man thinks his phone might be alive, it’s unlikely that anyone would get hurt. Sure, emotional pain is possible if someone gets too attached to it, but even the latest versions of Siri and Google’s assistant don’t seem to have much in the way of real depth to their conversational abilities. One man did jokingly “marry” his phone in Vegas, but he did it to make a point and not because he was really in love with Siri.
There’s a lot more at stake when one puts a digital assistant in charge of 2 or 3 tons of rapidly moving metal, though. If Siri malfunctions, the worst case is that you have to Google something manually. If Autopilot malfunctions and there isn’t a human at the helm ready to take over, the consequences can be deadly. For example, one man died in Florida after ignoring repeated warnings from the vehicle that he needed to keep his hands on the wheel. Tesla updated the software to add more safeguards after this accident, but as Consumer Reports showed, it’s possible for people to bypass these safeguards if they choose to do so.
Obviously, when someone goes to the effort Consumer Reports did to fool Autopilot into driving without anyone in the driver’s seat, they know they’re doing something Tesla doesn’t approve of, so they bear responsibility for their own foolish actions. On the other hand, people likely wouldn’t do something this stupid if they knew how dangerous it really is.
Thus, someone’s mistaken belief that Autopilot is really safe on its own, and that the lockouts are just there to please regulators, can drive this misuse. After all, unless someone is suicidal, they wouldn’t knowingly do something so dangerous.
In the automotive industry, and to a limited extent in academia, this phenomenon has come to be known as “autonowashing.” Liza Dixon, the researcher who coined the term, says it’s like greenwashing — making something appear greener than it really is. With autonowashing, people (knowingly or unknowingly) spread the idea that a driver assistance system is more autonomous than it really is.
A number of manufacturers do this in their advertising. Tesla sells a “Full Self Driving” package that isn’t yet actually full self-driving, which could lead an uninformed customer to think they already have the real deal. This issue isn’t limited to Tesla by any means, either. Mercedes-Benz used the term “self-driving” in an advertisement, and numerous social media influencers and media outlets use similar terminology in videos, news stories, and other media.
The end result is that many people, including my kids until recently, don’t know the truth about the limits of SAE Level 2 driver assistance features, which can steer, accelerate, and brake, but still require a fully attentive human driver at all times.
You can learn a lot more about this issue in this Twitter thread:
HBD, #autonowashing! 🥳
One year ago today, “Autonowashing: The Greenwashing of Vehicle Automation” was published.
What is it?
What’s the objective?
What does it look like?
What are its effects?
Why did I do this?
Has it made a difference?
A thread 🧵 https://t.co/ahsRSN7ujg
— Liza Dixon ⚡️ (@lizadixon) May 8, 2021
Unlike many people who use the term, I’m not calling for harsh regulations against Tesla. I do think that we need to be more careful about how we communicate about driver assistance features, though. The better the public understands the limits of Autopilot and similar systems, the less they’ll abuse them. Less abuse will mean fewer accidents, which will help keep the regulation monkeys off developers’ backs.
Featured image by Tesla.