Published on May 25th, 2018 | by Steve Hanley
Consumer Groups Say Tesla’s “Autopilot” Is Deceptive Marketing
Two consumer groups, the Center For Auto Safety and Consumer Watchdog, have sent a joint letter to the Federal Trade Commission alleging that the way Tesla uses the word “Autopilot” in its marketing is deceptive and potentially dangerous. The groups write, “Two Americans are dead and one is injured as a result of Tesla deceiving and misleading consumers into believing that the Autopilot feature of its vehicles is safer and more capable than it actually is.” (It might help the credibility of the Center For Auto Safety if it didn’t feature a lemon as part of its official logo.)
“After studying the first of these fatal accidents, the National Transportation Safety Board (NTSB) determined that over-reliance on and a lack of understanding of the Autopilot feature can lead to death. The marketing and advertising practices of Tesla, combined with Elon Musk’s public statements, have made it reasonable for Tesla owners to believe, and act on that belief, that a Tesla with Autopilot is an autonomous vehicle capable of ‘self-driving’.
“Consumers in the market for a new Tesla see advertisements proclaiming, ‘Full Self Driving Hardware on All Cars.’ They are directed to videos of Tesla vehicles driving themselves through busy public roads, with no human operation whatsoever. They see press releases alleging that Autopilot reduces the likelihood of an accident by 40%.
“They also hear statements like ‘the probability of an accident with Autopilot is just less’ from Tesla’s CEO, Elon Musk. Or they hear him relate Autopilot in a Tesla to autopilot systems in an aircraft. Such advertisements and statements mislead and deceive consumers into believing that Autopilot is safer and more capable than it is known to be.”
Musk has actually defended the use of the term Autopilot specifically because it is like aircraft autopilot. Perhaps a problem in that logic is that he knows much more about aircraft autopilot than the average Joe and realizes that it doesn’t mean the plane is completely flying itself. From Wikipedia: “An autopilot is a system used to control the trajectory of an aircraft without constant ‘hands-on’ control by a human operator being required. Autopilots do not replace human operators, but instead they assist them in controlling the aircraft.” Nonetheless, there’s a good chance the average person jumps to conclusions with such a term and assumes the technology is more capable than it is.
The consumer advocates buttress their argument by pointing out the Autopilot page on the company website features this statement: “Full Self-Driving Hardware on All Cars. All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.” That is followed immediately by a video of a Tesla driving in Autopilot mode that has these words on the screen: “The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”
Hoo boy. That is poking the bear big time. Elon Musk has called out journalists who dare say negative things about Autopilot, basically saying they will be complicit in the deaths of countless people if drivers are afraid of Autopilot and elect not to use it. But so far, the public has to take his claims that Autopilot is safer than a human driver on faith, since the data Musk sees is not available to mere mortals. That may change soon when Tesla begins releasing crash data on a quarterly basis.
This topic has generated a lot of interest here at CleanTechnica over the years. Many people support Musk in his efforts to make self-driving technology a reality, arguing that the disclaimers and the vehicle walkthrough Tesla provides at delivery should make it clear enough to buyers that Autopilot does not (yet) mean the cars can drive themselves without focused human oversight.
Others feel strongly that Musk tends to overhype Autopilot in much the same way he exaggerates production timelines (as the letter above claims). I believe Musk's critics make valid points, especially with Tesla automobiles plowing into the backs of bright red fire trucks on multiple occasions while the system was activated. Sometimes, it seems like Elon doth protest too much.
“The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of,” a Tesla spokesperson told the Los Angeles Times in an email. A spokesperson for the FTC said the agency takes all correspondence it receives seriously but declined to say whether it plans to open an investigation into the allegations.
The issue may have something to do with human foibles. A few months ago, a driver in the UK lost his license for 18 months after he was observed sitting in the passenger seat of his Tesla Model S while the car drove itself down the motorway. The driver seemed unrepentant afterwards, grumbling that he was just unlucky to get caught doing what lots of other Tesla drivers do.
Some owners seem to enjoy finding creative ways of fooling the computer into believing their hands are on the steering wheel when they aren’t. Rumors abound of people hanging a small weight on the wheel or squeezing a rubber ball between their legs and the wheel while Autopilot proceeds serenely along. What Elon and his supporters seem to underestimate is the almost limitless ability of people to do absolutely stupid things for no other reason than they can. The fault, dear reader, may not be in our Teslas but in ourselves.