Elon Musk is a fierce believer in the ability of Tesla Autopilot to reduce traffic deaths and injuries. His company was the first to offer genuine self-driving features on a regular production car, adding the necessary hardware to every Model S manufactured on or after October 9, 2014.
The system gathers data constantly while an Autopilot-equipped Tesla is driving. Even if Autopilot is not engaged, it is working in the background, comparing what the computer would do with what the driver actually does. All that data is fed back to the company, where software engineers use it to improve the algorithms that make the self-driving system function.
Ever since Autopilot debuted, stupid people have done stupid things, trying to see whether the system really performs as advertised, or even better than advertised. There were reports of a driver on the autobahn sitting in the back seat of his Model S reading the paper while the car drove itself. Drivers have deliberately forced their cars into oncoming traffic just to see how the system would react. Recently, a driver in the UK lost his license for 18 months after he was photographed sitting in the passenger seat while the car drove itself. His response? He was "unlucky" to have gotten caught.
Anything that happens in or to a Tesla gets an outrageous amount of publicity. Every battery fire gets international press coverage, even though thousands of automobile-related gasoline fires occur every day and receive no press attention at all.
The same thing has happened with Autopilot. Some deaths have occurred while Autopilot was engaged, beginning in May 2016, when Joshua Brown was killed after his Model S collided with a tractor-trailer on a Florida highway. Most recently, Wei Huang was killed when his Model X slammed into a lane divider on a California highway.
Following that incident, Tesla got into a rather public dispute with the National Transportation Safety Board (NTSB). Officials from the NTSB were annoyed that Tesla released a passionate defense of the Autopilot system that included some details the NTSB wanted kept confidential.
During the Q1 Tesla financials call on May 2 (during which Elon Musk's behavior was widely described as bizarre), Musk noted that every time there is negative press about Autopilot, drivers use it less often. "That's not cool. That's why I get upset," Musk said. "The statistics are unequivocal that Autopilot improves safety."
But how does anyone other than Elon and his engineers know how unequivocal those statistics really are? Whatever stats Tesla is collecting are not available to the public. But that is about to change.
As part of the earnings call, Tesla announced it would begin making crash data public and update it every quarter. But, as Bloomberg aptly notes, "The value of Tesla's new metric … will depend entirely on how the data are provided." The point is simple, but worth emphasizing.
“For Tesla’s new safety reporting to be useful, the methodology must be clear and fair,” Bloomberg argues. “Ideally, the company would offer its data and analysis for independent review. Plenty of skilled academics would gladly volunteer. If Tesla does this right, they would have that data at their disposal to show once and for all whether driving with Autopilot engaged is as safe on the same roads, in the same conditions, as driving without it.”
“It’s possible that Tesla will find that driving on Autopilot actually reduces safety — perhaps because drivers get complacent about their responsibility to watch the road. In that case, Tesla should own up to the shortcomings and see what it can do to keep drivers on task. There should be a public discussion about how much safety we’d be willing to sacrifice for the convenience.
“However, if Autopilot is truly as good as Musk claims — if driver safety actually improves when Tesla owners flip the switch — then this move to make the data public is critical for changing laws, behavior and the public’s perception of what’s safe and what isn’t. Musk … has total control over the information that can put an end to the debates and make our roads safer. Let’s see how he does it.”
In the abstract, data proves or disproves nothing. If Tesla actually steps up and lets the world see its Autopilot data, it could affect how self-driving technology develops not only in the US but in other countries as well. Most of us are hoping the data actually supports Elon's claims that Autopilot saves lives. More than a million people die in road accidents around the world every year. If Musk and company can make a dent in that carnage, more power to them.