Is It Right For Tesla To Publicly Release Individual Driver Data To “Disprove” Claims Of Autopilot Malfunction? (POLL)


Tesla has in recent years demonstrated a willingness to release individual driver data to “disprove” claims about malfunctions of the company’s Autopilot feature (or of the vehicle in general). Does the company have the “right” to do so? To be clear, I’m not asking about Tesla’s legal standing, but about whether the company has the moral right.

I’ve heard from Tesla owners on both sides of this debate — from those who think that driver data should belong solely to the driver, and from those who think that Tesla has the right to use the data however it sees fit.

I’m writing this article because I’m curious to see what the broader community of Tesla owners (and non-owners) thinks, partly owing to the success of the survey we ran on the possible need for a HUD or instrument cluster in the Model 3 (results coming soon).

The idea for the article was kicked off by a piece that I saw on Teslarati, which has a few parts that I want to highlight here:

“Tesla vehemently stands behind the safety and reliability of its cars, citing how its ‘Autopilot has been shown to save lives and reduce accident rates.’ That comment came as a result of a request from The Guardian. In explanation as to why Tesla releases individual driver information to the media, the Tesla spokesperson added, ‘We believe it is important that the public have a factual understanding of our technology.’ …

“Tesla feels it has an explicit corporate need to stand behind its driving-assist Autopilot technology through public disclosures of individual driving data when a crash occurs. Individual Tesla drivers, on the other hand, express a desire to maintain the right to information privacy regarding their driving performance. And, while Tesla has disseminated individual driver information to the media following Tesla crashes involving its Autopilot system, it continues to deny data sharing with individual customers. Moreover, the company does not follow the commonly accepted research practice of gaining permissions from study participants prior to including them in a data set. …

“The technology available within a Tesla can provide information about the location of a driver’s hands on the steering wheel, if and when a driver’s door opens, and, importantly, the engagement and performance levels of autonomous technology. Tesla insists that it only releases specific driver data to the media when information has been misrepresented to the public.”

Even if that’s the case, though, that degree of information gathering is likely to make some owners wonder what Tesla may do with it in the future. Presumably, auto insurance companies would be happy to have access to the data, right? Who else?

Tesla’s position is understandable, though, as people love to blame others for their own mistakes. And Autopilot makes a perfect scapegoat, doesn’t it? “It wasn’t me, it was the computer.”

So, I’m of two minds on the matter. What do you think?




James Ayre

James Ayre's background is predominantly in geopolitics and history, but he has an obsessive interest in pretty much everything. After an early life spent in the Imperial Free City of Dortmund, James followed the river Ruhr to Cofbuokheim, where he attended the University of Astnide. And where he also briefly considered entering the coal mining business. He currently writes for a living, on a broad variety of subjects, ranging from science, to politics, to military history, to renewable energy.
