


Published on November 1st, 2018 | by Steve Hanley


Tesla Autopilot Lawsuit May Affect Autonomous Driving Rules



Let’s say you are driving down the highway at 80 mph when a stopped vehicle suddenly appears in your lane. What do you do?

A. Swerve?

B. Brake hard?

C. Crash into it and blame the autonomous driving software in your vehicle for the collision?


If you are Shawn Hudson, the correct answer is C. Hudson was driving along a Florida highway in his Tesla Model S in early October, with Autopilot engaged and his speed set to 80 mph, when his car crashed into the rear of a Ford Fiesta that had stalled in his lane. Hudson’s car was heavily damaged as a result, although he escaped with no injuries.

Release The Hounds!

Unimpressed that his Tesla did a fine job of protecting him thanks to its superior crashworthiness, Hudson hired a lawyer and sued Tesla, claiming its Autopilot system failed to perform as advertised. “Through a pervasive national marketing campaign and a purposefully manipulative sales pitch, Tesla has duped consumers” into believing that Autopilot can “transport passengers at highway speeds with minimal input and oversight,” the lawsuit says, according to a report in Ars Technica. Where have we heard this tale before?

Contacted about the suit, Tesla emailed this comment to Ars Technica (we didn’t reach out separately, since Tesla provides the same statement to all outlets in controversial cases such as this):

“We don’t like hearing about any accidents in our cars, and we are hopeful that those involved in this incident are recovering. In this case, the car was incapable of transmitting log data to our servers, which has prevented us from reviewing the vehicle’s data from the accident. However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed.

“When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner’s Manual and Release Notes for software updates.”

Word Of Mouth vs. Reading The Manual & Paying Attention To The Prompts

Stuff and nonsense, says Hudson in his suit. He claims he heard about the wonders of Autopilot and went to a Tesla store to find out more. “Tesla’s sales representative reassured Hudson that all he needed to do as the driver of the vehicle is to occasionally place his hand on the steering wheel and that the vehicle would ‘do everything else,'” the lawsuit claims. Hudson says he was “relaxing” during his morning commute at the time of the crash.

His lawyer, Mike Morgan, stated during a press conference announcing the legal action, “If this had been something more substantial than a Ford Fiesta, he wouldn’t be here. Hudson became the guinea pig for Tesla to experiment their fully autonomous vehicle.” He says the car’s owner’s manual states, “you can engage it over 50 miles an hour, but if you engage it over 50 miles an hour, it’s got trouble finding stationary objects and stopped cars. To me, that’s a big problem. To me, that means you’re selling nothing.”

“The Law Is A Ass”

No less a personage than Charles Dickens weighed in on the majesty of the legal system some years ago in Oliver Twist. In one passage, Mr. Bumble, a character in the novel, says, “If the law supposes that, the law is a ass — a idiot.” The redoubtable Mr. Bumble may have been right.

The history of liability law in the United States traces back to the famous case — beloved by law students everywhere — of MacPherson v. Buick Motor Company, in which future Supreme Court justice Benjamin Cardozo ruled that customers could sue manufacturers directly for injuries related to the use of their products. Prior to that case, decided just over 100 years ago, manufacturers were insulated from liability because they had no direct dealings with the consumer, a relationship the law liked to call at the time “privity of contract.”

They made stuff — like automobiles — which they sold to dealers. The dealers in turn sold them to the end user. Since there was no contractual relationship between the manufacturer and the consumer, how could the manufacturer be held responsible if its products caused harm? It didn’t take Cardozo long to figure out the answer to that question and then make it the law in New York state. Much of modern American product liability law grew out of the MacPherson case, which opened the floodgates to the tsunami of tort litigation that bedevils us today.

The False-Positive Dilemma

Tesla is not alone in struggling to design autonomous driving systems that can navigate on their own. The problem is designing them so they are not constantly applying the brakes whenever something unexpected happens. Today’s systems have difficulty discriminating between a piece of paper on the roadway or a plastic bag being blown across a travel lane and an actual car or — God forbid — a pedestrian. To get around that issue, most emergency braking and self-driving systems today are simply instructed to ignore those inputs — which are known as false positives — and continue on without pausing.
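To make the trade-off concrete, here is a deliberately simplified sketch of the kind of filtering logic described above. All names and thresholds are hypothetical illustrations, not any manufacturer’s actual code: the point is only that a system tuned to suppress false positives at highway speed will also suppress a real stopped car.

```python
def should_brake(ego_speed_mph: float, target_speed_mph: float,
                 distance_m: float) -> bool:
    """Decide whether to brake for a radar return ahead.

    A naive forward-collision filter: at highway speed, a stationary
    return is far more likely to be an overpass, a sign, or wind-blown
    debris than a stopped car, so the system simply ignores it. That
    suppression is the false-positive dilemma in miniature.
    """
    closing_speed = ego_speed_mph - target_speed_mph

    # Illustrative rule: above 50 mph, discard near-stationary targets.
    if ego_speed_mph > 50 and target_speed_mph < 5:
        return False  # stationary object filtered out as a false positive

    # Otherwise brake when closing fast on a nearby target
    # (illustrative distance threshold scaled by closing speed).
    return closing_speed > 0 and distance_m < 2.0 * closing_speed

# A stalled car ahead at 80 mph is filtered out, even though it is real:
print(should_brake(80, 0, 60))   # stationary target ignored -> False
print(should_brake(80, 40, 60))  # slow-moving car ahead -> True
```

The same filter that prevents the car from slamming on the brakes for every plastic bag is exactly what lets it sail into a stalled Fiesta.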

Tesla has been more aggressive about marketing its Autopilot than most, as have Tesla owners. As of this moment, Tesla has a video on the Autopilot page of its website that shows a person driving without a hand on the steering wheel. There are no disclaimers in the video advising viewers that they must remain attentive at all times. Elon Musk has gotten very upset with people who dared question the capabilities of Autopilot, arguing that such carping will result in more highway deaths if people get spooked by the negativity and decide to not engage Autopilot whenever they can.

The Beta Tester Conundrum

Attorney Mike Morgan, despite his grandstanding, has a point. While Tesla owners may arguably consent to being voluntary beta testers for the company (and even pay a premium for it), drivers of the other cars on the road — such as the owner of the Fiesta that Hudson’s Model S collided with — clearly are not part of the Faustian arrangement Tesla and its customers have entered into.

Now, here’s a fine legal point you aspiring attorneys out there can wrestle with. Can a Tesla employee expand the representations made by the company orally? Let’s take an example. A roofing company sends a representative to a home. The salesman presents the homeowner with a brochure promising a 10 year warranty on all new roofs installed by the company. During the sales presentation, the salesman says, “All our roofs come with a lifetime guarantee.”

When the roof starts leaking 10 years and one month after it is installed, is the homeowner covered by a warranty or not?

Courts have wrestled with issues like this for generations, and there is no generally recognized answer nationwide. Will the state court in Florida rule that Mr. Hudson could rely on the statements supposedly made by the Tesla salesman even if they go further than the company’s written materials? We simply don’t know.

Volkswagen Proposes An Alliance Of Manufacturers

When an Uber test vehicle ran over and killed a pedestrian in Tempe, Arizona, earlier this year, it set off alarm bells throughout the automotive industry. According to a report by Automotive News, Volkswagen is quietly engaging in conversations with up to 15 other automakers about standardizing autonomous driving protocols. The purpose of the talks is to insulate the companies as much as possible from liability claims.

An unidentified VW executive told Automotive News, “When you are involved in an accident, you have a better chance in court when you can prove that your car adheres to the latest technical standard. How do you create an industry standard? Ideally by getting others to use the same sensor kit and software, so for that reason an overarching cooperation between automakers is one of the options we are examining. The question is: How do we bring products to market that guarantee we made ourselves as small a target for damage claims as possible? Law firms are already in the starting blocks,” the executive said.

When asked if such an alliance would be similar to the Ionity consortium that is bringing high-speed charging stations for electric vehicles to Europe, the executive said autonomy is “another level of complexity entirely.” Ionity is a partnership between Volkswagen, Mercedes, BMW, Ford, and Shell and is basically just about getting high-speed charging stations installed across Europe.

“The gates of history turn on tiny hinges,” my high school history teacher told her students constantly. In 1911, a defective wooden wheel on Donald MacPherson’s Buick collapsed, throwing him from the car and injuring him. The case that grew out of that small incident, decided in 1916, changed American liability law forever.

In 2018, Shawn Hudson’s Tesla Model S drove into the back of a Ford Fiesta on a Florida highway while its Autopilot system was engaged. Could that incident have a similar impact on self-driving technology both in the US and in other countries? “We’ll see,” said the Zen master.










About the Author

Steve writes about the interface between technology and sustainability from his home in Rhode Island and anywhere else the Singularity may take him. His muse is Charles Kuralt — “I see the road ahead is turning. I wonder what’s around the bend?” You can follow him on Google+ and on Twitter.


