California Family Sues Tesla Over Fatal Crash


Nearly two years ago, Benjamin Maldonado was driving home from a soccer game on a highway in California with his teenage son, Jovani, in the passenger seat. Suddenly, the traffic in front of him slowed, so Maldonado put on his right turn signal and started to change lanes. It was then that he noticed a white Tesla Model 3 coming up fast on his right. He tried to swerve back into his lane, but the Tesla collided with the rear of his Ford Explorer Sport Trac. His vehicle rolled over and Jovani, who was not wearing a seat belt, was ejected. He died shortly thereafter at a local hospital.

His family is represented by attorney Benjamin Swanson, who obtained a 6-second video of the accident from Tesla after the crash. Oddly, Swanson has copyrighted the images from that video and shared them with the New York Times. CleanTechnica does not have permission to show those images or the video, but they can be found on the Daily Mail website.

Swanson has sued Tesla on behalf of the family. According to the New York Times, the complaint alleges that Tesla’s Autopilot system is defective and failed to react to traffic conditions. The suit also names as defendants the driver of the Tesla, Romeo Lagman Yalung of Newark, California, and his wife, Vilma, who owns the car and was in the front passenger seat at the time of the collision.

The video clearly shows Yalung’s Tesla Model 3 moving faster than the surrounding traffic and passing several cars on the right. The data from the car shows it was traveling 69 mph, then accelerated to 70 mph shortly before the crash. One second before the collision, it slowed dramatically and then crashed into the Explorer.

The Aftermath

First things first. We want to be very clear that what happened to Jovani is a tragedy for the Maldonados. No parents ever expect a child to die before they do. When it happens, it is a soul-searing experience that explodes their world and leaves a hole the size of the universe in their lives. The Maldonados are grieving and there is nothing the courts can do to assuage their pain. Money? Pffft. They would gladly part with all their worldly possessions if that would bring Jovani back to life.

The video quite clearly shows the Tesla going too fast for conditions. Why didn’t Romeo Yalung or Autopilot recognize the danger of driving that fast or see the Maldonado vehicle moving to the right until it was too late? Ryan McCarthy, an attorney for Tesla, notes, “The police faulted the Tesla driver — not the car — for his inattention and his driving at an unsafe speed.”

The New York Times contacted Raj Rajkumar, an autonomous driving expert at Carnegie Mellon University, who reviewed the video and data from the accident. (After a crash, all Teslas automatically save the video and data from the 6 seconds prior to the collision.) Rajkumar tells the Times that Autopilot might have failed to brake for the Explorer because the Tesla’s cameras were facing the sun or were confused by the truck ahead of the Explorer.

Tesla recently removed radar sensors from its cars, but this accident occurred at a time when the company was still using radar. Nonetheless, Rajkumar adds this enigmatic coda to his remarks: “A radar would have detected the pickup truck, and it would have prevented the collision. So the radar outputs were likely not being used.”

Was Autopilot Engaged?

There is nothing in the New York Times article that clearly states Autopilot was operating at the time of the collision. Tesla, in line with its normal practices, has declined all press inquiries.

People often confuse Autopilot and Tesla’s adaptive cruise control, which matches the car’s speed to that of the car ahead. There may have been more going on here than we know at the moment. All we can say is that the video of the Tesla rocketing along while the traffic all around it is slowing strongly suggests the driver was not paying attention to the road. Tesla urges all its customers to pay attention at all times, whether Autopilot is engaged or not.

Comparative Negligence And Grandstanding

Personal injury law has long recognized the doctrine of comparative negligence. In most situations, no one party is entirely blameless and no one party is entirely at fault. The courts try to determine the degree to which each party is responsible. For instance, if the jury awards an injured party $1 million but that person was 20% at fault, the actual amount recovered will be $800,000.
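For readers who like to see the arithmetic spelled out, here is a minimal sketch in Python (a hypothetical helper written for this article, not anything from the court filings) of how a comparative-negligence reduction works:

```python
def comparative_negligence_recovery(award: float, plaintiff_fault: float) -> float:
    """Reduce a jury award by the injured party's share of fault (0.0 to 1.0)."""
    if not 0.0 <= plaintiff_fault <= 1.0:
        raise ValueError("fault share must be between 0 and 1")
    return award * (1.0 - plaintiff_fault)

# The example from the paragraph above: a $1 million award, injured party 20% at fault
print(comparative_negligence_recovery(1_000_000, 0.20))  # 800000.0
```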

Let’s think about that for a moment. Jovani Maldonado was not wearing his seat belt. Did that contribute to his death? Should the passenger door of the Explorer have opened when it was hit in the rear? Arguably not, which suggests Ford could get dragged into this.

The fact that Ford was not made a defendant is curious. Usually, an attorney will sue every person or corporation under the sun. The idea is to get as many insurance companies as possible involved so they can fight about who should pay and how much. It’s how the law game is played. But in this case, the attorney has apparently given Ford a free pass.

Then there is the behavior of the attorney. Releasing the video to the New York Times shortly after filing suit is not customary behavior. It’s as if the lawyer is looking to try the case in the media to force a quick settlement rather than waiting for the case to wend its way through the courts, a process that often takes years. Is there a hidden agenda here we aren’t aware of?

The Problem With Autopilot

Let’s come right out and say it. Elon Musk has been warned over and over and over again that the term “Autopilot” is misleading to many people — even if it does accurately describe how an autopilot in an airplane works. Yet Musk steadfastly refuses to change the name, relying instead on warnings in the owner’s manual.

Humans have a built-in defect. We are suspicious of new technology at first, but once we develop a comfort level with it, we want to believe it will always work perfectly. Once a car starts driving itself, there is little reason to remain fully engaged in watching the road. Our attention wanders. Some people actually fall asleep. Others climb in the back seat to take selfies of the car driving itself.

Elon Musk is not a stupid man, yet he insists on doing things his way, and no amount of protest can make him alter his chosen course. That is both his greatest strength and his greatest weakness. For years, he has decreed that the technology in his cars will not track the driver’s behavior. It would be so simple to observe a driver’s eyes for signs of drowsiness, inattention, or intoxication, but Musk wouldn’t hear of it. That policy, though, is at last evolving.

To be fair, Americans are dying in record numbers in highway accidents. One can argue that more people have been saved by Tesla’s Autopilot technology than have been hurt or killed by it. The problem is similar to battery fires versus gasoline fires. There are so many of the latter that the press has stopped reporting them. But let a battery in an electric car catch fire and the press immediately begins shouting about it from the rooftops.

The same thing is true of collisions involving self-driving cars. Every one is inspected, dissected, and scrutinized to a fare-thee-well. Now the US government requires every manufacturer to report any traffic accident in which driver-assist technology plays a role. New regulations are being prepared as we speak, and when they appear, we will have Musk and his stubbornness to thank for them. A year ago, a German court ruled that the term “Autopilot” is misleading to consumers and banned its use. Musk was unmoved.

The Takeaway

The loss of a young person’s life is a horrible, traumatic event. Yet it seems there is more than meets the eye in this story. Personally, as a retired attorney, I am uncomfortable with the media offensive initiated by the Maldonados’ attorney. It feels like working the refs in a sporting contest. If your case is solid, let the legal process work as it’s supposed to. There is something unseemly about how the New York Times allowed itself to be manipulated in this way.

The full picture will emerge — eventually — and it’s likely the story will get amplified and clarified over time. The takeaway may be that people who misuse Tesla’s Autopilot or any other self-driving technology should face jail time. That just might get people’s attention and make technology abuse less prevalent. If someone is driving much faster than the cars around him and causes a fatal accident, why should he expect anything less?



Steve Hanley

