Tesla Autopilot in action stopping at a stop sign. Photo by Zach Shahan, CleanTechnica.

Washington Post Asks Why Tesla Autopilot Can Be Used In Places Where It Shouldn’t Be






An article in the Washington Post on December 10, 2023, asks: if Tesla Autopilot is not intended to be used on roads with cross traffic, why does Tesla allow it to activate on those roads? It’s a fair question, one that involves a number of factors — from the attitudes of federal regulators to the desires of the world’s wealthiest man. Around the water cooler at CleanTechnica’s ultra-modern juice bar, there is plenty of disagreement on this issue. The fact that the Post story has been live for only about 12 hours and already has more than 3,600 comments attests to the level of interest — and controversy — surrounding it.

Autopilot On A Dark, Lonely Road

The story begins with a young couple who pulled their Chevy Tahoe off the road near Key Largo to look at the stars. The Post says, “A Tesla driving on Autopilot crashed through a T intersection at about 70 mph and flung the young couple into the air, killing one and severely injuring the other. In police body-camera footage obtained by the Washington Post, the shaken driver says he was ‘driving on cruise’ and took his eyes off the road when he dropped his phone.”

But the 2019 crash reveals a problem deeper than driver inattention, the Post says. It occurred on a rural road where Tesla’s Autopilot technology was not designed to be used. Dash cam footage captured by the Tesla and obtained exclusively by The Post shows the car blowing through a stop sign, a blinking light, and five yellow signs warning that the road ends and drivers must turn left or right.

You may have an opinion about Elon Musk, Tesla, and the vaunted Autopilot technology, and if so, good for you. But watch that video and then answer these questions:

  • Why was Autopilot able to be activated on that road?
  • Why did Autopilot fail to recognize a T intersection marked by a stop sign, a blinking light, and five yellow signs?

In user manuals, legal documents, and communications with federal regulators, Tesla has acknowledged that Autosteer, Autopilot’s key feature, is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.” Tesla’s user manual also advises drivers that the technology can falter on roads with hills or sharp curves.

Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software. The question is, why? If the car knows where it is and what road it is on, why is there not a provision in the software that prohibits Autopilot from engaging in circumstances where its use could be dangerous for drivers, passengers, and pedestrians?
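To make the question concrete, here is a minimal, purely hypothetical sketch of the kind of gating logic the Post’s sources describe. Nothing here reflects Tesla’s actual software: the RoadSegment class, the ODD_ROAD_CLASSES set, and the can_engage_autosteer function are invented for illustration. The only thing taken from the source is the operating envelope Tesla itself describes — controlled-access highways with a center divider, clear lane markings, and no cross traffic.

```python
# Hypothetical illustration only -- not Tesla's actual code or API.
# It sketches how a navigation-aware driver-assistance system *could*
# refuse to engage outside its stated operating design domain (ODD).

from dataclasses import dataclass

# Road classes where, per Tesla's own documentation, Autosteer is intended
# to operate: controlled-access highways with a divider and no cross traffic.
ODD_ROAD_CLASSES = {"controlled_access_highway"}


@dataclass
class RoadSegment:
    """Map attributes the vehicle is assumed to have via its navigation data."""
    road_class: str           # e.g. "controlled_access_highway", "rural_two_lane"
    has_center_divider: bool
    has_cross_traffic: bool
    lane_markings_clear: bool


def can_engage_autosteer(segment: RoadSegment) -> bool:
    """Return True only when the current road matches the stated ODD."""
    return (
        segment.road_class in ODD_ROAD_CLASSES
        and segment.has_center_divider
        and not segment.has_cross_traffic
        and segment.lane_markings_clear
    )


if __name__ == "__main__":
    interstate = RoadSegment("controlled_access_highway", True, False, True)
    rural_road = RoadSegment("rural_two_lane", False, True, True)  # like the road in the Key Largo crash

    print(can_engage_autosteer(interstate))  # True  -> allow activation
    print(can_engage_autosteer(rural_road))  # False -> refuse to activate
```

The point, echoed later in the article by former NHTSA chief Steven Cliff, is not that such a check is hard to express. It is that the map and navigation data needed to make it already exist in the car.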

Federal Regulators Disagree

Part of the answer may lie in a dispute between the National Transportation Safety Board and the National Highway Traffic Safety Administration. After the 2016 crash that killed Tesla Model S driver Joshua Brown, the NTSB called for limits on where driver-assistance technology could be activated. But as a purely investigative agency, the NTSB has no regulatory power over Tesla. NHTSA, which is part of the Department of Transportation, has the authority to establish enforceable auto safety standards, but its failure to act has given rise to an unusual and increasingly tense rift between the two agencies.

In an October interview, NTSB chair Jennifer Homendy said the 2016 crash should have spurred NHTSA to create enforceable rules around where Tesla’s technology could be activated. The inaction, she said, reflects “a real failure of the system. If the manufacturer isn’t going to take safety seriously, it is up to the federal government to make sure that they are standing up for others to ensure safety,” but “safety does not seem to be the priority when it comes to Tesla.” Speaking of NHTSA, Homendy added, “How many more people have to die before you take action as an agency?”

That sure sounds like a shot across the bow of NHTSA. In response, the agency said it “always welcomes the NTSB’s input and carefully reviews it — especially when considering potential regulatory actions. As a public health, regulatory and safety agency, safety is our top priority.” It then went on to say that verifying systems such as Tesla Autopilot are used only within the conditions for which they are designed would be too complex and resource-intensive, and might not fix the problem anyway.

Homendy was skeptical of that explanation, saying agencies and industries frequently respond to NTSB recommendations by citing the impossibility of their requests — until additional carnage forces their hand. NHTSA said it is focused instead on ensuring drivers are fully engaged while using advanced driver-assistance systems.

In court cases and public statements, Tesla has repeatedly argued that it is not liable for crashes involving Autopilot because the driver is ultimately responsible for the trajectory of the car. After a fatal crash in 2018, Tesla told NTSB that design limits for Autopilot would not be appropriate because “the driver determines the acceptable operating environment.”

Steven Cliff, a former NHTSA chief who left the agency last year, told the Washington Post that the approach taken by regulators can appear too cautious at times, but said the agency was aggressive under his watch and mandated that companies such as Tesla report their data on crashes involving advanced driver-assistance systems. But advancing from the data collection stage to a final rule, where new regulations are adopted if necessary, can take years. “Tesla’s philosophy is, let the operator determine for themselves what is safe but provide that operator a lot of flexibility to make that determination,” he said.

Autopilot Knows Where It Is

Cliff also said Tesla could easily limit where the technology can be deployed. “The Tesla knows where it is. It has navigation. It knows if it’s on an interstate or an area where the technology wasn’t designed to be used,” he said. “If it wasn’t designed to be used there, then why can you use it there?” Musk, for his part, once hung up on former NTSB chair Robert Sumwalt, who retired from the agency in 2021 when Homendy took over.

In 2020, NTSB issued a report on another fatal Tesla crash that cited both a truck driver who ran a stop sign and the Tesla driver’s “overreliance” on Autopilot as probable causes. NTSB also, for the first time, cited NHTSA itself, saying its failure to “develop a method” that would “limit the use of automated vehicle control systems to the conditions for which they were designed” contributed to the crash. In 2021, NTSB sent another letter to NHTSA about Autopilot, calling on the agency to “include sensible safeguards, protocols, and minimum performance standards to ensure the safety of motorists and other vulnerable road users.”

In one of her latest attempts to spur action, Homendy sent a letter directly to Musk in August 2021. She urged him to implement safeguards to “limit” the technology to conditions it was designed for, among other recommendations. “If you are serious about putting safety front and center in Tesla vehicle design,” she wrote, “I invite you to complete action on the safety recommendations we issued to you four years ago.” Musk never responded, she said.

Autopilot Controversy Abounds

My old Irish grandfather always claimed the most dangerous part of any automobile is the nut behind the wheel. Musk apologists like to say that Autopilot has saved far more people than it has harmed. Musk haters, on the other hand, hint darkly that Autopilot is designed to shut itself off seconds before a crash so the company can say with a straight face that the system was not active at the time. Both groups are probably partially correct.

But here’s where the rubber meets the road. The man severely injured in the Florida Keys asks, “How could they allow something like this on the road? It’s like a deadly weapon, just driving around everywhere. And people, pedestrians like me, how are we supposed to know stuff like this? It shouldn’t be allowed.”

More than once, the conversation at CleanTechnica has focused not on Tesla drivers but on the drivers and passengers in other cars — and pedestrians and bicyclists as well — who do not know they are actors in a giant computer simulation created expressly for the Great and Powerful Musk. Who speaks for them? Who protects their rights? Why are they drawn into the Tesla story without their consent?

No doubt, our readers will have lots to say on those topics. We can’t wait to read your comments.





Steve Hanley

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. He is proud to be "woke" and embraces the wisdom of Socrates, who said "The secret to change is to focus all of your energy not on fighting the old but on building the new." He also believes that weak leaders push everyone else down while strong leaders lift everyone else up. You can follow him on Substack at https://stevehanley.substack.com/ and LinkedIn but not on Fakebook or any social media platforms controlled by narcissistic yahoos.
