CleanTechnica
Tesla’s Autopilot found partly to blame for 2018 crash on the 405
A Tesla rear-ended a fire truck that was parked because it was responding to an accident on the 405 Freeway in Culver City on Jan. 22, 2018. (Culver City Firefighters Local 1927)

NHTSA Upgrades Tesla Investigation Into Emergency Vehicle Crashes

Are the oranges really to blame?

The NHTSA investigation into Tesla's autonomous and semi-autonomous driving tech has been upgraded to an "engineering analysis" following reports that six additional crashes involving Teslas and first-responder emergency vehicles have been added to its scope, another sign of increased scrutiny of the electric car brand.

The probe, which was initiated last year, initially covered 11 crashes into stationary first-responder vehicles dating back to 2018. Those 11 crashes resulted in 17 injuries and one death, but as of Friday, June 10th, the scope appears to have expanded: NHTSA has reportedly identified and added six more incidents to the official investigation. Those crashes are believed to have led to 15 injuries (including one fatality), and they were reportedly added in part because they represent cases in which Autopilot "gave up" control of the car just before impact. It's worth noting, though, that Automatic Emergency Braking appears to have intervened in at least half of those crashes.

That’s not just random speculation, by the way. That’s from the NHTSA itself:

“The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”

The agency also revealed some details about the specific crashes, showing that, in most cases, it had reason to believe the Tesla drivers would have been able to see the stopped emergency vehicles an average of 8 seconds prior to impact, yet none of them took evasive action on their own. That's baffling to me, but you can read the report and let me know if you interpret it differently:

“All subject crashes occurred on controlled-access highways. Where incident video was available, the approach to the first responder scene would have been visible to the driver an average of 8 seconds leading up to impact. Additional forensic data available for eleven of the collisions indicated that no drivers took evasive action between 2–5 seconds prior to impact, and the vehicle reported all had their hands on the steering wheel leading up to the impact. However, most drivers appeared to comply with the subject vehicle driver engagement system as evidenced by the hands-on wheel detection and nine of eleven vehicles exhibiting no driver engagement visual or chime alerts until the last minute preceding the collision (four of these exhibited no visual or chime alerts at all during the final Autopilot use cycle).”

The way I read this — and I know I have to tread lightly here — is that many of these cases involved inattentive drivers who were abusing the Tesla Autopilot feature. Which, frankly, isn’t a new thing. Below is one idiot who kept hanging out in the back seat of his Tesla while on Autopilot, despite having been previously arrested for hanging out in the back seat of his Tesla while on Autopilot. (!) See for yourself.

Backseat Tesla Driver Unapologetic After Arrest

This is abuse, sure. This isn't what these systems are intended for. But the argument here is somewhat nuanced. It's not, as some Tesla/Elon Musk defenders might say, about manipulating statistics or calling out supposed edge cases when a Tesla drives itself into an airplane. It's about whether Tesla has done enough to prevent the abuse of those systems by cognitively impaired drivers (read: high/drunk) or even dubiously competent in-duh-viduals.

The NHTSA Probe Doesn’t End There

Tech magazine Engadget is also reporting that, while the NHTSA's probe into Tesla's self-driving tech is focused on emergency vehicle crashes, it isn't limited to them. The agency is looking into 191 additional crashes that didn't involve first responders. In at least 53 of those incidents, the agency found the Tesla driver to be "insufficiently responsive," as evidenced by their failure to intervene when needed.

To me, that seems to suggest that even drivers whom the system believes to be complying with directions to keep their hands on the wheel at all times are not necessarily paying enough attention to the world around them to keep themselves, and others, safe. And, of course, you have dozens of companies selling "Autopilot hacks" to defeat the "hands on wheel" warnings, and even people like this oxygen-thief who are blasting down the road at 86 mph with a piece of citrus manning the helm.

Hacking Tesla Autopilot | ORANGE TRICK

In addition to the probe's upgrade to engineering analysis status, the Wall Street Journal is reporting that the NHTSA investigation has been expanded under the guidance of Transportation Secretary Pete Buttigieg and now covers 830,000 vehicles. Or, to put it another way, nearly every Model S, 3, X, and Y that the company has sold in the United States since 2014. As WaPo explains, an engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year whether there will be a recall or the probe will be closed without further action.

However the NHTSA's investigation goes, I'll close this out by advising you all to stay safe out there. You never know when you're about to be rear-ended by some literal fruit driving a Tesla!

NOTE: a 2016 NHTSA investigation into Tesla Autopilot concluded that crash rates were reduced by 40% in vehicles equipped with the technology. Tesla itself cited the agency's findings in its own marketing until the study was later retracted as having been based on fundamentally flawed data.

Related story: New Research Finds Tesla Drivers 50% Less Likely To Crash

Written By

I've been involved in motorsports and tuning since 1997, and have been a part of the Important Media Network since 2008. You can find me here, working on my Volvo fansite, riding a motorcycle around Chicago, or chasing my kids around Oak Park.

Copyright © 2023 CleanTechnica. The content produced by this site is for entertainment purposes only. Opinions and comments published on this site may not be sanctioned by and do not necessarily represent the views of CleanTechnica, its owners, sponsors, affiliates, or subsidiaries.