On Saturday night, a 2019 Tesla Model S crashed into a tree in a suburb of Houston, Texas. The crash was severe enough to compromise the battery pack, starting a fire that took hours, and a call to Tesla, to extinguish. Normally a car crash, even one with a fire, wouldn't be newsworthy, since crashes happen all the time to vehicles of all types. In this case, though, investigators said they were certain there was nobody in the driver's seat at the time of the crash.
Harris County Precinct 4 Constable Mark Herman said that the vehicle was traveling fast when it encountered a slight curve in the roadway and failed to navigate it. The vehicle left the road, struck a tree, and caught fire. After contacting Tesla for more information, crews were able to extinguish the fire.
Once the flames were put out and the vehicle was safe to approach, investigators found human remains in the passenger seat and in the back seat.
Herman told KHOU that their investigators “feel very confident just with the positioning of the bodies after the impact that there was no one driving that vehicle.”
Many people on social media are raising other possibilities here, though. Could the person have been climbing into the back to escape the fire, and then sat down as they were overcome? Or were they thrown there during the collision (which would require the vehicle to have been traveling in the opposite direction, or spinning)? We are going to have to wait for further information from the investigators to know for sure.
CleanTechnica did put in a request for information to the investigators, and we will let you know what they respond with.
What Happened Here?
We don’t know what, exactly, the vehicle’s occupants did. Given that the vehicle burned for four hours and everyone who was there died in the crash or the fire, there’s nobody to ask and most of the evidence is gone. We may never know for sure what happened right before the crash.
Two men dead after fiery crash in Tesla Model S.
“[Investigators] are 100-percent certain that no one was in the driver seat driving that vehicle at the time of impact,” Harris County Precinct 4 Constable Mark Herman said. “They are positive.” #KHOU11 https://t.co/q57qfIXT4f pic.twitter.com/eQMwpSMLt2
— Matt Dougherty (@MattKHOU) April 18, 2021
If you look at the image, it's pretty clear that we won't be able to tell whether the driver's seatbelt was clicked with nobody sitting in the seat, whether there was an Autopilot defeat weight on the steering wheel (an exercise weight, an "Autopilot Buddy," an orange, etc.), or much else. All of that evidence was burned. Data from the event data recorder (EDR) may have survived, but after so many hours of intense heat, recovering anything useful seems unlikely.
The only thing we know from Tesla is that Elon Musk says Autopilot was not on at the time of the crash, and that the vehicle didn't have anything beyond standard Autopilot. It did not have the FSD Beta software that could have been engaged and driving the car on such a road.
Your research as a private individual is better than professionals @WSJ!
Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.
Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.
— Elon Musk (@elonmusk) April 19, 2021
Failed Attempt To Use Autopilot?
Given that Autopilot wasn't on, one other possibility is that the men were trying to abuse Autopilot, but that it didn't activate as planned. On a street without lane lines, attempting to engage Autopilot can leave the car running only Traffic-Aware Cruise Control, which maintains speed but does not steer.
The New York Times reports that the men were discussing Autopilot when they left the house, which lends some credibility to this accidental cruise control theory. Ultimately, though, we don’t know yet what happened and need more information (which we have requested from the investigators).
There Could Have Been A Driver
One other possibility is that someone was driving the vehicle and ended up in another seat during the course of the wreck. Drivers unaccustomed to fast, powerful vehicles often don't realize that they need to ease into that much power and get used to the vehicle's handling characteristics if they want to avoid an accident.
While hitting a tree is very unlikely to put someone into the back seat, it’s possible that the vehicle spun and hit another tree before coming to a rest against the final tree. Given the extensive damage from the fire, the photos can’t tell us much about that.
Like the other questions, we’ve put in a request for more data on this to see if that’s a possibility.
FSD Beta? No
In this case, the car did not have FSD Beta, so there’s no chance it was at fault or relevant at all.
When crashes like this occur in the future, keep in mind that there are relatively few beta testers (1000–2000) compared to the overall population of Tesla owners. That alone makes it statistically unlikely that a person in an accident was an FSD Beta participant.
Even If Autopilot Had Been Abused, That’s Not Tesla’s Fault
It is possible to abuse Autopilot, but that doesn't appear to be the case here. Even if it were, I don't think it's ethical to try to blame Tesla for that. To explain further, let's go through what it takes to hack Autopilot and make it run without a driver.
Here’s a video explaining how people abuse Autopilot:
To do this, one could buckle the seatbelt and sit on top of it, or use a fake buckle insert to fool the car into thinking the seatbelt is in use. The video above shows that Autopilot, once activated with someone in the driver's seat, stays engaged even after the person leaves the seat. Even if the seat sensor were designed to prevent that, placing enough weight on the seat (sandbags, exercise weights, etc.) would bypass that system too. Or, if you're more technically savvy: the driver's seat sensor is a simple switch, not a variable weight sensor like the passenger seat's, so you could short it and make the vehicle think there's always someone sitting in the seat.
If you engage the system and then climb out of the driver's seat while the car is moving, you wouldn't even need a stick to push the pedal like the guy in the video did (he was smart enough not to test this on an actual road).
There are numerous videos on social media with people abusing Autopilot like this, so there’s really no room for debate at this point on whether such abuse of the system is possible. It definitely is possible, but didn’t happen in this case.
What we do know is that to abuse the system, one must bypass two or three safety features, and that puts responsibility for anything bad that happens squarely on the driver. You can't put in the work to bypass multiple safety features and then claim you are a victim of Tesla. The company tried to keep you away from danger, and you jumped multiple fences and put yourself there anyway.
Featured image by Tesla.