Failed Attempt To Use Autopilot?
Given that Autopilot wasn’t on, another possibility is that the men were trying to abuse Autopilot but that it didn’t activate as planned, leaving only cruise control engaged instead.
The New York Times reports that the men were discussing Autopilot when they left the house, which lends some credibility to this accidental cruise control theory. Ultimately, though, we don’t yet know what happened and need more information (which we have requested from the investigators).
There Could Have Been A Driver
Another possibility is that someone was driving the vehicle and ended up in a different seat during the course of the wreck. Drivers who aren’t used to fast, powerful vehicles often don’t realize that you have to ease into that much power and learn the vehicle’s handling characteristics if you want to avoid an accident.
While hitting a tree is very unlikely to put someone into the back seat, it’s possible that the vehicle spun and hit another tree before coming to rest against the final one. Given the extensive fire damage, the photos can’t tell us much about that.
As with the other questions, we’ve requested more data to see whether that’s a possibility.
FSD Beta? No
In this case, the car did not have FSD Beta, so there’s no chance it was at fault or relevant at all.
When crashes like this occur in the future, keep in mind that there are relatively few beta testers (1,000–2,000) compared to the overall population of Tesla owners. That alone makes it statistically unlikely that the person in any given accident was an FSD Beta participant.
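As a rough illustration of that point, here’s a back-of-envelope sketch. The ~1,500 tester figure is the midpoint of the range above; the fleet size is an assumption for illustration only, not a figure from this article:

```python
# Back-of-envelope illustration: how rare is it for a randomly chosen Tesla
# on the road to be an FSD Beta car?

fsd_beta_testers = 1_500          # midpoint of the 1,000-2,000 range cited above
tesla_fleet_estimate = 1_000_000  # assumed rough global fleet size (illustrative only)

share = fsd_beta_testers / tesla_fleet_estimate
print(f"Roughly {share:.2%} of Teslas would be FSD Beta cars")  # ~0.15%
```

Even with a much smaller fleet estimate, the share of FSD Beta cars stays well under 1%.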
Even If Autopilot Had Been Abused, That’s Not Tesla’s Fault
It is possible to abuse Autopilot, but that doesn’t appear to be the case here. Even if it were, I don’t think it’s ethical to try to blame Tesla for that. To explain further, let’s go through what it takes to hack Autopilot and make it run without a driver.
Here’s a video explaining how people abuse Autopilot:
To do this, one could buckle the seatbelt and sit on top of it, or use a fake buckle to fool the car into thinking the seatbelt is in use. The YouTube video above shows that once Autopilot is activated with someone in the driver’s seat, it stays engaged even if that person then leaves the seat. Even if the seat sensor were set up to prevent that, putting enough weight on the seat (sandbags, exercise weights, etc.) would bypass it, too. Or, if you’re more savvy, the driver’s seat sensor is a simple switch rather than a variable weight sensor like the passenger seat’s, so you could just short it and make the vehicle think someone is always sitting in the seat.
If you start the system and then climb out of the driver’s seat while the car is moving, you wouldn’t need to use a stick to push the pedal like the guy in the video did (he was smart and didn’t test this on an actual road).
There are numerous videos on social media with people abusing Autopilot like this, so there’s really no room for debate at this point on whether such abuse of the system is possible. It definitely is possible, but didn’t happen in this case.
What we do know is that to abuse the system, one must bypass 2 or 3 safety features, and that really puts responsibility for anything bad that happens on the driver. You can’t put the work in to bypass multiple safety features and then claim you are a victim of Tesla. They tried to keep you away from danger, and you jumped multiple fences and put yourself there anyway.
Featured image by Tesla.