In this article, I explain why I don’t think Tesla Full Self Driving (FSD) will work for many years, based on five years of using the product.
My Experience With Tesla Autopilot And FSD Beta
I’m a data engineer and I’ve been developing software for 40 years for major Fortune 500 companies. I have very little experience in artificial intelligence (AI), so keep that in mind as you read this article. I could be 100% wrong. My family has three copies of the FSD Beta software, and I use it every day to drive in Florida or Colorado. I’ve used Autopilot or FSD Beta for five years and have seen little improvement on the highway and only some improvement in the city over that period. The product isn’t getting exponentially better — it keeps hitting issues that take years to resolve or are never resolved. If an intersection in Tampa is changed, FSD doesn’t register the change two years later, and I have to disengage every single day in the same spot. Even though thousands of cars have gone through that intersection millions of times, the car still thinks a particular lane is a left turn lane when it is not. Any 14-year-old could read the signs and tell that it isn’t.
The car has no local learning capability and no way for the owner to override what FSD thinks without disengaging. Compare this with teaching a 14-year-old to drive. You can tell the 14-year-old once that this is not a turn lane, and they will remember it and not make the same mistake a thousand more times. A 14-year-old drives fine without a billion dollars in compute and without needing to study a billion miles of data. They just learn a few rules and figure the rest out on their own. This 14-year-old can’t pass the bar exam (like ChatGPT can) or beat a chess grandmaster (like AlphaZero can), but they can safely drive a car.
A Few Issues I See
First, when the road differs from what the car’s maps say, the car trusts its maps, not its eyes. I remember when I first used a GPS 15 years ago: I followed its instructions too literally and drove into a ditch. I learned my lesson, but FSD has not yet learned to trust what it sees over what the maps tell it it should see.
Second, the car has no ability to learn locally. It is like a Borg from Star Trek that is controlled by a hive mind. What is the major difference between capitalism and communist or command economies? Decentralized vs. centralized decision making. Why did China move to mostly decentralized decision making and incentives 50 years ago? Because millions were starving when the farms were all state owned and centrally controlled. When they let the workers own the farms and businesses, China saw the greatest economic boom in the history of mankind. Back to Tesla — it is still doing all of its learning in one central location. That might not even be the right design — maybe the car needs to learn locally as the 14-year-old does.
Third, no thought has yet been given to the car–human interface. When the car is making a left turn into traffic, the driver needs to watch the traffic but gets no indication of whether the car intends to go for a gap or not. Maybe the screen says something, but I don’t know, because it isn’t safe to look at the screen during this maneuver. This is probably because Tesla is so optimistic that everything will be working in a few more months that it sees no reason to spend time involving the driver in the system. I wonder how long it will take them to realize this is an important function they need to add if they can’t get FSD fully working.
Fourth, the software doesn’t have parking lots figured out. FSD can’t park, and FSD can’t unpark. If you have ever navigated a parking lot around Christmas, you will realize this is not a simple problem to solve. People wave you into a space, and you need a way to thank people who are courteous to you. Once again, we don’t have any of these features even in beta. It seems like an external display would need to be added to signal other cars, possibly one in the front and one in the back.
Fifth, from watching all the zero-intervention drives on YouTube, it seems like FSD works significantly better in California and Texas than in the rest of the country. I assume this is because it is mainly trained there and the maps there are updated when they are wrong. If a 14-year-old learns to drive in California, they can drive in any state with little adjustment, but FSD seems to have trouble adapting to other states. I assume FSD is overtrained in those locations and works fairly well there, and that the reason Elon doesn’t realize this is that he doesn’t use FSD outside of Texas and California.
I’m sure some employees have tried to tell Elon some or all of the points I bring up, but for one reason or another he dismisses those concerns or has reasons to believe they aren’t valid.
I think there is a significant chance that FSD won’t be solved for another 10 years, and that this will be like Facebook wasting billions on the Metaverse before realizing it is on the wrong path.
Now, as a Tesla investor, and because I think FSD can save lives if we can get it to work, I hope everything I said in this article is wrong and all just a big misunderstanding.
If you want to take advantage of my Tesla referral link to get Reward Credits, here’s the code: https://ts.la/paul92237 — but as I have said before, if another owner helped you more, please use their link instead of mine. If you want to learn more about Tesla’s new referral program, Chris Boylan has written an excellent article on it.
Disclosure: I am a shareholder in Tesla [TSLA], BYD [BYDDY], Nio [NIO], XPeng [XPEV], Hertz [HTZ], and several ARK ETFs. But I offer no investment advice of any sort here.