Yesterday, Twitter user Nick Howard asked Elon Musk when FSD Beta 9.2 would be released. Elon explained that there had been some unexpected last-minute issues but that it should go out in the next day or two. After this, the update was released to more people (some had already received it when he responded), and Elon shared more details on the improvements in FSD Beta 9.2.
Some unexpected last minute issues. Should go out in next day or two.
— Elon Musk (@elonmusk) August 15, 2021
- Clear-to-go boost through turns on minor-to-major roads (plan to expand to all roads in V9.3).
- Improved peek behavior where we are smarter about when to go around the lead vehicle by reasoning about the causes for lead vehicles being slow.
- v1 of the multi-modal predictions for where other vehicles expected to drive. This is only partially consumed for now.
- New Lanes network with 50k more clips (almost double) from the new auto-labeling pipeline.
- New VRU velocity model with 12% improvement to velocity and better VRU clear-to-go performance. This is the first model trained with “Quantization-Aware-Training,” an improved technique to mitigate int8 quantization.
- Enabled Inter-SoC synchronous compute scheduling between vision and vector space processes. Planner in the loop is happening in v10.
- Shadow mode for new crossing/merging targets network will help us improve VRU control.
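The “Quantization-Aware-Training” item is worth a quick illustration. In-car inference runs networks with 8-bit integer (int8) weights, and QAT mitigates the resulting rounding error by simulating it during training so the network learns to tolerate it. A minimal sketch in plain Python (the function name and numbers are my own for illustration, not Tesla's):

```python
def fake_quantize(values, num_bits=8):
    """Simulate int8 quantization: snap each float to the nearest of the
    2**num_bits representable levels, then map back to float ("fake"
    quantization), so the rounding error is present in the forward pass."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    scale = max(max(abs(v) for v in values), 1e-8) / qmax
    clamp = lambda q: min(max(q, qmin), qmax)
    return [clamp(round(v / scale)) * scale for v in values]

# In QAT, training uses the fake-quantized weights in the forward pass,
# so the network adapts to the rounding; gradients are passed through
# as if rounding were the identity (the "straight-through estimator").
weights = [0.8, -0.31, 0.002, -1.0]
print(fake_quantize(weights))
```

The payoff is that when the trained weights are later converted to real int8 for the car's inference chip, accuracy drops far less than with naive post-training quantization.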
After sharing these details, Elon was asked about a revised estimate on the FSD Button release. Elon said that it would be Beta 10 or 10.1. He also added that when Tesla went to pure vision, this set the team back for a short time. “Vision plus (coarse) radar had us trapped in a local maximum, like a level cap.” Pure vision, he pointed out, “requires fairly advanced real-world AI, but this is how our whole road system is designed to work: NN’s with vision.” In other words, humans use neural nets + vision, so that’s what roads are designed around.
Beta 10 or maybe 10.1. Going to pure vision set us back initially. Vision plus (coarse) radar had us trapped in a local maximum, like a level cap.
Pure vision requires fairly advanced real-world AI, but that’s how our whole road system is designed to work: NN’s with vision.
— Elon Musk (@elonmusk) August 15, 2021
Elon also shared that after FSD Beta v9.2 comes 9.3, then probably 9.4, and then maybe 10. He added that there are significant architecture changes in v10. (FSD Beta version “X” sounds cooler than version 10, in my random opinion.)
Unpacking the Details
I am no AI expert, but I’ll share what stood out to me most: the multi-modal predictions for where other vehicles are expected to drive. I don’t drive, but since I grew up being a pedestrian, I’ve developed instincts. I can actually sense when a car is behind me even if I have headphones on. It’s subtle changes in the air, the nuanced feeling of someone behind me — and that sense has saved my ass a time or two. I also continuously glance around at my surroundings — again, these are habits that I’ve developed over the course of a lifetime of being car-less.
Keeping that perspective in mind, I often rely on my instincts as to when a car around me will make its move. It’s just second nature — autonomous. I can also “feel” when the lights are about to change. And you would think this is a psychic gift, but the truth is that the subconscious is continuously recording and it remembers. I don’t magically know when the lights will change, but my subconscious uses data that it previously collected from the same trips I would walk and applies that knowledge to the current trip I’m on.
I suspect this is along the lines of what Elon meant when talking about multi-modal predictions for where other vehicles are expected to drive. When I am walking to the store or somewhere, my senses are always on high alert. There was that one time a guy ran a red light when I was in the crosswalk. I was not only aware of this, but my instincts urged me to slow down my pace, and the car actually came to a stop in the middle of the crosswalk where I would have been standing if I hadn’t slowed down. Tesla’s FSD system needs not only to see people, bikers, and cars well, but also to get better and better at predicting what they will do and responding to it, while remaining alert to other possibilities.
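My layman's reading of “multi-modal predictions” matches a common pattern in autonomous-driving research: instead of one guess, the network proposes several candidate futures per road user, each with a probability, and the planner weighs all of them rather than betting on the single most likely one. A toy Python sketch (every name and number here is invented for illustration, not Tesla's actual design):

```python
from dataclasses import dataclass

@dataclass
class Mode:
    label: str         # e.g. "continues straight", "turns right"
    probability: float # how likely the network thinks this future is
    path: list         # predicted (x, y) waypoints for this future

def most_likely(modes):
    """The single best guess, if the planner had to pick just one."""
    return max(modes, key=lambda m: m.probability)

def risky(modes, conflict_zone, threshold=0.1):
    """Modes that cross our path with non-trivial probability; a cautious
    planner slows down for these even when they are not the top guess."""
    return [m for m in modes
            if m.probability >= threshold
            and any(p in conflict_zone for p in m.path)]

lead_car = [
    Mode("continues straight", 0.7, [(0, 1), (0, 2), (0, 3)]),
    Mode("turns right across crosswalk", 0.3, [(0, 1), (1, 1), (2, 1)]),
]
crosswalk = {(1, 1), (2, 1)}
print(most_likely(lead_car).label)               # continues straight
print([m.label for m in risky(lead_car, crosswalk)])
```

This mirrors my crosswalk story above: the most likely future was the car stopping, but the low-probability future (it running the light) was the one worth slowing down for.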
The fact that Tesla is making a vehicle that is aware of its surroundings and has the same kind of senses (or will, when it’s perfected) as I do when walking is completely mind-blowing. As for the rest of the details, I agree with Dave Lee here: we need James Douma or someone like him to translate the rest. That’s a video I’ll be looking forward to.
Paging @jamesdouma… please translate for the rest of us.
— Dave Lee (@heydave7) August 15, 2021
A few days ago, the two gentlemen held a livestream to discuss Tesla FSD Beta and the future of autonomous driving. I wrote an article about the video, but it only covers a small part of it. The video is a goldmine of detailed information on a variety of aspects of AI, and I think it’s one of those that you have to watch a few times just to absorb the information. If you missed it, go back and catch up here.