Tesla Autopilot Mystery Solved — HW3 Full Potential Soon To Be Unlocked
In June 2019, after Tesla’s Autonomy Investor Day, we did a deep dive into Tesla’s HW3 system on a chip (SoC) that explored its various capabilities and potential. I may have geeked out a little and made that a bit too technical, so I will try not to repeat that mistake in this article. Long story short, HW3 is a total beast. It is very different from the NVIDIA chip Tesla was using in the previous generation. So, it was very surprising when Elon said that there was no big rush to retrofit existing cars because “right now” (back then) you would not notice much of an increase in performance over HW2.
For the next bit, you are going to need to take a good look at the image below, so take a moment to study it and return to it if necessary.
So, retrofitting from HW2 to HW3 in H2 2019 did not improve performance. What this likely means is that Tesla pretty much just took the existing Autopilot software designed for HW2 and emulated it to run on HW3. Now, for those of you unfamiliar with emulation, a good analogy is the movie Inception, but for computers. Imagine a Windows 10 computer running the Android operating system in a window: Android is then just a program on the computer, not the operating system itself.
In this case, HW3 emulated HW2 to get the existing Autopilot software to function. The only problem is, HW3 is supposed to run most tasks not on the processor or graphics card but on its Neural Processing Units (NPUs), which are not designed for direct software emulation and are probably not capable of it. In principle, the Graphics Processing Unit (GPU) and the Central Processing Unit (CPU) together are capable of emulation. However, the CPU and GPU components of HW3 are less powerful than the ones in HW2, so they physically cannot carry the emulation on their own. So, Tesla did move some tasks to the NPUs to make it work, but Autopilot needed a major rewrite of the base code to truly unlock HW3’s potential.
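To make the emulation idea more concrete, here is a minimal, purely illustrative Python sketch (this is not Tesla code, and the toy instruction set is invented for the example): a “host” loop interpreting instructions written for a different “guest” machine, one at a time. That per-instruction decode-and-dispatch overhead is exactly why emulated software runs slower than software written for the hardware it is actually on.

```python
# Toy emulator: a "host" Python loop interpreting a tiny made-up "guest" instruction set.
# Purely illustrative, not Tesla's code. Shows why emulation adds per-instruction overhead.

def emulate(program, registers):
    """Execute guest instructions one at a time on the host."""
    pc = 0  # guest program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "LOAD":        # LOAD reg, value
            registers[args[0]] = args[1]
        elif op == "ADD":       # ADD dst, src
            registers[args[0]] += registers[args[1]]
        elif op == "HALT":
            break
        pc += 1                 # every guest step costs extra host work (decode, dispatch)
    return registers

# Guest program: r0 = 2 + 3
guest_program = [("LOAD", "r0", 2), ("LOAD", "r1", 3), ("ADD", "r0", "r1"), ("HALT",)]
print(emulate(guest_program, {"r0": 0, "r1": 0}))   # {'r0': 5, 'r1': 3}
```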
On a side note, HW3’s CPU and GPU are more powerful than they need to be for the minor tasks the NPUs cannot handle. What this likely means is that those components were chosen to allow Autopilot to transition from HW2 to HW3. HW4 will likely have a much smaller GPU and CPU and either make more room for even more complicated neural nets plus higher-resolution, higher-frame-rate cameras or simply reduce the power requirements of the SoC.
Thanks to Third Row Podcast’s interview with Elon Musk, we now have confirmation of the above theory and some really juicy new details. All these months, Tesla was rewriting Autopilot’s base code behind the scenes and will soon(ish) push that update to all vehicles running HW3. This could even signal the end of major updates for HW2 and HW2.5 Autopilot systems.
Making the Neural Networks Work Collectively
The next thing we find out is what Tesla has been doing under the hood with all that extra neural network capacity. Basically, the first of two major improvements is intertwining the different systems and decisions so that the neural networks work collectively. In other words, the car will be better at predicting that A results in B rather than observing A and reacting only after seeing B. We made a short 30-second clip for the occasion.
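Here is a minimal sketch of the reacting-versus-predicting distinction. Everything in it is invented for illustration (the offsets, threshold, and horizon are made-up numbers, not anything Tesla has published): a reactive policy waits until a cut-in has already happened, while a predictive one projects a neighbor’s motion forward and acts before it happens.

```python
# Illustrative sketch only, not Tesla's code: reacting to B vs. predicting that A leads to B.

def reactive_policy(neighbor_lane_offset_m):
    """Brake only once the other car is already in our lane (< 1 m from our center line)."""
    return "brake" if abs(neighbor_lane_offset_m) < 1.0 else "cruise"

def predictive_policy(neighbor_lane_offset_m, lateral_velocity_mps, horizon_s=2.0):
    """Project the neighbor's lateral position a couple of seconds ahead, then decide."""
    predicted_offset = neighbor_lane_offset_m + lateral_velocity_mps * horizon_s
    return "brake" if abs(predicted_offset) < 1.0 else "cruise"

# A car 3 m to our left, drifting toward our lane at 1.2 m/s:
print(reactive_policy(3.0))             # cruise  (A observed, still waiting for B)
print(predictive_policy(3.0, -1.2))     # brake   (A observed, B anticipated)
```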
360° Camera View
The other change pertains to how Autopilot looks at the world and interprets information from the cameras. Elon once described a human driver as 2 cameras on a gimbal powered by a supercomputer: the eyes, the neck, and the brain. Here is how to visualize how Autopilot works now: imagine a person sitting behind a desk. He is tasked with drawing, on a blank piece of paper, the positions and trajectories of all the cars around your vehicle by looking at 6 different screens positioned in front of him. Hard work. Now, the new Autopilot system is one camera, a 360° camera. People sometimes joke that you need eyes in the back of your head, but imagine being able to see in 360 degrees and intuitively comprehend everything that happens around you. So awesome. Well, that is how the new Autopilot system works. It stitches the data from all the cameras together into a single 360° view. This should significantly improve the system’s ability to learn from driving experience.
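To illustrate the stitching idea, here is a small Python sketch with made-up camera counts and resolutions (not Tesla’s pipeline): instead of handing 6 separate images to 6 separate per-camera pipelines, the frames are combined into one surround view that a single network could consume.

```python
# Illustrative sketch only, with invented shapes: stitching per-camera frames
# into one 360-degree surround view instead of processing each camera separately.
import numpy as np

NUM_CAMERAS = 6          # assumed camera count for the sketch
HEIGHT, WIDTH = 96, 160  # toy per-camera resolution

# Simulated frames from 6 cameras, each covering ~60 degrees of the surround.
frames = [np.random.rand(HEIGHT, WIDTH, 3) for _ in range(NUM_CAMERAS)]

def stitch_surround(frames):
    """Naive stitch: place the camera frames side by side into one panorama.
    A real system would calibrate, warp, and blend overlapping fields of view."""
    return np.concatenate(frames, axis=1)

panorama = stitch_surround(frames)
print(panorama.shape)   # (96, 960, 3): a single 360-degree input instead of six separate ones
```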