Tesla FSD 12.3.3 Drives Without Intervention … Almost


With Tesla’s “Full Self Driving (Supervised)” package working much better and more naturally, our goal is to do a weekly video using it, trying to document how it is progressing (or not, whatever the case may be this time around). I may switch to doing live videos for a variety of reasons: you can 100% trust what you’re getting, it’s more exciting, it’s less time-consuming since I don’t have to edit(!), and we could potentially build in live commentary and interaction to influence the testing. For the time being, though, we’re going with what we’re more familiar with.

In this latest drive, the first 17 minutes or so come from one continuous segment, followed by another segment of approximately 8 minutes. I write about three portions of the video below the embedded YouTube player, but you can watch the full video for all of the details and nuance. I provide a lot of commentary throughout the video to explain what I thought about FSD’s approach to driving and how it differs from or matches my own. (Note: In the next video, we should return to David Havasi driving, but we will again try to log every note verbally.)

So, the one thing that caused me to disengage after about 17 minutes was the following: I was in a left-turn lane at a red light, as I should have been. It was actually the rightmost of three left-turn lanes. I had been sitting there for a while (the light was red), and then the car started edging to the right a little bit and the steering wheel started turning rapidly, as if it was going to leave the turn lane and move into the straight-ahead lane to my right. I don’t know why it started doing this. Maybe it was getting impatient and considering another route (we had been at the red light for about 30 seconds), but it unnerved and confused me, and since that was definitely not what I wanted the car to do, I disengaged. Then I started FSD again (there’s a brief cut in the video there), but after a moment, FSD started to do the same thing, so I disengaged and did not re-engage it until I had reached my destination and was ready to go to the next one. After thinking about it more, I still have no idea why it did that.

A little bit before that, there was a stretch where an I-75 overpass crosses the high-traffic road I was driving on. I’ve had sudden braking in that area in the past (though not on version 12), and I had experienced a little phantom braking shortly before that, apparently triggered by a big sign over the road or its shadow (13:09 in the video). So I decided to keep my foot on the accelerator through that area just to make sure there wasn’t any significant phantom braking in a dangerous situation. That was the only other intervention in that 17-minute drive before I canceled FSD at the red light for the reason noted above.

A third unusual occurrence in this video came at 18:45. I was driving through a big shopping center parking lot when a car pulled up next to the curb facing the wrong way, blocking me, and put on its emergency blinkers to show it wasn’t going anywhere. To make matters worse, there was a bollard in the middle of the road, between the two lanes going in opposite directions, so my car would have to move fully into the oncoming lane to get around this car. Insane human behavior and very disruptive. I almost decided to just disengage and handle it myself, but I let the car try to solve the problem itself, and it did! It handled the situation perfectly and got us back on track. Super impressive. I guess this is the kind of situation where FSD currently shines.

The only other issue I would briefly note from that drive came earlier in the video. The car moved a bit too far to the left side of the lane for my liking, as it has done before (one time, on I-75, that led to the car actually crossing the lane marking and straddling it for a while, inexplicably, until I disengaged). For a moment, it then also drove a little too far to the right. I don’t know why it did this, and it wasn’t extreme enough in this case to warrant disengaging, but it nearly was. If I were just using FSD for my own convenience, I definitely would have disengaged, and things like this would put me off using it at all.

At the end of the second segment, I also fully disengaged, but that was accidental: I was apparently holding the wheel too tightly as it was starting to turn. To be clear, though, I’m not sure how I or the car would have handled that situation if I hadn’t accidentally disengaged.

Is one or two interventions good? Of course, in many regards, it’s amazing tech, and it’s insanely good. However, from the perspective of this being used for robotaxis, we are far from that in my opinion. This is just one car and just a 17-minute drive, or 25 minutes if you add in the other segment. The rate of needed interventions has to be more like 1 in a billion. Yes, people could say these weren’t necessary interventions. I don’t know. They made me uncomfortable and unsatisfied enough to step in. Judge it how you will, but there are plenty of other cases where people definitely did need to step in or should have done so. The technological progress is amazing, but it’s also still very limited. Stay tuned for more on that in a separate article about a different scenario from a few days ago, and stay tuned for more Tesla FSD Supervised testing!

Any questions or other comments?



Zachary Shahan

Zach is tryin' to help society help itself one word at a time. He spends most of his time here on CleanTechnica as its director, chief editor, and CEO. Zach is recognized globally as an electric vehicle, solar energy, and energy storage expert. He has presented about cleantech at conferences in India, the UAE, Ukraine, Poland, Germany, the Netherlands, the USA, Canada, and Curaçao. Zach has long-term investments in Tesla [TSLA], NIO [NIO], Xpeng [XPEV], Ford [F], ChargePoint [CHPT], Amazon [AMZN], Piedmont Lithium [PLL], Lithium Americas [LAC], Albemarle Corporation [ALB], Nouveau Monde Graphite [NMGRF], Talon Metals [TLOFF], Arclight Clean Transition Corp [ACTC], and Starbucks [SBUX]. But he does not offer (explicitly or implicitly) investment advice of any sort.
