1st Real Drive With Tesla Autopilot (Including A Few “Warning Shots”)

I first experienced Tesla Autopilot last year, as I wrote in my Tesla Model X review. It was certainly cool and interesting, and a bit nerve-racking. But that was a short test drive on a California highway — not real use.

Well, a few guys and I recently got our first Tesla Model S for an intercity Tesla shuttle service we’re launching here in Europe (… called Tesla Shuttle), and the Tesla includes Autopilot.

The other day, one of the cofounders and I went to the city of another one of the cofounders (~2½ hours away) to pick up the Model S 85D (he went through all the trouble of getting the car leased, registered, photographed, stylized, and worn in a little bit, but the service will be based out of the city where I live). I then drove the Model S for an hour and a half or so back toward home base. On a real drive on the highway, and over a year after a few minutes of testing, I put Autopilot (version 1 hardware) to use in a real-life setting. Some of that experience stood out to me, so I felt compelled to jot it down before I forget. Sharing a few key points from this initial drive/test could benefit other Autopilot newbies down the road.

First of all, yes, Autopilot is cool. It’s almost as relaxing as a massage (okay, maybe not, but it’s more relaxing than driving yourself). It’s amazing how well it doesn’t drive the car off the road or make you feel like you’re in a pinball machine. It’s fun and exciting. And it definitely has a more futuristic feel than listening to a con man talk about the good old days of cancer-multiplying coal pollution and empathy-deficient racism, sexism, and bullying. It feels like the future society is headed into, not a past we’re trying to move away from … again.

But there were a couple of quirks that spooked me a tad bit on this longer introductory drive (which was solo).

After driving for a little while, I turned on the blinker to try to automagically change lanes. Sure enough, it worked! Cool! A split second later, however, I realized the car was trying to drive me into a pseudo-lane further to the left that was just about to turn into what would be a horrendous “driving” experience if the car made its way there.

It turns out, in my excitement watching the car work its magic, I didn’t think/realize that I had to turn off the left-turn signal to stop the car from changing lanes again. It also hit me that I had picked precisely the wrong moment to forget to turn off the signal, since a computer would see the rapidly disappearing lane on the left as an open lane rather than the disaster lane it could have been. That lane change test could have ended badly … very badly … but I was of course very attentive to what was happening (especially with this being my first attempt to use this option in the car), and I quickly grabbed the wheel when I realized the car was trying to turn into no man’s land.

Lesson learned. I hope.

Anyhow, the automagical lane change did work. However, later in the drive when I tried to change lanes like that again, it wouldn’t work, even at times when there were clearly wide openings for changing lanes. I’m not sure if the tech just wasn’t performing as it was raised its whole life to perform, if there was some factor in those situations that somehow told the car it wasn’t safe to change lanes (nothing obvious sprang to mind since almost no cars were around), or if there was some other factor at play keeping the car from changing lanes on its own at those times. I’ll experiment again in the coming days (going to and from Berlin for our next Cleantech Revolution Tour conference!), but will be extra, extra attentive and try to see if there’s more to learn there.

Another issue I had was quite a human one (or perhaps a Zach one, but I think I’m not so unique). Along with the autosteering, of course, there’s also the adaptive cruise control (which keeps the car going at a speed you want but also responds if you get too close to a car/truck/elephant in front of you). The adaptive cruise control is awesome and really makes driving more enjoyable — I think it’s my favorite obvious feature of Autopilot (there are also latent safety features I should probably appreciate more, but they only show up in certain challenging situations).

My second problem, however, relates to the combined use and then decoupling of autosteering and adaptive cruise control.

The thing is, several things can “kick you out of” autosteer (turn it off) while still leaving adaptive cruise control in place to work its Tesla magic. Grabbing the wheel to control the steering for a moment, changing lanes on my own, and perhaps other actions turned off autosteering a few times, but after that brief manual action and then returning to “normal driving,” I noticed adaptive cruise control was still working but didn’t notice that autosteer had been turned off. I didn’t realize initially that my brief actions had deactivated autosteer. (Frankly, it’s easy to check whether autosteer is on or off, as the steering wheel icon on the screen right in front of you is blue when autosteer is on but not when it’s off. But it apparently takes a little bit of time to get used to paying close attention to the color of that wheel.) In a couple of instances after a little muscle-powered steering deactivated autosteer, I started to sit back again and relax with the assumption that Autopilot was pulling off a 21st century Aladdin trick … but then I suddenly realized only adaptive cruise control was active, because the car started to drift out of its lane.

This happened a few times, and I’m sure it would be fully my fault as the driver if I really zoned out and didn’t notice the human perception error quickly, but it was still mildly discomforting, since I realized how easily a little reliance on the computer in the car made me complacent and made me feel more like a passenger than a driver. It’s just so easy to hand over control and attention to the lane markings in front of you once you get a mental serving of robotic driving.

Granted, a key lesson as a driver that I hopefully learned here is to separate adaptive cruise control and autosteer in my mind and to stay alert to the latter potentially being off even when the former is on, but the broader points are: 1) anyone just getting used to Autopilot should be very aware of this potential mental trick/lapse, and 2) I do think there’s a concern in the middle period of semi-autonomous driving, before fully autonomous driving arrives, where we have to be extra attentive to keep ourselves alert at all times despite not having to do anything for stretches of driving. (Note that NASA psychologists are concerned about this threat as well, and even say it’s impossible to remain adequately alert.)

Again, I will see how my experience shifting between adaptive cruise control with autosteer and adaptive cruise control without autosteer turns out on the next long drive, but suffice it to say, I will be extra attentive to when autosteering may have been subtly deactivated.

I had one more notable observation regarding Autopilot on this long-ish drive. There was an extra cool feature of autosteer (beyond just how freakin’ well it works) that I noticed (I probably read about it before but appreciated it much more in real life). On a long but fairly soft curve on the highway, autosteer was very smoothly guiding the car, but it also apparently recognized this was a more dangerous situation than the norm. On the screen in front of me, a soft blue light started pulsing, seemingly telling me to put my hands on the wheel and/or stay alert. I did so and the pulsing stopped.

I was paying close attention anyway, but this did make me feel like the system was fairly good at alerting me to different levels of challenge and the need to stay attentive. … That said, I hardly noticed the pulsing light. I was certainly attentive to the situation before I noticed the light, and I wondered if I would have noticed the light if I had been used to driving with autosteer on and started taking it for granted more. (Note: the day of this drive was the day we published this article: “US NTSB: Tesla Model S Driver In Fatal 2016 Autopilot Crash Was Warned 7 Times During Trip About Not Having Hands On Wheel.”)

In a more challenging situation, Tesla Autopilot would start beeping and tell me more directly to take over. I feel confident it would step up to the plate then, but it’s also nice to see that it is looking out for me in a non-obnoxious way when the drive is slightly more challenging than the norm.

I’m curious how my relationship with Autopilot will transform over time. Surely, I will get more used to the quirks and signals in different situations, and I will get accustomed to watching out for specific problems, but I may also get more complacent. Since I am quite concerned about the potential catastrophe of complacency + beta software, I hope I will be adequately attentive. Additionally, I plan to stop using Autopilot if I feel that I start to take my responsibility as driver too lightly. After all, under-acknowledging risks or bad habits — in the short term and long term — is a recipe for a nightmare.

More broadly, I definitely do wonder about the evolution of semi-autonomous software in the coming decade or so. Aside from the normal issues we’ve discussed at length, one interesting thing to consider is that cars like ours (which we didn’t buy from Tesla) go onto the used market and get sold to buyers without the normal orientation process from Tesla. Some buyers will carefully study how the tech works before using it. Others will “learn as they go,” which brings at least a handful of risks for these fresh owners. Let’s hope none of those risks lead to much more than having to quickly grab the wheel to keep the car on track.

Images by Tesla Shuttle



Zachary Shahan

Zach is tryin' to help society help itself one word at a time. He spends most of his time here on CleanTechnica as its director, chief editor, and CEO. Zach is recognized globally as an electric vehicle, solar energy, and energy storage expert. He has presented about cleantech at conferences in India, the UAE, Ukraine, Poland, Germany, the Netherlands, the USA, Canada, and Curaçao. Zach has long-term investments in Tesla [TSLA], NIO [NIO], Xpeng [XPEV], Ford [F], ChargePoint [CHPT], Amazon [AMZN], Piedmont Lithium [PLL], Lithium Americas [LAC], Albemarle Corporation [ALB], Nouveau Monde Graphite [NMGRF], Talon Metals [TLOFF], Arclight Clean Transition Corp [ACTC], and Starbucks [SBUX]. But he does not offer (explicitly or implicitly) investment advice of any sort.
