By Johnna Crider and Zachary Shahan
Around six months after Tesla’s Autonomy Day, Waymo CEO John Krafcik gave a presentation to German auto giants he was seeking to partner with. A video clip of the presentation, shared by Third Row Tesla, shows Krafcik explaining why Waymo shut down its “Autopilot” program.
On autonomy day, @elonmusk hit Waymo pretty hard revealing to the world for the first time that their project was bound to fail.
6 months later, Waymo hit Tesla back in a presentation to the German auto giants, who he seeked to partner with.
Warning: Video may anger AP fans pic.twitter.com/XUJV8wEg51
— Third Row Tesla Podcast (@thirdrowtesla) April 23, 2020
In the video, Krafcik says that Waymo had a program — named, of all things, “Autopilot” — that would drive on the Autobahn “on its own, hands-free driving there would be okay, but driver oversight would still be required.” Waymo recruited dozens of employees with long highway commutes to try it out, but under three rules:
- Pay attention at all times.
- Be alert. Hands-free driving, hands off the steering wheel, was fine, but they had to be alert.
- If someone didn’t follow the first two rules, Waymo would take the car away from them. “We would be watching them with a camera inside the car.”
The experiment, as Krafcik called it, failed. People were not paying attention to the road, and many became very distracted. In one clip, a driver was texting behind the wheel. In another, the driver had fallen asleep. A third showed a woman curling her eyelashes.
“You know what we did? We shut down the Autopilot project. We shut it down after just a few weeks because of what we had seen,” Krafcik said in the presentation. “Our Autopilot system was very advanced — more advanced than anything on the road today. It had radar, it had lidar, it had cameras, it had a massive onboard computer. It was so good, though, that our human drivers became too comfortable, too quickly.”
What we seem to know so far from Tesla data is that driving with Autopilot, at the level of features Tesla Autopilot now offers, is safer for the driver and those around the driver. Tesla drivers using Autopilot have a much lower likelihood of an accident, and there are plenty of cases of people saying Autopilot saved them from an accident they wouldn’t have been able to avoid otherwise. Yes, some people also abuse the features and don’t pay enough attention, and there have been drastic consequences in a few of those cases. But as the old saying goes, “don’t let the perfect be the enemy of the good.”
Since Tesla collects data on drivers and their habits, perhaps the car could be programmed to sense when a driver is in distress, whether from a stroke, falling asleep, or severe distraction. There is much potential in this regard. In some circumstances, the car could even autodial 911: not right away, but after attempting to get the driver’s attention and giving the driver 20 seconds or so to cancel the call, the car could attempt to save a life in another way. (Of course, this would come after automatically pulling the car over, something a Tesla can already do if a driver becomes unresponsive.)
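To make the idea concrete, here is a minimal sketch of what such an escalation sequence might look like. This is purely hypothetical illustration — it is not Tesla code, and the state names, thresholds, and 20-second cancel window are assumptions drawn only from the scenario described above.

```python
# Hypothetical escalation logic for an unresponsive driver.
# All states and time thresholds are illustrative assumptions, not Tesla's actual behavior.

def escalation_step(seconds_unresponsive: float, cancel_pressed: bool) -> str:
    """Return the action a car *might* take after a driver stops responding."""
    if cancel_pressed:
        return "resume_monitoring"   # driver cancelled the escalation; stand down
    if seconds_unresponsive < 10:
        return "visual_alert"        # flash a warning on the screen
    if seconds_unresponsive < 20:
        return "audible_alert"       # chime, vibrate the wheel
    if seconds_unresponsive < 40:
        return "pull_over"           # slow down and pull over safely
    # roughly a 20-second cancel window has now elapsed since pulling over
    return "call_911"
```

For example, `escalation_step(25, False)` would return `"pull_over"`, while `escalation_step(60, True)` returns `"resume_monitoring"` because the driver cancelled in time.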
There has been a push to use the car’s internal camera to watch whether a driver is keeping their eyes on the road or has fallen asleep. There is no indication Tesla is working on this, but it’s another safety step that could further add to Tesla’s safety record and save more lives.
Like Waymo, NASA has found that when a system is so good that almost no interaction is ever needed, humans basically cannot keep their attention focused on monitoring it. Tesla’s goal is to eventually be fully self-driving. The challenge is the middle phases, including this possibility of people becoming far too complacent. Perhaps Tesla (and/or Waymo and others) could add a gamification aspect that keeps drivers’ attention on the road in a fun rather than nagging way. People already play all kinds of driving video games. Is there not a way to reward drivers for paying attention to the road in a thoughtful, gamified way and penalize them for losing focus?
This is surely not the last of this discussion. It has been a topic of conjecture for years, but it is becoming a much more practical matter as Tesla improves its “Full Self-Driving” features and regulators grow nervous about this important middle stage.