During Tesla’s 3rd quarter conference call last night, Elon Musk got a very important question regarding the widespread skepticism that Tesla will be able to roll out “feature complete full self driving” by the end of this year (only 68 days from now). The questioner asked him to define what “feature complete” means and to list the features that establish a baseline. I’m going to summarize, quote, and expand on his answer.
First, though, elsewhere in the call, he said feature-complete Full Self Driving will probably only make it to Early Access customers by the end of the year, not to regular customers. He described the problem of self driving as 3 separate use cases, each of which then has to go through 3 stages of development and approval. I’ll combine his answer on these use cases with considerable commentary from my own experience.
3 Use Cases For Full Self Driving
1. High-speed driving. This is highway driving. Anyone who has been using Autopilot and Navigate on Autopilot (when you put a destination in the GPS) has seen it go from pretty good to excellent in the last couple of years. It has gotten much smoother and handles the strange corner cases much of the time. Many people can see this getting good enough that it could be safe to not monitor the car pretty soon. It won’t be 100% safe, but Elon has been clear that it would be irresponsible, and would cost many lives, to withhold the software until it is perfect or close to perfect. He has said many times that once it is 200% safer than a human (I assume an average human, not the best driver in the world), he will push hard for regulators in various regions to approve it.
2. Low-speed driving. This means really low speed, like 5 miles per hour. This is what Smart Summon and the upcoming Smart Parking features are all about. The car is still monitored by the driver, but now from outside the car instead of inside the car. Elon claims Smart Summon will get much better in a few weeks, learning much from the corner case experiences of 1 million uses since it was released less than a month ago. Most people would say this isn’t as mature as the high-speed driving software, but they would also admit that if there was an error, it would probably just scrape a curb or dent a fender. It is unlikely an accident at this speed causes an injury or death. [Editor’s note: My overarching thought on Smart Summon is that its biggest weakness is that it’s far too cautious, but it still needs to be. If I try to use it in an area that is somewhat busy, it often stalls too much because of people walking by or curbs or other cars. People don’t want to sit in a parking lot for a few minutes as a self-driving Tesla skittishly moves forward. I’m hopeful Elon’s repeated comments about making it much smoother will solve the bulk of this problem, but I do wonder how it does so while continuing to be hyper cautious so as to not get on the front page of the New York Times. We’ll see. —Zach]
3. Medium-speed driving. This is driving around the city and stopping for stop lights, stop signs, and watching for all the urban chaos, like pedestrians and bikes coming from unexpected places. We have seen a few controlled videos showing this working, but nobody outside of Tesla has been allowed to test this yet. This is expected to be released to early access testers in the next 68 days! I assume this stage was left to last for two reasons.
- I think it is the toughest use case and they didn’t want to tackle it until they had the other 2 cases working pretty well. This way they can borrow software from those 2 cases to do much of this work. They need software to recognize lights and signs (which has reportedly been running silently in the background for a long time). They also need new logic to decide when to brake, accelerate, and take evasive maneuvers in a city environment.
- Some of this enhanced software and image recognition requires the new Full Self Driving computer that Tesla started putting in new cars this spring. Tesla also started upgrading existing cars (for people who paid between $2,000 and $6,000, soon to be $7,000, for the FSD hardware and software upgrade). Since Tesla didn’t have this computer in many cars until recently, it didn’t make much sense to release the features. I don’t think this reason is as important, but it might have been considered. If the software had been ready, Tesla could have sent it to the early access people 6 months ago, so I think the software just wasn’t ready.
So, those are the 3 use cases. But then each of those has to go through 3 levels of development.
Development Gates For Full Self Driving
I develop software every day, so when Elon described the 3 steps/stages/gates in the timeline, it reminded me of my day job.
1. The major features are all there. I would say the car can handle low-speed driving and high-speed driving today — they just need to release the medium-speed features (or city mode Navigate on Autopilot) and this stage is complete. Elon was clear the software doesn’t have to work on every road and it doesn’t have to handle every corner case. The driver is fully responsible at all times. Elon expects this to start rolling out by December 31, 2019. It will be close.
2. No driver monitoring genuinely needed, but still required legally. All features are safe enough that Tesla feels the driver doesn’t need to monitor the car, but at the same time there hasn’t been enough data collected to convince the regulators. The driver is fully responsible at all times as a result. Elon expects this in mid-2020. I expect it to be somewhat later.
3. Regulators approve. The software no longer requires the driver to monitor the car in the region the regulator approves. Liability for accidents shifts from the driver to the car. This is a watershed moment in history. Elon expects this in 2020. I expect it to be somewhat later. [Editor’s note: I think the most interesting thing here is that one specific region that wants to be first could — maybe — really be ready to jump in and approve FSD/robotaxis basically as soon as Tesla thinks it’s ready. I’m fascinated to see which regions are first and how long they take to approve it. I agree with Paul that it’s a watershed moment and 2020 seems too early, but it’s a watershed moment even if it’s in a small geographic area, because then real-world data from real robotaxis start to get collected and other bullish regions that are just slightly more cautious can watch what happens in region #1 for a relatively short time and then jump in. Could it be Dubai? Florida? Some area of China? We’ll see. —Zach]
Q3 Safety Update
In addition, Tesla gave us a taste of the 3rd quarter safety numbers. I expect Tesla to publish a full update here soon.
From Tesla’s Q3 2019 shareholder letter (Page 8):
“During Q3, we registered one accident for every 4.34 million miles driven in which drivers had Autopilot engaged. This compares to the national average of one accident for every 0.5 million miles based on NHTSA’s most recent US data.”
To put that in the context of the previous four quarterly Tesla safety reports, the miles-per-accident figure has ranged from 2.87 million miles in Q1 2019 (possibly depressed by winter weather in much of the US) to 3.34 million miles a year ago in Q3 2018. That is a 23% decrease in accidents per mile driven with Autopilot engaged in a single year!
This means if you could drive all of the miles the average person in the US drives per year (13,476 miles) entirely on Autopilot (you can’t), you would have a 1 in 322 chance of having an accident that year. Over an entire driving lifetime of 60 years (age 16 to 76), you would have roughly a 1 in 5 chance of being in an accident (which wouldn’t necessarily be serious).
Compare that to the accident statistics for all cars in the US, where there is an accident every 500,000 miles (not an apples-to-apples comparison, since that figure includes city driving, where you can’t always use Autopilot but where accidents are more likely). At that rate, people have an accident every 37 years on average, or about 2 accidents in a driving lifetime.
That makes driving a Tesla on Autopilot almost 9 times safer than driving an average car (with the caveat that it isn’t a fair comparison). A year ago, driving on Autopilot was almost 7 times safer than driving an average car (with the same caveat). So, Tesla is making slow but significant progress.
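For readers who want to check the arithmetic, all of the figures above can be reproduced in a few lines of Python. This is just a back-of-the-envelope sketch using only the numbers quoted in this article (4.34 million miles per Autopilot accident in Q3 2019, 3.34 million a year earlier, 0.5 million for the national average, and 13,476 average miles per year):

```python
# Back-of-the-envelope check of the safety figures quoted above.
AUTOPILOT_MI_PER_ACCIDENT = 4_340_000   # Tesla, Q3 2019
Q3_2018_MI_PER_ACCIDENT = 3_340_000     # Tesla, Q3 2018
NATIONAL_MI_PER_ACCIDENT = 500_000      # NHTSA national average
AVG_MILES_PER_YEAR = 13_476             # average annual miles per US driver
DRIVING_YEARS = 60                      # age 16 to 76

# Chance of an accident in one all-Autopilot year: miles per accident / miles driven
one_year_odds = AUTOPILOT_MI_PER_ACCIDENT / AVG_MILES_PER_YEAR
print(f"1 in {one_year_odds:.0f} chance of an accident per year")   # 1 in 322

# Over a 60-year driving lifetime: roughly a 1 in 5 chance
lifetime_miles = DRIVING_YEARS * AVG_MILES_PER_YEAR
lifetime_odds = AUTOPILOT_MI_PER_ACCIDENT / lifetime_miles
print(f"1 in {lifetime_odds:.1f} chance over a driving lifetime")

# National average: an accident every ~37 years, about 2 per lifetime
years_per_accident = NATIONAL_MI_PER_ACCIDENT / AVG_MILES_PER_YEAR
print(f"average car: accident every {years_per_accident:.0f} years, "
      f"{DRIVING_YEARS / years_per_accident:.1f} per lifetime")

# How much safer is Autopilot than the national average?
print(f"now:      {AUTOPILOT_MI_PER_ACCIDENT / NATIONAL_MI_PER_ACCIDENT:.1f}x safer")
print(f"year ago: {Q3_2018_MI_PER_ACCIDENT / NATIONAL_MI_PER_ACCIDENT:.1f}x safer")

# Year-over-year improvement in accidents per mile driven
improvement = 1 - Q3_2018_MI_PER_ACCIDENT / AUTOPILOT_MI_PER_ACCIDENT
print(f"accident rate down {improvement:.0%} year over year")       # 23%
```

The "almost 9 times" and "almost 7 times" figures come straight out of the last ratios (8.7x and 6.7x), and the 23% year-over-year drop falls out of comparing the two Tesla quarters directly.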
Six months after Elon’s interview with Lex Fridman, where he exuded confidence that Tesla will win the race to make self-driving cars, Elon is still confident he can do it and he has delivered the second piece of the puzzle (low-speed driving). On the other hand, he has had to admit that the feature-complete Full Self Driving update will probably only be available to early access customers and not the whole fleet this year.
My opinion is unchanged from 6 months ago. I think Tesla can do it and I think the company will do it before anyone else, but I think it will take longer than Elon has predicted. Maybe 6 months, more likely 1 to 2 years longer, for him to get it approved by regulators. So, I’ll speculate that I think a major region (like a US state) will approve Full Self Driving by the end of 2021 or 2022.