Say what you want about Tesla’s Autopilot, whether it’s a miracle that makes driving safer or a scam that gives drivers a false sense of security: Teslas driving with Autopilot engaged have a distressing tendency to slam head-on into emergency vehicles parked on the side of America’s highways. That has happened 12 times so far, and the National Highway Traffic Safety Administration (NHTSA) wants to know why.
NHTSA has told Tesla to turn over reams of data about how Autopilot operates by October 22. According to Car and Driver, the agency wants a list of every Tesla outfitted with the company’s self-driving technology, including which hardware and software versions are installed in each car, as well as information on every crash the company is aware of involving a vehicle equipped with Autopilot.
NHTSA has also asked for precise details of Autopilot’s operating limits, including the maximum steering angle and maximum rates of acceleration and braking. The request for information includes details about how the Autopilot system interacts with the driver, a list of situations that would cause the system to disengage, and details of how and when driver inputs can override the Autopilot functions.
The NHTSA investigation focuses on 12 crashes involving Teslas and stopped emergency vehicles. When the investigation began a few weeks ago, only 11 crashes were under review, but last Saturday a Model 3 collided with a highway patrol car in Orlando, Florida. The owner claims Autopilot was engaged at the time of the crash.
If NHTSA determines in its investigation that Tesla’s Autopilot system is unsafe, it could compel the company to recall cars or repair them to correct any safety defects. NHTSA has estimated that any such fix could impact up to 765,000 Teslas built between 2014 and 2021.
Looks promising that Beta 10.1, about 2 weeks later, will be good enough for public opt in request button
— Elon Musk (@elonmusk) September 2, 2021
Despite the onset of the investigation, Tesla says it is preparing to expand beta testing of a new version of its Full Self Driving software. Elon Musk has even hinted that the FSD software could be made widely available via an opt-in button within the next few weeks.
Tesla is far from the only company offering advanced driver assistance technology, of course, and none of them promises more than Level 2 autonomy, at least not yet. There is every reason to believe that whatever conclusions NHTSA comes up with will have an impact on those other systems as well. The difference is that Musk and Tesla are more aggressive than their competitors when it comes to touting the capabilities of their system.
Will the government force Tesla to pull in its horns a bit (or a lot) when it comes to the capabilities of its Autopilot technology? If you know anything about Elon Musk, the answer is likely to be, “No way.”