The National Highway Traffic Safety Administration needs some extra help with its investigation into Tesla’s Autopilot system. Reuters reports that the agency sent letters to General Motors (GM), Toyota, Volkswagen, and Ford with questions as it “conducts a comparative analysis with other production vehicles equipped with the ability to control both steering and braking/accelerating simultaneously under some circumstances.”
The agency asked a total of 12 automakers to list any crashes in which an advanced driver assistance system was engaged “anytime during the period beginning 30 seconds immediately prior to the commencement of the crash.”
The letters also request details on how the competitors’ driver assistance systems ensure that drivers are paying attention and how they detect whether drivers are actively engaged. The NHTSA gave the automakers varying deadlines to respond: some have until November 3, others until November 17.
The agency also wants the automakers to explain in detail “strategies for detecting and responding to the presence of first responder/law enforcement vehicles.”
The Reuters headline “U.S. asks 12 automakers for assistance in Tesla probe” sounds silly at first, as it leads one to assume the agency is bringing in other automakers to help investigate Tesla.
But if the people and agency investigating Tesla are 100% fair (unbiased toward any of the automakers and without an unnatural hatred for Elon Musk), then I think we will actually have a chance to see a detailed and useful study here. Each of the automakers will presumably provide data on how their drivers interact with their ADAS and how many accidents actually happen with an ADAS activated.
As for the crash data, the agency will be able to provide a fair assessment of the actual causes of accidents involving an engaged ADAS. This should determine whether the automakers (all of them, not just Tesla) created flawed software that causes fatal accidents, or whether they could do more to prevent rare but still real and horrible accidents. I still lean toward the problem being a small number of drivers not using the software responsibly.
In the case of Tesla, if the software were truly flawed in a generally dangerous way, we would have far more fatal accidents and Tesla crashes would constantly be the center of attention. There would be widespread reporting on accident after accident. Instead, we have more incidents in which Autopilot has saved lives than killed people, a rare accident that gets blown out of proportion, and accidents in which Autopilot wasn’t involved being misreported as accidents in which it was.
Another good outcome would be if the NHTSA does find flaws and then helps each automaker create solutions that further reduce accidents and deaths. All 13 automakers involved would benefit from that.