It’s been a tough month for those who welcome the advent of self-driving vehicles. First, an Uber test car struck and killed a pedestrian in Tempe, Arizona. The subsequent investigation revealed details about the company’s autonomous driving system suggesting Uber may have elected to reduce the number of lidar sensors in order to save money. Now Tesla has issued an update regarding the horrific crash that killed a Model X driver in California last week, one that draws its semi-autonomous driving technology into question.
Autopilot Was Engaged
In a blog post dated March 30, Tesla said: “In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
According to Engadget, the victim’s brother told the press that the victim had complained several times about his car behaving erratically at this same point on the highway. The deceased was an engineer at Apple and drove the same route to work every day. He reportedly took his car to a Tesla service facility on more than one occasion to address the issue, but the staff there were unable to find anything wrong with the car.
Investigators from the National Transportation Safety Board (NTSB) will review the electronic data recovered from the car to determine with more precision what happened in the moments before the crash. Such investigations typically take 12 to 18 months to complete. Tesla’s blog post pointed out that a crash attenuator at that point in the highway had been destroyed in another accident 11 days earlier and had not yet been replaced at the time of this crash. It then went on to extol the virtues of its Autopilot system.
What follows is the second half of Tesla’s statement in response to the incident (subtitle added).
Tesla Defends Autopilot
Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.
In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.
Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.
No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.
In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
Are Self-Driving Systems Safe?
There’s no doubt more details about this tragedy will be made available as the investigation progresses. In the meantime, the debate will continue about whether Tesla should be allowed to equip its cars with self-driving systems that are far from perfect. Wired sums up the current situation this way: “Engineers are convinced that taking the easily-distracted human out of the driving equation will cut down on the 40,000 road deaths each year on American roads. But right now, the systems aren’t sophisticated enough to operate without human oversight, which is difficult to ensure. And that leaves everyone in a difficult middle ground — a no man’s land with no obvious or immediate route out.”