The “False Positive” Conundrum — Are Semi-Autonomous Cars Safe If Humans Don’t Listen?


Yet another Tesla has crashed into the back of an emergency vehicle that was stopped on a California highway. At 11:00 am on May 29, police in Laguna Beach, California, tweeted, “This morning a Tesla sedan driving outbound Laguna Canyon Road in ‘autopilot’ collides with a parked @LagunaBeachPD unit.” The police officer at the scene was away from the vehicle at the time of the crash. The driver of the Tesla suffered minor injuries, according to a report in The Mercury News.

Tesla Autopilot crash. Credit: Laguna Beach PD

Tesla has reiterated its claim that people using Autopilot must remain alert and in full control of their vehicles. What else can be expected? “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times,” a spokesperson for the California company said. “Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and before a driver can use Autopilot, they must accept a dialogue box which states that ‘Autopilot is designed for use on highways that have a center divider and clear lane markings.’”

Take a look at the photo above. Do you see a center divider in that picture? Perhaps if police started handing out moving violations that carry significant fines for using Autopilot improperly, drivers would start to get the message. If insurance companies declined to cover repairs for cars damaged by owner negligence, that would send a strong message to the Tesla community as well. Of course, the risk of death should also be a strong deterrent.

The False Positive Conundrum

The issue isn’t that software engineers are unable to program a self-driving car to avoid stopped vehicles. They can. The issue is that if they do, there is a substantial risk the car will slam on the brakes every time its computer “sees” an overhead highway sign or a fixed object along the side of the road. To avoid those “false positives,” the system is programmed to ignore stationary objects, including stopped vehicles.
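To make that tradeoff concrete, here is a minimal sketch of the kind of filtering logic described above. It is purely illustrative, with hypothetical names (RadarReturn, should_brake, the 0.5 m/s threshold); it is not Tesla’s or any other manufacturer’s actual code.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float          # range to the detected object
    relative_speed_mps: float  # object speed relative to our car

def should_brake(detection: RadarReturn, ego_speed_mps: float,
                 stationary_threshold_mps: float = 0.5) -> bool:
    """Illustrative false-positive filter, not production code.

    An object whose absolute speed is near zero (a bridge, an overhead
    sign, a parked car) is treated as roadside clutter and ignored,
    which is exactly why a stopped emergency vehicle can be missed.
    """
    object_speed = ego_speed_mps + detection.relative_speed_mps
    if abs(object_speed) < stationary_threshold_mps:
        return False  # filtered out as a likely stationary false positive
    return detection.distance_m < 50  # otherwise brake when the object is close

# A stopped vehicle 40 m ahead while we travel at 25 m/s gets filtered out:
print(should_brake(RadarReturn(distance_m=40, relative_speed_mps=-25.0), 25.0))  # False
```

The sketch shows the dilemma in miniature: raise the threshold and the car brakes for every overpass; keep it and the car sails into a stopped fire truck.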

As Wired pointed out last January, other companies’ semi-autonomous driving systems suffer from the same weakness. For instance, in the owner’s manual for cars equipped with its Pilot Assist system, the Swedish company Volvo warns that if a car ahead swerves to avoid a stopped vehicle, “Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.” Because the system is designed to maintain a preset speed and is also programmed to ignore stationary objects, the car will actually try to speed up even as it hurtles headlong toward certain disaster.
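A similarly stripped-down sketch shows why the car accelerates rather than brakes in that scenario. Again, the function and variable names here are hypothetical assumptions for illustration, not Volvo’s actual implementation.

```python
from typing import Optional

def target_speed(set_speed_mps: float,
                 lead_vehicle_speed_mps: Optional[float]) -> float:
    """Illustrative adaptive-cruise logic, not any vendor's actual code.

    While a moving lead vehicle is tracked, the car matches its speed.
    When that vehicle swerves away and only a stationary object remains,
    the stationary object is filtered out, the lead target becomes None,
    and the car accelerates back to the stored set speed.
    """
    if lead_vehicle_speed_mps is None:
        return set_speed_mps  # no moving target tracked: resume set speed
    return min(set_speed_mps, lead_vehicle_speed_mps)  # follow the slower lead car

# Following a lead car at 20 m/s with cruise set to 31 m/s:
print(target_speed(set_speed_mps=31.0, lead_vehicle_speed_mps=20.0))  # 20.0
# The lead car swerves, the stopped truck is filtered out, and the car speeds up:
print(target_speed(set_speed_mps=31.0, lead_vehicle_speed_mps=None))  # 31.0
```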

Is Lidar The Answer?

The Guardian reports that many experts believe lidar sensors are critical to building autonomous driving systems that avoid the false positive problem. But lidar units are expensive, and because today’s sensors have a limited field of view, several are needed on each vehicle to see not only directly ahead but also above and below a driver’s typical line of sight.

Cost is a significant issue. It appears that Uber reduced the number of lidar units on its self-driving vehicles to save money, and the smaller sensor array may have been a contributing factor in the death of a pedestrian struck by an Uber test car in Tempe, Arizona, earlier this year. The head of Mobileye recently told Reuters that a full Level 5 self-driving system will probably cost about $12,000 per vehicle. On the flip side, during Tesla’s Q4 earnings call last February, Elon Musk called lidar “expensive, ugly, and unnecessary.”

Consumer groups are calling for Tesla to stop using the name Autopilot, which they say implies the system can do more than it is actually capable of. Perhaps the proper response is to focus instead on clueless humans who insist on doing things after being told specifically not to do them. With more than 40,000 highway fatalities a year in the United States, there is little reason to believe system error is always to blame when Autopilot or another semi-autonomous technology is involved in a collision.






Steve Hanley

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. He is proud to be "woke" and doesn't really give a damn why the glass broke. He believes passionately in what Socrates said nearly 2,500 years ago: "The secret to change is to focus all of your energy not on fighting the old but on building the new." You can follow him on Substack and LinkedIn but not on Fakebook or any social media platforms controlled by narcissistic yahoos.
