
Yet another Tesla has crashed into the back of an emergency vehicle that was stopped on a California highway. At 11:00 am on May 29, police in Laguna Beach, California, tweeted, “This morning a Tesla sedan driving outbound Laguna Canyon Road in ‘autopilot’ collides with a parked @LagunaBeachPD unit.” The police officer at the scene was away from the vehicle at the time of the crash. The driver of the Tesla suffered minor injuries, according to a report in The Mercury News.
Tesla has reiterated its claim that people using Autopilot must remain alert and in full control of their vehicles. What else can be expected? “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel and maintain control of the vehicle at all times,” a spokesperson for the California company said. “Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and before a driver can use Autopilot, they must accept a dialogue box which states that ‘Autopilot is designed for use on highways that have a center divider and clear lane markings.’”
Take a look at the photo above. Do you see any center lane divider in that picture? Perhaps if police started handing out moving violations that carry significant fines for using Autopilot improperly, drivers would start to get the message. If insurance companies decided not to cover repairs for cars damaged by owner negligence, that would send a strong message to the Tesla community as well. Of course, risk of death should also be a strong deterrent.
The False Positive Conundrum
The issue isn’t that software engineers are unable to program a self-driving car to avoid stopped vehicles. They can. The issue is that if they do so, there is a substantial risk that the cars will go into emergency braking mode every time their computers “see” an overhead highway sign or certain objects along the side of the road. To avoid such “false positives,” the computer is programmed to ignore stationary objects, including vehicles.
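To make that trade-off concrete, here is a purely illustrative Python sketch. It is not Tesla’s actual code; the RadarReturn class, the is_relevant_target function, and the 1 m/s threshold are invented for this example. The point is simply that an overhead sign and a stopped car can look alike to a radar-based system, so a filter that suppresses one suppresses the other.

```python
# Purely illustrative sketch -- not Tesla's actual code. The class, function,
# and threshold below are invented to show the trade-off described above:
# an overhead sign or roadside structure produces much the same radar
# signature as a stopped car, so filtering one out filters out the other.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the detected object, in meters
    closing_speed_mps: float  # rate at which the gap is shrinking, in m/s

def is_relevant_target(ret: RadarReturn, own_speed_mps: float) -> bool:
    """Decide whether a radar return should feed the braking logic."""
    # An object that is not moving closes the gap at roughly the car's own
    # speed, so its estimated ground speed works out to about zero.
    ground_speed = own_speed_mps - ret.closing_speed_mps
    if abs(ground_speed) < 1.0:  # hypothetical "stationary" threshold
        # Ignored to avoid phantom braking under signs and near guardrails --
        # but a stopped fire truck in the lane is ignored for the same reason.
        return False
    return True

# A stopped vehicle 60 m ahead while traveling at 25 m/s (~56 mph):
print(is_relevant_target(RadarReturn(60.0, 25.0), own_speed_mps=25.0))  # False
```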
As Wired pointed out last January, other companies with semi-autonomous driving systems suffer from the same weakness. For instance, in the Volvo owner’s manual for cars equipped with its Pilot Assist system, the Swedish company warns that if a car ahead swerves to avoid a stopped vehicle, “Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed. The driver must then intervene and apply the brakes.” Because the system is designed to maintain a preset speed and is also programmed to ignore stationary objects, the car will attempt to speed up even as it hurtles headlong toward certain disaster.
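Here is a rough Python sketch of the resume-to-set-speed behavior the Volvo manual describes. It is not Volvo’s code; the desired_speed_mps function, the speed values, and the assumption that the stopped vehicle has already been filtered out of the target list (as in the sketch above) are all illustrative.

```python
# Hedged illustration of the behavior described in the manual excerpt above --
# not Volvo's code. When the moving lead car swerves away and only a stationary
# object remains, the target filter reports "no relevant target" and the
# controller accelerates back toward the driver's stored set speed.
from typing import Optional

def desired_speed_mps(set_speed_mps: float,
                      lead_speed_mps: Optional[float]) -> float:
    """Speed the cruise controller aims for, given the current lead target."""
    if lead_speed_mps is None:
        # No relevant (moving) target reported: resume the stored set speed,
        # even if a stopped vehicle is physically sitting in the lane.
        return set_speed_mps
    # Otherwise follow the slower of the lead car and the stored set speed.
    return min(set_speed_mps, lead_speed_mps)

# Following a car doing 20 m/s with a 27 m/s (~60 mph) stored speed:
print(desired_speed_mps(27.0, 20.0))  # 20.0 -- matches the lead car
# The lead car swerves away; the stopped vehicle ahead was filtered out:
print(desired_speed_mps(27.0, None))  # 27.0 -- accelerates toward set speed
```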
Is Lidar The Answer?
The Guardian reports many experts believe lidar sensors are critical to creating advanced autonomous driving systems that avoid the false positive problem. But they are expensive and, because today’s sensors have a limited field of view, several will be needed on every vehicle, not only to see directly ahead but also above and below a driver’s typical line of sight.
Cost is a significant issue. It appears that Uber reduced the number of lidar units in its self-driving vehicles to save money. The limited number of sensors may have been a contributing factor leading to the death of a pedestrian who was struck and killed by an Uber test car in Tempe, Arizona, earlier this year. The head of Mobileye told Reuters recently that full Level 5 self-driving systems will probably cost about $12,000 per vehicle. On the flip side, during the Q4 Tesla earnings call last February, Elon Musk called lidar “expensive, ugly, and unnecessary.”
As consumer groups call for Tesla to stop using the name Autopilot, which they say implies the system can do more than it is actually capable of, perhaps the proper response is to start focusing on clueless humans who insist on doing things after being told specifically not to do them. With more than 40,000 highway fatalities a year in the United States today, there is little reason to believe system error is always the cause of highway collisions where Autopilot or other semi-autonomous technologies are involved.