Published on November 9th, 2017 | by Steve Hanley
Autonomous Cars Don’t Have To Be Perfect To Save Lives
The battle to be firstest with the bestest self-driving car technology is heating up. Every major car manufacturer is developing systems for autonomous cars, but so are lots of tech companies — from Waymo to Uber and Apple. Tesla is in the middle — a tech company that also makes cars. Its Autopilot system may be the gold standard at the moment, but there are lots of other competitors breathing down its neck.
Autonomous cars have a lot of hurdles to clear before a person can climb into one, set a course for the coast, and go to sleep. Highway driving is pretty straightforward, but once surface streets become involved, there are stop signs and traffic lights to consider. Bicyclists and pedestrians seem to come out of nowhere. Then there are the jerks who drive the wrong way on a one-way street or blow through a red light while texting. Devising systems that can handle all those challenges safely is a daunting task.
But self-driving systems don’t have to be perfect to save tens of thousands of lives, says the RAND Corporation in a new report entitled The Enemy of Good: Estimating the Cost of Waiting for Nearly Perfect. Autonomous vehicle systems that are only slightly better at driving than a human could prevent hundreds of thousands of fatalities over the next 30 years. There is no reason to wait until autonomous cars are 75% or 90% better, the report suggests.
“Our work suggests that it is sensible to allow autonomous vehicles on America’s roads when they are judged to be just moderately safer than having a person behind the wheel,” said Nidhi Kalra, co-author of the study. “If we wait until these vehicles are nearly perfect, our research suggests the cost will be many thousands of needless vehicle crash deaths caused by human mistakes. It’s the very definition of perfect being the enemy of good.”
That’s the theory. The reality is that less than perfect autonomous cars will still get into accidents, some of them lethal, and society will react. RAND estimates that when all the data is in and tabulated, highway fatalities in America for 2016 will top 40,000. Society seems to be willing to accept that level of carnage on the streets when humans are involved because, hey, we’re only human, right? But will people be equally willing to accept road deaths when a computer is involved in the driving process?
“This may not be acceptable because society may be less tolerant of mistakes made by machines than of mistakes made by people,” says David Groves, a co-author of the study and co-director of RAND’s Water and Climate Resilience Center. “But if we can accept that early self-driving cars will make some mistakes — but fewer than human drivers — developers can use early deployment to more rapidly improve self-driving technology, even as their vehicles save lives.”
The problem, as RAND sees it, is that it will take decades to accumulate the data necessary to verify that self-driving systems are 100% effective or nearly so. In the meantime, large numbers of people will die in vehicle collisions. Many of those fatalities could be avoided if self-driving systems were approved for public use sooner rather than later. It’s the “greater good” conundrum. Is it better to let one driver die so that two others might be saved?
Many states now permit companies to test their autonomous cars on public streets, but a human safety driver is always behind the wheel, just in case. Michigan passed a law last year permitting the sale of self-driving cars within its borders, even though no Level 5 fully autonomous cars are available yet.
Chances are, the reaction when cars with some self-driving ability start crashing into each other will mirror what happens when an electric car catches fire today. Conventional cars go up in flames every day, yet the news rarely reports on those events. But every battery fire garners worldwide attention from the press, especially if the car is a Tesla.
There will be bumps along the road to fully autonomous cars. It can be argued that it is better for society to save some lives even if it is not possible to save them all. Would the US be better off if highway fatalities were cut to “only” 20,000 a year instead of 40,000? No doubt the trial lawyers representing the next of kin of those who die in crashes where computers are involved in the driving will have something to say on that subject.
Source: Science Daily