
A woman was killed crossing a street in Tempe, Arizona, at 10 o’clock last Sunday night. The car that hit her was a specially modified Volvo XC90 owned by Uber, which was conducting a real-world test of its autonomous driving technology at the time. Two days after the accident, the Tempe police released exterior and interior dashcam footage taken just prior to the collision. Based on that footage, the chief of police in Tempe offered the opinion that no human or artificial driver could have avoided the woman in the road and that Uber had done nothing wrong.
We have published several stories based on this tragedy. In the comments, opinion is evenly divided between those who agree with the chief of police and those who don’t. Whatever your opinion, there is little doubt this event will generate intense scrutiny of self-driving systems locally, nationally, and around the world. Since this is the first known pedestrian fatality involving a self-driving car, how could it not?
Experts are people who get paid to have an opinion. As the world waits to see if a lawsuit will be filed against Uber by the heirs of the deceased, the experts are lining up for their moment in the sun. “This is exactly the type of situation that Lidar and radar are supposed to pick up,” said David King, an Arizona State University professor and transportation planning expert. “This is a catastrophic failure that happened with Uber’s technology,” according to a report in The Guardian.
Bryant Walker Smith, a University of South Carolina law school professor and autonomous vehicle expert, has looked at the video and decided it “strongly suggests a failure by Uber’s automated driving system and a lack of due care by Uber’s driver.” In an e-mail, he pointed out that about 2 seconds elapsed between the time the pedestrian first appeared in front of the car and the moment of impact. “This is similar to the average reaction time for a driver. That means an alert driver may have at least attempted to swerve or brake.”
“Alert” is the operative word in that statement. We like to think we are alert behind the wheel every second of every journey, scanning the road ahead for potential hazards, but that just isn’t how the world works. Computers are supposed to be hyper-vigilant, protecting us from our own human failings. So why didn’t the Uber system do precisely that?
“I really don’t understand why Lidar didn’t pick this up,” said Ryan Calo, a University of Washington law professor and self-driving expert. “This video does not absolve Uber.” John M. Simpson, privacy and technology project director with Consumer Watchdog, told The Guardian the video shows a “complete failure” of Uber’s technology and its safety protocols. He recommends that all testing programs on public roads be suspended while the case is under investigation.
“Uber appears to be a company that has been rushing and taking shortcuts to get these things on the road,” he said. “It’s inexcusable.” Simpson also criticizes the state of Arizona, claiming its political leaders lured the company with promises of lighter regulation after Uber got into a tussle with California over its autonomous car testing protocols.
Apparently The Guardian was unable to locate any self-driving technology experts who weren’t willing to fault Uber and its autonomous driving system. They did, however, make it easier for any attorney who takes the case to line up witnesses for the plaintiff — for a fee, of course. Notice that all of these “experts” have formed their opinions based on viewing the dashcam footage alone. None of them has had an opportunity to examine the actual data that was fed to the car’s self-driving computer. Being an expert means never having to say you’re sorry.
This incident will likely prompt changes in how the Uber autonomous driving system functions. When Joshua Brown died behind the wheel of a Tesla Model S operating in Autopilot mode on a Florida highway in 2016, it sent shock waves through the autonomous driving community worldwide. It caused a messy divorce between Tesla and Mobileye and a complete rethink of how the Tesla system functions. The camera, supplied by Mobileye, was demoted to a secondary role, and front-facing radar was elevated to be the primary source of input for the system.
Interestingly, Waymo has placed its self-driving Chrysler Pacifica vans into commercial service in and around Phoenix, Arizona, with no “safety driver” on board. No incidents arising from that service have been reported. Is Waymo that far ahead of Uber? Possibly. The two companies recently fought a titanic court battle over Lidar technology.
But why should members of the public be involved in a competition between corporations to see whose technology is better? Are we just lab rats in a game of corporate one-upmanship? Should there be local, state, or national standards that must be met before companies are allowed to put autonomous cars on the road to see whether or not they work? These are just some of the questions swirling around as the ramifications of the fatal crash in Tempe begin to emerge.