Autonomous Systems Can Make Driving More Stressful


Much has been written about the relationship between human drivers and autonomous driving technology. As more autonomous cars take to the nation’s streets and highways, though, much more will be explored and learned.


Most people would be shocked to realize that even if self-driving cars eliminate half of all highway fatalities, the other half will still happen. That means there are going to be a lot more news stories about autonomous cars killing people, as happened in Tempe, Arizona, in March. And that’s precisely the problem. As human beings, we have a built-in bias that assumes once a computer works correctly, it will work correctly every time.

Of course, we all know about the word ass-u-me by now, don’t we? Nevertheless, idiots still get behind the wheel of semi-autonomous cars, set the car to drive itself, and then slide over into the passenger seat to take a snooze, hit up their friends on Facebook, or clip their toenails.

Tesla has been of two minds about this issue. On one hand, it exhorts drivers to pay close attention to the road ahead at all times and be prepared to take over control of the vehicle if circumstances require. It even has a graphic known as the “red hands of death” that appears in front of the driver if the system detects a lack of attention. On the other hand, it insists on calling its suite of driver assistance features Autopilot and promising a car that will soon drive itself from LA to Manhattan without human assistance. (That trip was supposed to happen by the end of 2017.)

The interval between the moment the computer says, “Whoa, I’m in way over my head here,” and the moment the driver resumes control is known as the “hand-off period.” It can last several seconds. New research for the Human Factors and Ergonomics Society by Eric Greenlee, Patricia DeLucia, and David Newton examined this hand-off period and found that the longer a car operates in self-driving mode, the longer it takes human drivers to react and reassert control when necessary.

Greenlee, an assistant professor of human factors, says, “State-of-the-art vehicle automation systems are designed to safely maintain lane position, speed, and headway without the need for manual driving. However, there are some situations in which the automation system may fail without warning. To compensate for this, drivers are expected to remain vigilant, continuously monitor the roadway, and retake control of their vehicle should the need arise, but past research has shown that a person’s ability to remain vigilant declines as a function of time.”

In the study, which is published in Human Factors: The Journal of the Human Factors and Ergonomics Society, drivers detected 30% fewer hazards at the end of a 40-minute drive than at the beginning. They also tended to react more slowly to hazards as the drive progressed. Additionally, participants reported in a post-task questionnaire that monitoring for automation failures was difficult and stressful.

“Our results demonstrate that there are high costs associated with the need for sustained supervisory duty in automated vehicles,” Greenlee says. “And the expectation that a human driver will provide reliable, attentive oversight during vehicle automation is untenable. Monitoring for automation failures can be quite demanding and stressful, suggesting that vehicle automation does not ensure an easy or carefree driving experience. As a result, vigilance should be a focal safety concern in the development of vehicle automation.”

Look at it this way. Elon Musk says one day self-driving cars will be as common as automatic elevators. So let’s assume you are on your way to a meeting on the 88th floor of the Petronas Towers in Kuala Lumpur. You get into an elevator in the lobby expecting to be whisked skyward, but receive a warning before the doors close that you must pay careful attention during the ascent and be prepared to take over manual control of the elevator at any time, lest it plummet into the sub-basement, killing you and any unfortunate souls who happen to be in the cab with you. Are you feeling stressed yet? Oh, you betcha.

The point is, machines can be improved and there is little doubt that one day they will drive better than humans can. They don’t get tired, they don’t stop on the way home for a martini or two, they don’t turn to ogle people in the next lane, and they don’t need to use the restroom along the way.

The problem is that people are not going to improve. Their attention will still wander, they will continue to fall asleep behind the wheel, and they will always need a few seconds (or more) to realize the computer has passed the driving duties back to them. The interim period between now and the Level 5 autonomous driving future is fraught with danger. That is precisely where regulators need to focus their attention to minimize the damage that will inevitably occur during the transition period.

To paraphrase William Shakespeare, “The fault, dear Brutus, is not in our autonomous driving technology, but in ourselves.”



Steve Hanley

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. He is proud to be "woke" and doesn't really give a damn why the glass broke. He believes passionately in what Socrates said 3000 years ago: "The secret to change is to focus all of your energy not on fighting the old but on building the new." You can follow him on Substack and LinkedIn but not on Fakebook or any social media platforms controlled by narcissistic yahoos.
