Image credit: IIHS, https://www.iihs.org/news/detail/64-vehicles-earn-2020-iihs-awards-thanks-to-state-of-the-art-safety

Autonomous Vehicles

IIHS Urges Better Monitoring Of Drivers Using Autopilot & Super Cruise

IIHS has sent a letter to NHTSA urging new regulations for Level 2 partially automated driving systems.

The Insurance Institute for Highway Safety has sent a letter to the National Highway Traffic Safety Administration urging it to adopt regulations governing the operation of Level 2 partially automated driving technologies such as Tesla’s Autopilot and Cadillac’s Super Cruise. What concerns the IIHS is that both systems do a poor job of monitoring driver engagement while activated. Autopilot relies on detecting torque on the steering wheel, while Super Cruise monitors the movement of the driver’s eyes.

IIHS president David Harkey tells CNN Business the recommendations contained in the letter to NHTSA were made because currently there are no regulations that apply to such partial self-driving systems. “We thought it was time to put these recommendations down on paper,” Harkey told CNN Business. “Let’s make sure these systems are designed in a way that maximizes safety and do not create unintended consequences.”


The list of recommendations for tracking a driver’s engagement with the vehicle includes monitoring eye movements, blinking, head tilt, steering wheel input, and the speed of responses to alerts. At present, NHTSA says it recommends that developers of systems like Autopilot and Super Cruise “incorporate appropriate driver-vehicle interaction strategies,” but it provides no details about what strategies it considers appropriate.

IIHS also advocates for eliminating automatic lane changes and for restricting the use of such systems to limited-access highways. In fact, Super Cruise can only be activated on certain highways identified by GPS data. While Tesla strenuously recommends that Autopilot be activated only on roads where its use is appropriate, in practice it can be activated in a wide variety of driving situations, some of them highly inappropriate. Tesla takes the position that it is not its job to nanny its customers, while IIHS seems to feel a little more nannying would be welcome.

IIHS cites a 2018 study in which drivers spent 30 minutes behind the wheel of cars equipped with Level 2 systems while following a lead car. Without warning, the lead car moved aside to avoid a simulated object in the road, which could have been either a car or a plastic bag. Twenty-eight percent of the drivers crashed into the object.

The researchers concluded, “The results uncover the important role of expectation mismatches, showing that a key component of driver engagement is cognitive (understanding the need for action), rather than purely visual (looking at the threat), or having hands on wheel. Automation needs to be designed either so that it does not rely on the driver or so that the driver unmistakably understands that it is an assistance system that needs an active driver to lead and share control.”

Government regulations are not very popular these days, but a Wild, Wild West approach seems to be developing among car companies eager to offer the next new thing to their customers. Clearly marketing considerations should not take precedence over public safety. Would those who oppose regulations be comfortable flying in a world with no rules governing pilot training or the number of hours a jet engine can operate before it is overhauled?

NHTSA may or may not promulgate rules for self-driving systems like Autopilot and Super Cruise, but it seems clear that giving humans — as rambunctious and undisciplined as they are — access to self-driving systems that are only partly effective leaves the door wide open for misuse, whether deliberate or unintentional. If you don’t believe that, try binge watching America’s Funniest Home Videos for a weekend.

Then ask yourself this question: How much do you trust the driver coming toward you in the opposite lane with a self-driving system engaged to be paying close attention and ready to take over control of the car in an emergency? If your answer is less than “100% absolutely, all the time,” you may find the idea of some basic regulations reassuring.

Written By

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. Nearly 2,500 years ago, Socrates is said to have observed, "The secret to change is to focus all of your energy not on fighting the old but on building the new." Perhaps it's time we listened?

Copyright © 2023 CleanTechnica. The content produced by this site is for entertainment purposes only. Opinions and comments published on this site may not be sanctioned by and do not necessarily represent the views of CleanTechnica, its owners, sponsors, affiliates, or subsidiaries.