
CleanTechnica

Autonomous Vehicles

Should Self-Driving Car Testing Depend On Meeting Government Standards? Or Passing A Vision Test?

The need for government oversight (in my opinion) comes down to a need to prove just a few basic safety capabilities — system awareness of nearby pedestrians and the lack of blind-zones (a “vision test”); effectiveness at braking when the need is there; and effective lane-keeping as well as traffic sign/rule adherence.

With the recent pedestrian fatality in Arizona caused by one of Uber’s self-driving test vehicles, and the fatality in California that occurred shortly after a Tesla Model X driver activated “Autopilot,” both still in the news, it seems worth reflecting here on the potential need for government standards in the sector.

In this context, what I’m referring to when I say “government standards” is some mechanism for ensuring that minimum safety levels are maintained when self-driving vehicles are tested.

In the case of the self-driving Uber test vehicle, it seems likely that the pedestrian fatality in question could have been prevented if Uber’s system had been properly vetted; such vetting would probably have found the system unsafe owing to a large pedestrian blind-zone, as we reported previously.

With regard to the fatality in the Tesla Model X, it remains unclear exactly what happened, as by some reports the driver activated Autopilot (despite the name, simply an advanced and very effective cruise-control system) only moments before the accident.

At any rate, responsibility for what happens when using Tesla’s Autopilot system ultimately rests with the driver, although the argument can be made that such systems are inherently unsafe, given the unrealistically fast driver response times they demand in moments of danger. (A final note on that count: the accident would likely not have been fatal if the large steel safety device preceding the offramp wall hadn’t been missing at the time, having been removed following an accident a few weeks earlier.)

Back to the subject at hand, the need for government oversight (in my opinion) comes down to a need to prove just a few basic safety capabilities — system awareness of nearby pedestrians and the lack of blind-zones (a “vision test”); effectiveness at braking when the need is there; and effective lane-keeping as well as traffic sign/rule adherence.

Something else that is probably needed is clarity about driver and overseer responsibility in potential accidents: e.g., smartphone-caused fatalities should be treated the same whether they occur in a human-driven vehicle or in a self-driving test vehicle watched by a “safety driver.”

Another likely need is a better way of dealing with the tendency of safety drivers to quickly lose awareness of the road while overseeing tests of self-driving systems.

On this subject, Reuters recently published an interesting article featuring quotes from various industry “experts” that seems worth highlighting here.

Here are some excerpts:

“A consumer group, Advocates for Highway and Auto Safety, says a bill on self-driving cars now stalled in the US Senate is an opportunity to improve safety, quite different from the bill’s original intent to quickly allow testing of self-driving cars without human controls on public roads. The group has proposed amending the bill, the AV START Act, to set standards for those vehicles, for instance, requiring a ‘vision test’ for automated vehicles to test what their different sensors actually see.”

“The group believes the bill should also cover semi-automated systems like Tesla’s Autopilot — a lower level of technology than what is included in the current proposed legislation. Other groups have also put forth proposals on self-driving cars, including requiring the vehicles and even semi-automated systems to meet performance targets, greater transparency, and data from makers and operators of the vehicles, increased regulatory oversight, and better monitoring of and engagement with human drivers.”

“Others want to focus on the human driver…The Massachusetts Institute of Technology is doing tests using semi-automated vehicles including models from Tesla, Volvo, Jaguar Land Rover, and General Motors Co. The aim is to see how drivers use semi-autonomous technology — some watch the road with their hands above the wheel, others do not — and which warnings get their attention.”

This seems to support the claims of the many industry figures now arguing that dangers accompany the use of self-driving systems below full autonomy, the idea being that many drivers aren’t willing to use such systems in a safe manner. Obviously, though, it should be noted that some drivers/users are willing to use systems such as Tesla’s Autopilot responsibly. But what percentage of total Autopilot users do those people represent?

This may all end up a moot point before too long, though, if Tesla’s system becomes safe enough to negate the need for driver oversight. In such a case, presumably, Tesla itself would then have to take on some legal responsibility. That being the case, perhaps the current legalese fine print placing all responsibility for Autopilot accidents on the driver will remain in place indefinitely?

 
Written By

James Ayre's background is predominantly in geopolitics and history, but he has an obsessive interest in pretty much everything. After an early life spent in the Imperial Free City of Dortmund, James followed the river Ruhr to Cofbuokheim, where he attended the University of Astnide. And where he also briefly considered entering the coal mining business. He currently writes for a living, on a broad variety of subjects, ranging from science, to politics, to military history, to renewable energy.


Copyright © 2023 CleanTechnica. The content produced by this site is for entertainment purposes only. Opinions and comments published on this site may not be sanctioned by and do not necessarily represent the views of CleanTechnica, its owners, sponsors, affiliates, or subsidiaries.
