Tesla’s Trouble With Semi Trucks & Another Shakeup Of The Autopilot Team — Is There A Connection?


Editor’s note: Context is always worth keeping in mind. The following story is interesting and important, but in the grand scheme of traffic safety, one should consider not only the two deaths mentioned here, but also how many lives Tesla Autopilot has saved — which is likely a much higher number than two.

On March 1, Jeremy Banner was killed on a Florida highway when his Tesla Model 3 slammed into a tractor trailer that was crossing the road. The National Transportation Safety Board sent a team of investigators to find out more about the crash. Last week, it released this preliminary report.

[Image: Tesla Model 3 crash scene. Credit: NTSB]

“Preliminary data from the vehicle show that the Tesla’s Autopilot system — an advanced driver assistance system (ADAS) that provides both longitudinal and lateral control over vehicle motion — was active at the time of the crash. The driver engaged the Autopilot about 10 seconds before the collision. From less than 8 seconds before the crash to the time of impact, the vehicle did not detect the driver’s hands on the steering wheel. Neither the preliminary data nor the videos indicate that the driver or the ADAS executed evasive maneuvers.”

The circumstances surrounding this fatality, and the crash that killed Joshua Brown when his Tesla Model S also collided with a tractor trailer crossing a Florida highway three years ago, are troubling. Why does Tesla’s vaunted Autopilot system have such difficulty recognizing an 18-wheeler crossing the road in front of it?

The Verge reports that a spokesperson for Tesla phrased what happened slightly differently. Instead of saying “the vehicle did not detect the driver’s hands on the steering wheel,” that person said, “the driver immediately removed his hands from the wheel.” Tesla did not respond to a request from The Verge to explain what action it plans to take to address the problem of a semi truck crossing a car’s line of travel.

Why Can’t Computers Think Like Humans?

The issue is the same for all self-driving cars: how do you make them drive at least as well as human drivers, and better than humans the rest of the time? People in Phoenix report that they are sick to death of Waymo’s self-driving cars faltering at intersections, unsure what to do. Because they have been programmed to err on the side of caution at all times, they wait an inordinate amount of time at stop signs, checking that the path ahead is clear before moving forward. Drivers waiting behind get exasperated and start blowing their horns or zooming around the obstructionist vehicles, creating unsafe traffic situations for others. It’s kind of silly to blow your horn at a computer that has no auditory input, but there you are.

It’s not that the Tesla Autopilot system doesn’t see trucks crossing in front of it. The issue is that it has been programmed to ignore them. As we drive down the road, our brains easily distinguish between a tractor trailer and an overpass, but that is difficult for a computer to do. If it slammed on the brakes every time it approached a highway overpass, that would not be good. To the car’s sensors, a truck trailer crossing the road and a highway overpass look too similar, so the software is tuned to ignore both, with predictable results.
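
To make that tradeoff concrete, here is a deliberately simplified sketch of the kind of filter described above. It is not Tesla’s actual code; the `RadarReturn` fields and thresholds are hypothetical, invented only to show how a rule that suppresses overpass-like returns can also suppress a trailer sitting broadside across the road.

```python
# Illustrative sketch only. None of this is Tesla's code; the fields and
# thresholds are hypothetical, chosen to show how filtering out "overpass-like"
# radar returns can also filter out a crossing trailer.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float             # distance to the detected object
    closing_speed_mps: float   # how fast the gap to it is shrinking
    lateral_speed_mps: float   # sideways motion across our path
    height_m: float            # estimated height (radar is coarse here)

def should_brake(ret: RadarReturn, ego_speed_mps: float) -> bool:
    # A large return that is not moving along our direction of travel looks much
    # the same whether it is a bridge deck or the side of a crossing trailer.
    stationary_along_path = abs(ret.closing_speed_mps - ego_speed_mps) < 1.0
    looks_overhead = ret.height_m > 3.0

    if stationary_along_path and looks_overhead:
        # Dropping these avoids phantom braking under every overpass,
        # but a tall trailer broadside to the car lands in the same bucket.
        return False

    time_to_collision_s = ret.range_m / max(ego_speed_mps, 0.1)
    return time_to_collision_s < 2.0  # brake only when impact looks imminent

# A crossing trailer 60 m ahead: the gap closes at roughly ego speed, it is tall,
# and it moves only sideways, so the filter above waves it through.
print(should_brake(RadarReturn(60.0, 29.0, 8.0, 3.5), ego_speed_mps=29.0))  # False
```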

Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University, has some insight into this situation. Aside from Tesla itself, Carnegie Mellon is arguably ground zero for autonomous driving development. He tells The Verge that in most road situations, there are vehicles to the front, back, and sides, but a perpendicular vehicle is much less common. The algorithms using the camera output need to be trained to detect trucks that are perpendicular to the direction of travel.

“Essentially, the same incident repeats after three years,” Rajkumar says. “This seems to indicate that these two problems have still not been addressed.” Machine learning and artificial intelligence have inherent limitations, he explains. If sensors “see” what they have never or seldom seen before, they do not know how to handle those situations. “Tesla is not handling the well-known limitations of AI,” he added.
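
To make “the algorithms need to be trained on perpendicular trucks” a little more concrete, here is a minimal sketch of one standard way a rare case is given more weight during training: oversampling it. The labels and counts are invented for illustration, and this is not Tesla’s data pipeline or training code.

```python
# Illustrative sketch only: oversampling a rare class so a detector sees it more
# often during training. Labels and counts are made up; not Tesla's pipeline.
import random
from collections import Counter

labeled_frames = (
    [{"label": "vehicle_ahead"}] * 9000      # common situations dominate the data...
    + [{"label": "vehicle_adjacent"}] * 900
    + [{"label": "vehicle_crossing"}] * 30    # ...the safety-critical rare case barely appears
)

def oversample(frames, rare_label, target_count):
    """Duplicate examples of the rare class until it reaches target_count."""
    rare = [f for f in frames if f["label"] == rare_label]
    needed = max(target_count - len(rare), 0)
    return frames + [random.choice(rare) for _ in range(needed)]

balanced = oversample(labeled_frames, "vehicle_crossing", target_count=900)
print(Counter(f["label"] for f in balanced))  # the crossing class now appears 900 times, not 30
```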

Some Input From The Real World

Twitter user GreenTheOnly has some interesting input on this situation, which he posted on March 13, nearly two weeks after Jeremy Banner died.

He reports that his Tesla, running firmware version 19.8.1, captured 25 seconds of video showing a truck crossing his path and what happened afterward; the clip is available in a subsequent tweet.

Driver Overconfidence

In the past, Tesla CEO Elon Musk has blamed crashes involving Autopilot on driver overconfidence. “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” he said last year.

That may be true of the current generation of Autopilot-equipped cars, but it is not nearly good enough for the level of autonomy Musk boasted about during the recent Autonomy Day event. There he said that every Tesla equipped with the latest autonomous driving hardware and the company’s new self-driving computer would soon be able to operate with almost no input from a human driver. “A year from now, we’ll have over a million cars with Full Self Driving computer, hardware, everything.”

He went on to say, “We expect to be feature complete in self driving this year, and we expect to be confident enough from our standpoint to say that we think people do not need to touch the wheel and can look out the window sometime probably around … in the second quarter of next year. And we expect to get regulatory approval, at least in some jurisdictions, for that towards the end of next year. That’s roughly the timeline that I expect things to go on.”


Skeptics Abound

There are plenty of skeptics. Writing in Forbes, Lance Eliot, a recognized authority in artificial intelligence and self-driving cars, says, “To some, it would be like running a marathon, which the best in the world can do in about 2 hours, and suddenly suggesting that you’ll be at the finish line in just thirty minutes, somehow skirting past all known laws of nature and physics. It’s a jaw dropping kind of declaration, especially if there isn’t any substantive evidence showcased to support a potential world-record-breaking projected achievement. At this point, it’s a wait-and-see status.”

The issue for Tesla, and for all self-driving systems, is one of false positives. Radar returns are sometimes ignored by the vehicle’s software precisely to avoid false positives, Rajkumar explains. Without that filtering, the radar would “see” an overpass and report it as an obstacle, causing the vehicle to slam on the brakes. On the computer vision side, the camera algorithms run into the other problem described above: they need to be trained to recognize trucks crossing perpendicular to the direction of travel, a situation they rarely encounter.

Autopilot Team Reshuffle?

We may never find out whether the death of Jeremy Banner is in any way connected to rumors on Reddit that Elon is reshuffling the Autopilot team and personally taking over its management. (Note: he has already led a weekly Autopilot team meeting for a few years.)

Remember what happened after the Joshua Brown fatality. Within months, Tesla and Mobileye went through a messy divorce, and Tesla had to spend a good chunk of time building its own Mobileye-like system. In this latest case, what will the ramifications be for Tesla and the world?

Fatalities & Robotaxis

No firm conclusions can be drawn at this point, except to say that the final report from the NTSB could put a crimp in Musk’s promise that robotaxis will turn Tesla into a $500 billion company within a few years. If regulators get cold feet about approving the use of self-driving Teslas for revenue service, that aspect of Musk’s long-term plans may be put on hold indefinitely.

Make no mistake: the long knives are out, ready to stab Tesla in the back if the opportunity presents itself. Because Tesla continues to lead the EV revolution, a serious stumble could set that revolution back years or even decades, especially in the US, where fossil fuel interests with unlimited resources control the federal government and many states. We know one key component of any plan to reduce carbon emissions is to electrify everything, starting with the transportation sector. It is important, in my opinion, that the desire for self-driving cars not delay that process.


Steve Hanley

Steve writes about the interface between technology and sustainability from his home in Florida or anywhere else The Force may lead him. He is proud to be "woke" and doesn't really give a damn why the glass broke. He believes passionately in what Socrates said nearly 2,500 years ago: "The secret to change is to focus all of your energy not on fighting the old but on building the new."
