
Published on July 2nd, 2016 | by Zachary Shahan


Tesla Model S Autopilot Crash Gets A Bit Scary … + Strong Signs Of Negligence


Update: For more facts and a timeline, see: Tesla Autopilot Fatality — Timeline & Facts.

A lot more details have surfaced regarding the Tesla Autopilot death that recently occurred in Florida. Unfortunately, they aren’t very positive for Tesla, nor for how the driver was apparently using Autopilot.

(Update: an extended quote that was here has been removed to respect the privacy of the person who published it on a public forum but didn’t want it reposted elsewhere. The gist of it was, though, that colleagues of the driver were quite certain that he was on his computer at the time of the accident.)

Sad news, but also not a good sign that the driver was probably working on his computer while driving (with Autopilot on). The reports below indicate that he might not have been working, but rather watching Harry Potter. Either way, it sounds like he probably wasn’t paying close attention to the road.

Furthermore, one woman who was driving on the same road said that she was going 85 mph when he quickly drove past her. We don’t know for certain that he was speeding, though, since there is nothing in the police report or Tesla’s statement regarding that.

Tesla Autopilot crash in Florida. Images by Rashomon/Google.

All of these points bring up the concern that people (even former Navy SEALs) get complacent with Tesla Autopilot on. This is something that concerned me while testing out Autopilot for the first time. At first, I was very concerned about letting go and was still very attentive, but after a few minutes, even in thick and fast California traffic, I relaxed a lot, and while in conversation with others in the car, my mindset shifted more to that of a passenger. It then hit me that I wasn’t paying close attention to the road. If this happened to me after just a few minutes of driving on Autopilot, I imagine it really happens to people who own a car with Tesla Autopilot and start to trust it more than they should.

And then, there’s the really concerning bit about the technology itself. Apparently, even after having its roof torn off, the Tesla Model S kept driving, through fences and such, until crashing into a pole. Not the kind of thing that’s going to sell more Autopilot Teslas, get regulators to relax, or boost Tesla’s stock.

Naturally, as Tesla’s disclaimers say, Autopilot is in beta and drivers are supposed to remain alert and ready to take over at any time. Nonetheless, this is a rather chilling story.

Below is more information on the story from Steve Hanley at Gas2.


New Details About Fatal Tesla Crash Emerge

UPDATE, July 2, 9:00 am: According to Automotive News, “There was a portable DVD player in the vehicle,” said Sergeant Kim Montes of the FHP in a telephone interview.

The story about the man who died in a crash while driving his Tesla Model S in Autopilot mode on a highway in Florida is all over the news. When the story broke a few days ago, I first assumed it happened this week. I was surprised to learn later that it actually happened on May 7. I do this stuff for a living (sort of) and so I monitor news about Tesla and Elon Musk fairly closely. Until Wednesday, there was not a single report anywhere on the internet about a fatal crash involving a Tesla back in May. Then Tesla apparently finally found out about the crash and NHTSA got involved.

First Tesla Autopilot death

At first, details were sketchy. We heard that a tractor trailer was making a left turn at an intersection on a four-lane highway with a wide median strip. There were no traffic lights at the intersection. These sorts of road junctions are quite common in Florida and other parts of the country. Whether the tractor trailer was turning left or executing a U turn is still unclear to me.

In any event, my first reaction was that a large vehicle like that should not have pulled out in front of oncoming traffic. Perhaps the truck driver was partially at fault? If you are building an autonomous driving system, how do you program it to anticipate every stupid thing a human being is capable of, whether it is a truck turning into the path of the car or a clueless pedestrian stepping off a curb into traffic while engrossed in the morning newspaper and drinking a flat white latté?

Now, details are beginning to emerge and they are disturbing. First, the driver of the Tesla was Joshua D. Brown, of Canton, Ohio. He was 40 years old and a Navy SEAL for 11 years. He left the Navy in 2008, according to the Pentagon. He was a very tech-savvy fellow. He was the founder of his own internet network and camera company, according to a report in the Dallas Morning News. He posted videos about his Tesla experiences on several occasions, including one showing how the Autopilot system in his car once saved him from a potentially dangerous collision when a white commercial truck swerved suddenly into his lane.

A woman driving on the same highway and in the same direction as Brown claims she was driving 85 mph when Brown’s Model S flew by her at a high rate of speed. That information is not included in the official traffic accident report filed by the Florida Highway Patrol, but is surely something known to Tesla, as it has access to all of the data stored in the car’s computer system. It is included in a story at Teslarati.

The driver of the tractor trailer, Frank Baressi, age 62, told the Dallas Morning News in a telephone interview that the Tesla driver was “playing Harry Potter on the TV screen.” He added, “It was still playing when he died.” Baressi said, “he went so fast through my trailer I didn’t see him.” He didn’t see the video playing, but claims he could hear the movie still playing when the Tesla finally came to a stop several hundred yards up the road. Tesla Motors says it is not possible to watch videos on the Model S touchscreen, but word is that Brown had a portable DVD player in the car.

The really scary part is that not only did the sensors in Brown’s car fail to detect the tractor trailer directly in front of it, the car itself continued to drive down the highway for several hundred yards after its roof was sheared off. It finally came to a stop in the yard of a home owned by Bobby Vankavelaar. He told ABC Action News that the Tesla traveled “hundreds of yards from the point of impact, through a fence into an open field, through another fence and then avoided a bank of trees before being unable to swerve and miss a power pole that eventually stopped the car a few feet away.”

So, is this a ‘perfect storm’ event? Someone driving much too fast while distracted, coupled with a truck driver who looked but didn’t see a car approaching in the opposite direction, in unfortunate lighting for a white truck? And what impact will this have on the advent of autonomous driving technology? Mike Harley, an analyst at Kelley Blue Book, says systems like Tesla’s, which rely heavily on cameras, “aren’t sophisticated enough to overcome blindness from bright or low contrast light.” He says to expect more deaths as the autonomous technology is refined.

Karl Brauer, a senior analyst with Kelley Blue Book, said the crash is a huge blow to Tesla’s reputation. “They have been touting their safety and they have been touting their advanced technology,” he said. “This situation flies in the face of both.”

Really, Karl? We said in a previous post that fatalities will continue to occur even when more cars start driving themselves. The difference is that, statistically, the likelihood of a fatal accident will be less for autonomous cars than cars operated by human drivers. This unfortunate incident occurred after Teslas worldwide had accumulated more than 130 million fatality-free miles while driving in autonomous mode. Statisticians say a death every 100 million miles is normal. So, measured in miles per fatality, Teslas on Autopilot have already gone about 30% farther than the average human-driven car.
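
For readers who want to check that arithmetic, here is a minimal sketch in Python using only the figures quoted above (treating this crash as the single fatality in roughly 130 million Autopilot miles, against the cited baseline of one death per 100 million miles):

```python
# Rough fatality-rate comparison using the figures quoted in the article.
# Assumption: one fatality (this crash) in ~130 million Autopilot miles,
# versus a baseline of roughly one fatality per 100 million human-driven miles.

autopilot_miles = 130e6
autopilot_fatalities = 1

baseline_rate = 1 / 100e6                                  # deaths per mile
autopilot_rate = autopilot_fatalities / autopilot_miles    # deaths per mile

print(f"Human baseline: {baseline_rate * 1e8:.2f} deaths per 100M miles")
print(f"Autopilot:      {autopilot_rate * 1e8:.2f} deaths per 100M miles")

miles_per_fatality_ratio = (autopilot_miles / autopilot_fatalities) / 100e6
print(f"Autopilot miles per fatality are ~{miles_per_fatality_ratio - 1:.0%} higher")
```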

Elon Musk continues to remind people that Autopilot is still only an aid. Drivers must remain alert, aware, and ready to take control at any time. My wife says autonomous cars are like boats towing water skiers. Most states require there be two people in the boat — one to steer and one to watch the skiers. She thinks if you are going to use computers to drive your car while you nap or watch videos, another responsible adult should be required to watch the road ahead.

Her idea is not as fanciful as it may sound. At the beginning of the automotive age, drivers entering a city were required to have a person on foot walk in front of the car sounding a klaxon to warn the citizenry that a motorized vehicle was approaching. Will regulators require something similar now that we know our machines can fail to protect us from all danger?

My guess is that the impetus of technology cannot be denied. Absent deliberate malfeasance on the part of a manufacturer, the world will accept that there is always an element of risk. The technology will get better. Within a few years, when IHS Automotive says 20 million autonomous cars will be sold every year worldwide, the chances of a fatal accident occurring in a self-driving car will probably be closer to once every 500 million miles.

Tesla is very lucky this was a single car accident. If another driver had died as a result of being impaled by a speeding Tesla with its roof sheared off, the odds are an army of trial lawyers would descend on the victim’s family, begging for a chance to be the first to sue Tesla. In the end, the fate of autonomous driving technology may not be determined as much by regulators as by the courts.

Screenshot via Teslarati





About the Author

Zach is tryin' to help society help itself (and other species) with the power of the typed word. He spends most of his time here on CleanTechnica as its director and chief editor, but he's also the president of Important Media and the director/founder of EV Obsession, Solar Love, and Bikocity. Zach is recognized globally as a solar energy, electric car, and energy storage expert. Zach has long-term investments in TSLA, FSLR, SPWR, SEDG, & ABB — after years of covering solar and EVs, he simply has a lot of faith in these particular companies and feels like they are good cleantech companies to invest in.



  • Jonathan Cable

    I agree completely. It sounds like it is time this guy loses his CDL.

  • Ken

    Wrong. I have clearly pointed out that I have not taken anything at face value while proving that you don’t even understand how passive data works in the Tesla auto-pilot system.

    No one ever said Musk was God or perfect so you trying to claim that only shows your own dishonesty or your inability to use your own brain.

    When you actually do some research and maybe start to comprehend how these things actually work, maybe then you will have some credibility.

    • Bob_Wallace

      Ken, you need to paint over 10 and 11 on your dial.

      Try to keep your comments down to 4 or 5. Save the extra volume for things that matter.

  • Bob_Wallace

    Ken, I taught statistics at the university level.

    There are two of us trying to explain to you that you are misinterpreting the data.

  • CaptD

    Having never driven a Tesla nor having any technical knowledge of its “radar” warning system, I’d like to ask anyone that might know if putting 3 corner radar reflectors on the sides of semis would give them a larger radar cross section?
    Ships at anchor often hoist one up the mast to increase their radar return.

  • Yes, that is in (and bolded in) the update article linked at the top, and I think the TreeHugger article was the reference for it there.

  • John Edward Azhderian

    A computer cannot think — it has to be programmed to perform a task like driving a car. You are looking for a panacea to eliminate all auto accidents, which is not going to happen. Remember, computers are built and programmed by humans, not aliens from outer space or by God. Microsoft Windows is full of bugs which I encounter on a regular basis, and Google Chrome has issues too.

    • Bob_Wallace

      What you say is roughly correct. But computers don’t get drunk, fall asleep, text and do the other human things that cause about 90% of all accidents.

      Also computers can do things like observe 360 degrees around the car at the same time, use IR and radar to “see” through the dark and fog. Things humans cannot do. Computers will get better than humans for the “last 10%” of all accidents.

      We may never reach 0% but we should approach it very closely. Each time a detection problem is encountered a solution will likely be worked out and that solution, if software, can be installed on all EVs in a very short time (minutes to hours). Cannot do that with humans.

    • “You are looking for a panacea to eliminate all auto accidents which is not going to happen.”

      –I’m not looking for that. But I think it’s safe to say that computers are going to end up being better drivers than humans. And that we will transition to self-driving cars at some point, with steps along the way.

    • Joe Viocoe

      Personal computers/smartphones aren’t designed with reliability as the top priority. They are designed for interoperability with applications from many other companies. You can install millions of potential apps, and even have 3rd party bloatware already installed. Code compatibility is top priority. All of this is the root cause of instability. Because the OS maker puts the app marketplace above the stability.

      In critical systems…… they usually run straight linux or unix, don’t allow any 3rd party software, and only perform a set task. These systems run missile defense, satellites, aircraft, medical devices, etc. These are computers that don’t need to be rebooted for years sometimes, unless scheduled for updates.

  • Niki Kramer

    That “letter from a friend” you posted… did you get rights to that letter or did you simply copy and paste it from a forum without any permission?

    • Ken

      Why are you making a nasty, accusatory post about the author and his good friend with no information?

      Are you ignorant?

      • Niki Kramer

        Because it was my post that he quoted. That is why.

        I was informed that he copied and posted it on his story thru the same thread that he trolled to get “information”.

        Have some crow.

        • Hey, sorry, am just seeing this. I think someone misunderstood about whether or not we are “friends” (not that I don’t think you seem like a nice person, just that I don’t think I’ve ever actually seen you on there before and we haven’t interacted).

          Just as a matter of note in case it’s not clear, that’s a public forum and the media has the right to use information published there (or elsewhere) that is relevant to the news. Naturally, as always, I linked to your post as the source.

          Now, I’m gathering you don’t want this information shared here for some reason, which I don’t quite understand since you posted it on a public forum, but since that is the case, I’ll go ahead and remove it.

          Apologies if it was a problem for you in any way. But I’d also advise against publishing info on a public forum if you don’t want it out in the public, especially about newsworthy topics.

        • Ken

          You just proved you would fail a 4th grade test in reading comprehension.

          Zach never said he was your friend. You read it wrong because you are apparently not very bright.

          You also don’t seem able to comprehend that posting on public forums makes your post usable by the media.

          Again, not very bright.

          How’s that crow taste, genius?

          • Bob_Wallace

            No name calling, Ken. It’s a bannable offense.

      • Niki Kramer

        Oh, and I am not friends with him. I do not know this author. So, he lied about that as well.

        • Ken

          He never said he was your friend. You read it wrong.

          You need to apologize for your mistake.

          I am waiting.

          • Bob_Wallace

            Ken, you are going overboard.

            The issue had already been addressed. There is no need for you to pile on multiple times.

          • Ken

            I only responded once to every comment I received.

      • Niki Kramer

        I am the person that posted that letter, THAT is how I KNOW. The author’s “information” was copied off of a forum post and then posted as a part of his blog…from a “friend”. HA. I don’t even know this writer. I was given a heads up from other forum posters.

        I wonder how many other posts are plagiarized or lied about.

        • Ken

          He never said he was your friend. You read it wrong. Get someone to help you read it more carefully and then apologize for your mistake.

          I’m waiting.

    • Niki Kramer

      Ken – no, I am not ignorant.

      I was the one that received that letter…..Not Zach.

      Zach trolled the TMC forum and reposted it.

      • Ken

        Wrong. You misread the article. Try reading it again, carefully. He never said you were his friend.

    • Bob_Wallace

      You are, I assume, talking about the part in italics that begins – “I am very close to a few of his former EOD coworkers….”

      That was posted on a public forum. This article copies it over and gives the link to where the comment can be found.

      There is no permission needed to quote a reasonable amount of content from a public source. This is common journalistic practice.

      Here is what Zach wrote – “here’s a note from a friend of the deceased driver’s former coworkers:”

      He did not claim that the letter was to him. The word “note” is a link to where your comment was posted.

      You’ve knotted up your bloomers unnecessarily.

      And “trolled” has a meaning on the web other than how you use it here.

  • Bob_Wallace

    Some “normal” cruise controls have speed limits at which they disengage, some do not.

    Tesla’s autopilot allows higher speed driving on divided highways.

  • ZMANMD

    A lot of people are stating the autopilot feature does not work at more than 5 mph above the speed limit. This speed limit is only applicable to non-divided highways. On divided highways with a median (such as the one in the crash) the driver is free to select whatever speed he wants to risk getting ticketed for. Watching Harry Potter (or any other movie) is not something a driver should ever do on the road. That speed limit loophole may soon be closed by a new update on all Teslas. Now the lawyers will slap a limit on all speed, regardless of road type. So soon the autopilot cars will be in the slow lane with all the semi-trucks. There’s always one who takes it too far and ruins it for everyone else.

    • Bob_Wallace

      The autopilot feature may not work at speeds higher than ~90 MPH. Someone with a Tesla might be able to confirm that. My info comes only from a comment.

      • Ken

        Yes, you are right. 90 is the max.

  • SOPA_NOPA

    While this raises some interesting issues regarding failsafes for autonomous vehicles, I’m not sure I find it particularly alarming that the car travelled “several hundred yards” after having the roof sheared off.

    What exactly is the expectation here? That it should violate conservation of momentum and freeze in place? There is a minimum physical limit to this based on the speed of the vehicle and its braking ability. Given that this guy was apparently hauling ass, and keeping reaction times and damage to the vehicle in mind, this might be a couple hundred feet with a driver doing all he can under good conditions.
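
To put a rough number on that physical limit, here is a back-of-the-envelope kinematics sketch in Python; the ~90 mph speed, ~1.5 s reaction time, and ~0.9 g braking are illustrative assumptions, not figures from the crash report:

```python
# Back-of-the-envelope stopping distance from highway speed.
# Illustrative assumptions only: ~90 mph initial speed, ~1.5 s driver
# reaction time, ~0.9 g of braking on dry pavement.

MPH_TO_MPS = 0.44704
FT_PER_M = 3.2808
G = 9.81  # m/s^2

speed_mps = 90 * MPH_TO_MPS            # ~40 m/s
reaction_time_s = 1.5
decel_mps2 = 0.9 * G

reaction_dist_m = speed_mps * reaction_time_s
braking_dist_m = speed_mps**2 / (2 * decel_mps2)

print(f"Reaction distance: {reaction_dist_m * FT_PER_M:.0f} ft")
print(f"Braking distance:  {braking_dist_m * FT_PER_M:.0f} ft")
print(f"Total:             {(reaction_dist_m + braking_dist_m) * FT_PER_M:.0f} ft")
```

Even an alert driver braking hard from that speed uses roughly 500 feet; with nobody on the brakes, the car simply coasts, so several hundred yards is physically unremarkable.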

    And what is the expectation for a dead, headless human driver making a controlled stop? At that point the vehicle may have been out of control by definition, but there is (sadly) probably some interesting data for Tesla in analyzing what the vehicle was trying to do immediately after impact, because it may be enough to determine if it was trying to stop or trying to continue driving, which would give them guidance on determining when system integrity has been compromised to the point that the car needs to return control to the driver or just stop.

    It would have been genuinely alarming if the thing had kept driving until the battery went dead.

    Also, count me as shocked (shocked) that it is possible to set the autopilot to exceed the posted speed limits.

    • Bob_Wallace

      300 to 400 feet.

      If the driver reacted prior to the crash then the vehicle should not have proceeded on under power. At least nothing more than a crawl. The cruise control would have been canceled and it’s unlikely anything pushed down on the accelerator after the crash.

      (But rule nothing out at this time.)

      I wonder if the car is programmed to stop after air bags deploy?

      Looking forward to reliable information….

      • Joe Viocoe

        Do we even know if airbags deployed when it struck the trailer, or when it struck the pole?

      • ZMANMD

        Front air bag deployment is triggered by front bumper impact, which never occurred, so the car had no reason to try to stop. The fence poles it hit after running off the road are too small to provide enough G forces to trigger air bags in a vehicle as heavy as a Tesla Model S. Also, the now dead driver could have slumped down on the pedals, keeping the car under partial power without providing any steering guidance. The final impact with a utility pole would have triggered the airbags. The car can never override driver input, even if it’s a dead driver.

  • Ken

    People who do videos like this should be found and heavily prosecuted.

    • Joseph Dubeau

      You can’t control people’s behavior. Are they breaking any laws?

      • Bob_Wallace

        They would be in California. People get ticketed for messing with their GPSs while stopped at traffic lights.

        • Joseph Dubeau

          I don’t see the police here in L.A. enforcing traffic laws.
          Enforcement is one thing, but are these people breaking the law?
          Not so long ago I saw a video of a guy sleeping in his Model S during morning rush hour. Did anybody give him a ticket?

          • Ken

            If a cop saw him, he could give him a ticket for reckless driving.

          • Joseph Dubeau

            Have you driven in LA? Driving is a sport. Nobody is saying anything is wrong with this behavior. They are showing off on youtube.

            “If a cop saw him, he could give him a ticket for reckless driving.”
            You would hope so. I’ve seen people run a red light in front of a cop many times. Illegal u-turns, speeding, and lane changes without any notice.

          • Ken

            I have certainly seen people holding a cell phone and talking/texting, which is illegal and dangerous and the police doing nothing about it.

          • Joseph Dubeau

            Look, nobody is going to chase down a Model S to check and make sure they are paying attention to the road. If you try to enforce any law, you only make the problem worse. Because all you’ve done is make it cool.

          • Ken

            Police can’t enforce laws everywhere but the Model S auto-pilot is low priority because it is already safer than a human driver.

          • Joe Viocoe

            Let a black or hispanic friend drive you around for a few weeks…. you’ll start to see a bit more enforcement.

      • Ken

        It depends on the state. In Vegas you can get a ticket for putting on lip balm – while just stopped at a light.

        “In Nevada drivers caught with their hands anywhere but on the wheel were liable to be ticketed. [The ordinance] states that when a person is operating a vehicle they must provide full attention to the driving so that it won’t render that action to be unsafe.”

        Since that also agrees with Tesla’s simple and clear instructions, you wouldn’t have much chance in court even if you said you were in auto-pilot.

        Police everywhere need to be educated about exactly what auto-pilot is so they do not think the above is okay.

        That said, I constantly see people holding and talking on their cell phones and the police do not pull them over even though this causes many thousands of deaths a year.

  • hybridbear

    I’m surprised at the lack of focus on the semi driver who caused the crash. If he hadn’t turned out in front of the Tesla there would be no news story. He should be the one in trouble, not Tesla…

    • Bob_Wallace

      The truck driver is (apparently) in trouble. The Tesla driver was apparently not following instructions and monitoring the road. The collision avoidance system seems to have not detected the trailer.

      Plenty of blame to share.

      • Joe Viocoe

        One broke the law, one broke the law and an agreement, one broke public expectation.

        • Bob_Wallace

          I pretty much agree with you. But there should be no public expectation that the Tesla collision avoidance system is “finished”.

          Tesla has been very clear that the system is a beta version and the road should be constantly monitored by the driver.

          • Joe Viocoe

            “there should be no public expectation that the Tesla collision avoidance system is “finished”

            There SHOULDN’T be, correct. But there is. NHTSA and Tesla both talk about levels of autonomy. But the public simply won’t read past headlines of WSJ, CNN, Fox News, Bloomberg, etc.

            That is why when you read these comments… it is the same misinformation being repeated.

          • Ken

            The truth is no software is ever really finished. Beta is an arbitrary term because “final” versions of software are often full of bugs, as any new Apple iOS release proves.

            Some company’s “finished” software is not as good as another company’s beta.

            And “collision avoidance” is also an assist, not an absolute. If it was, it would be called “collision prevention”.

            Just like in airplanes – where auto-pilot is supposedly “finished” – it still requires constant monitoring, alertness and the ability to take over at a moment’s notice.

            The finished auto-pilot on Tesla will be exactly the same. It is not the same as autonomous and even early autonomous vehicles will still need to allow humans to take control for both technical and psychological reasons.

        • ha, yeah…

      • Joseph Dubeau

        “Plenty of blame to share.” I would rephrase this statement.
        Plenty to learn here. I think drivers are already too distracted.
        More safety features are always welcome.
        Perhaps we are not ready for autonomous driving.

        • Bob_Wallace

          Our cars aren’t ready for self-driving and this wasn’t a self-driving car.
          Before we have self-driving cars, detection holes like this one have to be plugged.

          I suspect that before cars start driving themselves we’ll see collision avoidance systems working very close to the 100% level. (They won’t be able to protect from crashes caused by other cars at the 99% level.)

          I’m ready for self-driving cars. Coming back from SF last week I had to stop and nap along the way (5-6 hour trip). I’ve got to drive to town and back this evening and the drive back, well, I’d love to turn it over to the car.

          • Joseph Dubeau

            Country roads are going to be their own separate challenge.
            With safety features fully active, I can see something like auto playback allowing your car to return to and from your home as well as other places you need to drive.

            Within reason sure, you aren’t going down to Big Sur this evening.
            I don’t like highway 1. I’m sure a Model S owner would turn on the AP.

          • Bob_Wallace

            The Big Sur drive makes my last few miles home look like a boulevard…. ;o)

          • Joseph Dubeau

            You are a braver man than me. I get vertigo.
            I grew up in South Texas which is flat.

          • Bob_Wallace

            You’d love my drive home. I should take a video.

            There’s one stretch – on a California state highway – where the road is a narrow two lane. It’s so narrow that at the edge of the outside white line the road drops a couple hundred feet almost straight down.

            There’s no place to install a guardrail.

          • Ken

            Many Model S owners prefer driving themselves on fun beautiful roads like Highway 1. But if you get into bad, slow traffic the AP comes in very handy.

            But you still must be ready to take over – always.

          • Bob_Wallace

            There are lots of times I’d much rather the car drove me down Big Sur. It can get bumper to bumper there. I’d much rather be able to kick back and look at the view.

          • Ken

            Yes, in bad traffic AP is great.

            I have timed my drives through Big Sur when it is empty and it is awesome.

        • ZMANMD

          We do not yet have autonomous flying let alone autonomous driving. Autonomous driving (or flying) requires all vehicles to participate in a traffic network to manage the traffic flow and coordinate decisions. The Tesla is not an autonomous car, but a car with a very sophisticated cruise control. Treat it as such and you will survive. Assume otherwise at your own peril.

          • Joseph Dubeau

            No, not yet, but that’s certainly the expectation.
            For a moment, set aside luxury cars and techno toys.
            Let’s consider the old and handicapped people.
            They need to be able to go to the store, visit a friend or family member, go to the doctor’s office, pick up their medication from the pharmacy, etc. For these people mobility is the most important thing.
            Not performance. Autonomous driving needs to be 100% accurate and 200% safer.

            The problem is auto-pilot is not autonomous driving. It’s driver assistance.
            Putting a warning label on it is not going to stop people from using it as such.
            It is what it is

      • hybridbear

        I agree. It’s a shame, because this could have been prevented in many ways.
        1) semi driver could have not turned in front of traffic
        2) the Tesla driver could have been better monitoring the road

        However, if 1) hadn’t happened, there would have been no accident. The Tesla driver could have been driving without autopilot & monitoring the road & still have been involved in the collision because of the irresponsible driving of the semi driver. That is the point I was trying to make. It’s not like the Tesla driver is the one who made the dangerous maneuver that caused the crash.

    • Ken

      Sadly, this is a typical move of many truck drivers and regular drivers as well – depending on the on-coming traffic to slow down for them.

  • Bob Smith

    So why was this “technology” allowed on public roads although designed to ignore BASIC SPEED LAWS? Also, since so much emphasis is put on the “130 million accident-free miles,” how many of those miles were under 30 mph and how many violated speed laws?

    • Ken

      It is important to be able to override the speed limit on many roads because the flow of traffic is often 10 to 20 miles above the posted speed limit.

      Going slower than the flow of traffic is actually as dangerous as speeding. So Tesla designed this part of the system completely correctly.

      • Bob_Wallace

        In this case it looks like the Tesla may have been traveling faster than other traffic.

        I know that we don’t have all the facts but one person claimed that she was driving 85 and the Tesla zoomed past her.

        Tesla’s system is a work in progress. Before they get to a fully autonomous car they need to match the car’s speed to the traffic, not exceed the flow of traffic.

        • Ken

          It didn’t sound like there was much traffic on this road at the time, so it is hard to say what the flow rate was here, meaning the typical speeds people drive.

          Teslas on auto-pilot actually do drive at the speed of the flow of traffic when there is traffic – which is why they have to be able to exceed the speed limit.

          But this guy was definitely abusing many aspects of this tech.

  • WuestenBlitz

    I get it!!! Tesla autopilot is like me sitting in the passenger seat while my wife is driving. Pay attention at all times or die. BUT unlike my wife driving, where I can’t easily wrestle the controls from her beautiful, yet scarily inattentive self, with Tesla autopilot I can!

    Glorious!

  • dogphlap dogphlap

    When the Tesla went under the trailer it would have sliced off the forward looking camera, without that I would have thought the car would slow down and stop but it did not. It went on to go through two fences, I would have expected it to stop for those, but it did not (maybe it tried but on grass it could not stop before it had penetrated them at which point it would have no reason to slow again, other than the lack of forward looking camera data). Me thinks Tesla should make a small change to the software so that a car without camera data or with triggered air bags would drop out of traffic aware cruise control and come to a stop via the electric rear P brake. Just as well this was not a heavily populated area.

    • Joe Viocoe

      At such great speed, the 300 yards is easily a momentum carry.

      • dogphlap dogphlap

        Do we know what speed he was travelling at ?
        I know a lady alleges he passed her at high speed while she was doing 85mph but Tesla and the authorities must have a figure from the logging equipment in the car.
        He literally did have a recent record of travelling at close to double the posted speed limit in a 35mph zone so it would not surprise me if he was speeding but as Tesla pointed out the car’s software limits cruise to 5mph above the local speed limit (but weirdly my speed sign reading Tesla often misreads an 80kph sign as 30kph and yet I can still use traffic aware cruise control at 80kph, what is going on there ?). He did run some kind of electronics/software/networking company so with unrestricted access to his car’s computers perhaps he had hacked it in someway to overcome that restriction.
        For myself I tend to stick to the traffic regulations since I like to keep my licence, and keep it clean. It is not a higher morality that sees me sticking to the rules, just base self interest.

    • Ken

      The car’s behavior was completely consistent with any car coming to rest from about 90 miles an hour. There is zero evidence that it was still applying power.

  • neroden

    If you’re driving competently, you ALWAYS have room to stop before any obstacle, even one which “jumps out”.

    Seriously, this is a principle of failsafe driving. It’s actually programmed into train safety systems.

    Sadly, most drivers are reckless, drive too fast, and follow much too closely.

  • Griff

    They need to do what Volvo is doing and require you to keep your hands on the wheel, along with other “are you paying attention” cues. If you don’t keep your hands on the wheel and the car believes you are distracted from paying attention to the road, it gives an audible warning and deactivates auto pilot mode.

    • Joe Viocoe

      Um… Tesla has that too.

      • Griff

        Nope, I’ve been lucky enough to drive a Tesla 70D with auto pilot. If you activate auto pilot, you can remove your hands from the wheel and your feet from the pedals and it will still go. If you grab the wheel or hit one of the pedals, the car gives control back to you. Not in a Volvo: if you so much as take your hands off the wheel, auto pilot goes off.

        • Joe Viocoe

          Volvo may have an immediate “retake the wheel”… other automakers may have a static timer… and Tesla has one based on road conditions.

          • Griff

            The new Volvo S90 and upcoming V90 have the exact same thing as Tesla; they monitor the road and actually also look around the road. Volvo claims that theirs will work regardless of whether the road is covered in snow or debris.

            The main point I’m trying to make, is that with Volvo, you must physically keep your hands on the wheel to keep the “assisted driving” (I think that’s what they call it) turned on. Removing your hands, the car automatically believes you’re distracted. Not in Tesla, you’re free to turn auto pilot on and watch Harry Potter on your laptop with your hands completely off the wheel. Auto Pilot only turns off with adverse road conditions, as you previously mentioned. It’s a bad system, look at the guy who was sleeping in traffic….would have been impossible for him to do that with Volvo’s system.

          • Joe Viocoe

            In bumper to bumper traffic… I think sleeping should be allowed. Not much damage at 10 mph. If you’ve got to keep your hands on the wheel, there really isn’t much convenience, and not much point either.

            Volvo is heading in the right direction… and Tesla is already there.
            I don’t think it should be up to the automaker to baby the adult drivers. They are going to push the limits anyway. Even tape objects to the wheel if they can.

          • Griff

            I would agree with you if the technology was further developed, but it’s still new. Currently, I believe those who sleep or are distracted with their Tesla (or any vehicle for that matter) in auto pilot are irresponsible.

            That’s also why Volvo is calling theirs a driving assistant and not an auto pilot mode. It’s technically fully autonomous, but you still need to have some input. Great for traffic, great for the highway. But you still need to pay attention because stuff happens: deer can jump out in front of you (or in Sweden, moose, which is much worse), people can drive erratically and endanger your life through their stupidity, and you as a driver need to be paying attention to avoid injury to yourself and others and be able to take control of the vehicle to avoid an incident if need be… auto pilot or not.

          • Joe Viocoe

            Autopilot is NOT misleading for people who know how it is used in aircraft. Pilots aren’t allowed to fall asleep or watch movies either. They cannot leave the cockpit empty, and they fly with co-pilots.
            Tesla doesn’t market directly to the masses… so most non-Tesla owners only go by what the media is saying. And we all know how the media really tells a story. Tesla is upfront about their capabilities, then the media hypes it up to be more sensational and to drum up the natural reactions of fear.
            Very few non-owners even know how Autopilot really works, but they THINK they do, which is dangerous.
            Perhaps there is nothing wrong with the word… but rather the ignorance of the public.

          • Griff

            I agree, those who know how to use it correctly are fine.

            But that still doesn’t mean you can take your eyes off the road. Honestly, an airplane isn’t the best analogy…seeing as how you won’t find deer or other large animals at 26,000ft where auto pilot can be engaged, and the only thing they have to avoid are other planes moving at 500 MPH that are all constantly monitored by air traffic control. And trying to avoid a bird in a plane going 350+ is like trying to dodge a pebble falling off a dump truck in front of you on the highway.

            As I’ve said, I have a fairly decent understanding of Tesla’s AutoPilot and have been lucky enough to try it out. But the same morons that require “Caution: HOT CONTENTS” warnings on cups of coffee are the same reason there need to be greater safeguards on the technology. Tesla’s system works, and works well…people are just stupid lol

          • Bob Smith

            Aircraft don’t operate without oversight from ATC. Your defense is severely flawed. Just like the “tesla autopilot system”.

          • Joe Viocoe

            And automobiles aren’t produced without oversight from NHTSA. The analogy is apt.

          Only in controlled airspace. Aircraft also have internal traffic avoidance, with transponders in each vehicle.

    • Ken

      You don’t want to do what Volvo does which is try and succeed in running over humans.

      • Griff

        Lol, I think they’ve fixed that. Check out the new system they have in their S90 and upcoming V90

  • M Kashif Bukhari

    It’s a beta version, yet Tesla charges more than $3,000 for this option… go figure

    • Joe Viocoe

      Um.. no.
      The system is two parts. The “always on” safety features that are not “in beta”. And the optional convenience part that is “in beta” and requires confirmation to activate.
      $3000 gives you the whole thing.

  • traumadog

    One point: Tesla may have 130 million fatality-free miles on Autopilot, but I’d be careful about calling it “safer” without knowing the usage pattern.

    Federal statistics quote a fatality rate of 0.54/100 million miles on urban interstates, vs 1.32/100 million miles on urban streets.

    And per the Model S’s manual:

    “Traffic-Aware Cruise Control is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets.”

    My presumption is that the overwhelming majority of those Autopilot miles are what we would call “highway”.
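
To see why that matters, here is a quick illustrative calculation in Python using only the rates quoted in this comment; treating all of the ~130 million Autopilot miles as highway-like driving is an assumption:

```python
# Expected fatalities over ~130 million miles under the two baseline
# rates quoted above (fatalities per 100 million miles). Treating the
# Autopilot miles as mostly highway driving is an assumption.

autopilot_miles_in_100m = 1.3     # 130 million miles

rate_urban_interstate = 0.54      # deaths per 100M miles
rate_urban_street = 1.32          # deaths per 100M miles

print(f"Expected deaths at the interstate rate:   {rate_urban_interstate * autopilot_miles_in_100m:.2f}")
print(f"Expected deaths at the urban-street rate: {rate_urban_street * autopilot_miles_in_100m:.2f}")
```

If the Autopilot miles are mostly highway miles, one fatality in 130 million miles is roughly in line with the interstate baseline rather than clearly better than it.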

  • Joe Viocoe

    The lack of side under-ride guards on trucks in the USA is associated with 250 traffic fatalities annually, ignoring pedestrian and bicyclist fatalities in urban areas. It is standard in Europe, and testing has found it is a benefit from a fuel consumption perspective and reduces turbulence from trucks on other road users. Side under-ride guards would almost certainly have prevented the fatal outcome.

    -Cleantechnica

    If this is true… that is a HUGE problem that we should probably address instead of focusing on autopilot.

    • Bob_Wallace

      Can’t settle for a crash avoidance system that allows the car to hit something.

      If the car can’t go under, don’t go there.

      • Joe Viocoe

        The point is that having panelling under the truck trailer, would have been more visible to both the human eye and the radar… thus NOT allowing it to hit it.

        • Bob_Wallace

          Which makes more sense, changing the environment to allow for the inadequacies of the Tesla system or changing the Tesla system so that it can deal with the world as it is?

          • Joe Viocoe

            Both should be changed.

            Updating Tesla’s because they are, by far, the smaller set. They can change much more rapidly.

            But changing the trailers too, because it would benefit ALL passenger cars on the road, not just autonomous.

            See the “Mansfield Bar”.

          • Bob_Wallace

            I can’t put them in the same basket.

            An adequate collision avoidance system should not permit a vehicle to drive into anything. (Of a meaningful size. Crashing into a mosquito would be OK.)

            If the car can’t see a 40 foot long, 13 foot high trailer sitting in its path the system is not ready for prime time.

            If putting siderails on trailers increased safety, for bike riders let’s say, then we should consider requiring it. Adding side skirts to improve fuel efficiency makes sense too.

          • Joe Viocoe

            An adequate collision avoidance system should not permit a vehicle to drive into anything.

            I will say again…
            The point is that having paneling under the truck trailer, would have been more visible to both the human eye and the radar… thus NOT allowing it to hit it.

            BTW, the “not ready for prime time” is a cliche without meaning. What is “prime time” to you? When it is statistically 50% better than a human?

            Think of a Venn diagram of two circles. Overlapping but different sizes.
            Circle A is all the accidents avoided by Autopilot, Circle B is all the accidents avoided by human drivers.
            The overlap section represents accidents that both could avoid.
            This instance could be exclusively within B, and not A… But there are plenty of fatal accidents that are exclusively within A and not B. Since we know that human error is a huge part of unavoided accidents, this area is potentially huge.
            As long as Circle A has the potential to be larger than Circle B,.. And we do know this since Circle B isn’t getting bigger over time (humans can’t evolve as quickly as computer/machine learning)…we should continue to pursue autonomous vehicles.

          • Bob_Wallace

            Putting paneling under the trailer would likely keep Teslas from hitting trailers in the future. But it wouldn’t stop a Tesla from hitting something else sitting about four feet off the ground.

            There’s no excuse to allow a system which does not monitor the entire “box” into which it is about to drive.

            Is the radar simply not focused up high enough? Radar aimed a few degrees higher than level should have picked up the target early. If the angle to the bottom of the return was not rising rapidly, as it would with a bridge, then the computer should have gone into stop mode.

            There’s a point reached at which the car is close to the last possible moment to brake. An object is in the car’s path. It’s low enough to hit the car. Car needs to stop.

            In this case car did not stop.

            This time it was a trailer. Next time it might be a wind turbine blade. Or a bridge section on wheels. We’re not going to hang Tesla-guards on everything that crosses the road.

            You don’t need to go into probability and Venn diagrams with me. I’m highly numbers driven. What I’m doing here is talking about general public acceptance.

            And I’m certainly not saying that we should not move to self-driving autos. I’m just saying Fix the Damn Problems.

            And, probably, turn off the systems (at least at high speeds) until this problem is fixed. (I can explain mathematically why that’s a good idea…. ;o)

          • Joe Viocoe

            Here is the reason why it isn’t as simple as refocusing the radar to get a perfectly informed data set:

            With any sensor package… for any given second, there are anomalies that don’t fit the rest of the data.
            If 90% of your sensor readings show clear road ahead, the readings that show otherwise may be discarded as anomalous.
            The question is…. can Tesla simply software tune the logic to accept more potentially anomalous readings? What would that do for the rest of the driving?
            Or does Tesla need to add another set of sensors to clear out some of the noise?

            Simply pointing the radar higher would probably create more false positives too.

            turn off the systems (at least at high speeds) until this problem is fixed. (I can explain mathematically why that’s a good idea…. ;o)

            This sets a precedent that we should react before more information is known.
            More importantly, it stops all learning of the system.
            And what happens when someone is killed by a model S driver who would have been on autopilot, but couldn’t because of this hold?

            This time it was a trailer. Next time it might be a wind turbine blade. Or a bridge section on wheels. We’re not going to hang Tesla-guards on everything that crosses the road.

            Huh? What wind turbine blade swings as low as a Tesla?
            We are talking about tractor trailers. Everything you can think of would be carried on top of one of these trailers anyway.

            250 people die every year from this problem.

            NHTSA is well within their scope to do this, like they did for the Mansfield Bar

            “Jayne Mansfield (and Sam Brody & Ronnie Harrison) did not die in vain. Their death led to the National Highway Traffic Safety Administration requiring all semi truck trailers to be equipped with a DOT Bar. You know it better by it’s other name: the Mansfield Bar.”

          • Bob_Wallace

            Accept a 10% chance you’ll die?

            Sorry. I want my odds to be a lot better. I want the car to see solid objects lower than the car’s roof.

            At this point in time at least alert the human of data conflicts and let them make the decision. Go into crash avoidance mode.

            If you’ve got data conflicts like this happening often then it’s back to the drawing board.

            Tesla could turn off the lane-keeping system at speeds higher than “20” miles an hour but continue to collect data. Keep looking for situations where the human slows, swerves, brakes but the system wouldn’t have.

            Allow lane-keeping only at speeds in which a crash should result in no harm to belted passengers. And require the passengers to be belted even at very low speeds.

          • Joe Viocoe

            Where’d you pull 10% from?

          • Bob_Wallace

            “If 90% your sensor readings show clear road ahead, the readings that show otherwise may be discarded as anomalous.”

            If 10% of my sensors tell me death lurks ahead I do not want to go there.

          • Joe Viocoe

            That is not translated into “a 10% chance you’ll die”.

            That just means that 10% of the readings don’t match up with the rest. NOT that 10% of the readings are sure there is an obstacle. The 10% could be reading nonsense.

          • Bob_Wallace

            10%, 30%, 50%, 90% – we’ve thrown out numbers in a sort of random fashion.

            Here’s the point, IMO. If you’ve got data coming in from one set of sensors that says shit is about to happen and from one or more set that says “won’t” you pay attention to the set that’s issuing the warning. You do not test to see which is right by driving full speed ahead.

            There was a very large, solid object in front of this car. The car’s system did not stop and let the driver take over.

          • Joe Viocoe

            Not solid, empty from the shoulders down. But that isn’t the point. No part of the sensors said, “shit is about to happen”. It said there was an overhead sign. And if it were to just brake every time it gets that reading… the whole system would not even work as well as it does today.

          • Bob_Wallace

            A sign across the lane, four feet off the pavement?

          • Joe Viocoe

            *facepalm*… no. The sensors cannot tell the exact height of the “sign”, since the computer would have to know the exact curvature of the road surface. It is easy to fool radar like this, at that distance and speed. Which is why Tesla explicitly said that it cannot determine everything.

            There is no such thing as perfect information. The car didn’t have 100% of the information. Neither do human drivers… lots of blind spots, blinking, yawning, tuning the radio, and poor reaction time.
            It miscalculated a window of a few feet, from a distance and high speed.

            Yes better sensors can and will eventually fix the gap… but requiring trailers to have paneling would fix it too, and save the lives in non-autonomous accidents too.

          • Bob_Wallace

            Look at the picture. It’s Florida. The road is flat.

            At 200′ there’s something up ahead that might or might not be a sign. Could be something lower.

            At 100′ whatever it is hasn’t moved up.

            At 50′, whatever it is is level with the windshield wipers.

            At some point between 10′ and 200′ the system should have become “concerned”.

            The system has to be able to “clear the box” before the car enters it. We can’t go around hanging deflectors off everything that has a space under its belly.

          • Joe Viocoe

            Like I said… several times…. the sensors aren’t that accurate to know the difference of a few feet.

            Also, I’ve seen the photos, and I’ve driven in Florida for over a decade… it isn’t that flat. Slightly declining road surfaces are enough to mask a few feet of clearance.

          • Bob_Wallace

            Then Tesla needs to turn off the system and switch over to LIDAR.

          • Joe Viocoe

            Possibly, but LIDAR has drawbacks too.

          • dogphlap dogphlap

            On Google Street View the road has a long straight gentle down hill slope (from the Tesla’s view point) to the point of impact, not a lot but as you say it is Florida. The view is not obscured in any way by trees or anything else and the divider is wide and flat enough that if there were no other alternative you could take to that wide strip of grass with some chance of surviving. Of course to start taking desperate measures you first have to look out the windscreen and be aware of what is about to go down. Applying the foot brake would probably have helped.

          • Bob_Wallace

            Stopping distance of the P85 is 160 feet at 70 MPH.

            http://www.caranddriver.com/reviews/2014-tesla-model-s-60-full-test-review

            If you’re 200 feet from a 40 foot trailer the radar should not be misjudging it to be a sign 17 feet in the air.

          • Joe Viocoe

            It doesn’t matter how wide or long the trailer is… it is about the resolution of the lower edge of the trailer. As the distance closes… the angle of the radar would have to be higher to continue to accurately see that edge… so it really it still doesn’t have perfect information, even up close.
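
To put rough numbers on that, here is an illustrative geometry sketch in Python. The sensor height, trailer clearance, and sign height are assumed values chosen only to show the scale of the angles involved, not measurements from this crash:

```python
import math

# Elevation angle from a bumper-height sensor to the bottom edge of an
# obstacle, at several ranges. All heights below are illustrative assumptions.
sensor_height_m = 0.6        # assumed radar height on the car
trailer_underside_m = 1.2    # assumed clearance under the trailer (~4 ft)
sign_underside_m = 5.2       # assumed clearance under an overhead sign (~17 ft)

for range_ft in (300, 200, 100, 50):
    range_m = range_ft * 0.3048
    trailer_deg = math.degrees(math.atan2(trailer_underside_m - sensor_height_m, range_m))
    sign_deg = math.degrees(math.atan2(sign_underside_m - sensor_height_m, range_m))
    print(f"{range_ft:>3} ft: trailer edge {trailer_deg:4.1f} deg above sensor, "
          f"sign edge {sign_deg:4.1f} deg")
```

At 200 feet the trailer’s lower edge sits only about half a degree above a bumper-height sensor while an overhead sign sits several degrees up, so a slight grade or a bit of pitch can blur the distinction at range, even though the gap opens up quickly as the car closes in.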

          • Frank

            Elon is a problem solver. This was a Tesla driven by a Tesla fan. I don’t think he could stop himself from wondering how this could be prevented even if he wanted to, and I bet he does, because people are going to ask about this problem and he will want to be able to give a better answer than “you get decapitated and die.”

            One other thing that is rare, but very dangerous, is hitting a moose at high speed. They weigh a lot, and most of that weight is on long skinny legs at windshield height.

          • Bob_Wallace

            You might be surprised how common hitting deer is. Future self-driving cars are probably going to have to be programmed to go into high alert when one deer crosses the road. They are often followed by a second and even a third. (Doe, her yearling, and her fawn.)

            That’s something that people have to learn around here. Often the hard way. One day on the way to town I counted seven fresh kills along the road. They aren’t often fatal for the people in the car but they do total cars.

          • Joe Viocoe

            I look forward to a more connected roadway/vehicle network. Virtual fences that detect large mammals about to approach a roadway, sending out an alert to all drivers on approach.

          • Bob_Wallace

            With our deer the alert would be that deer are in the area. It’s common to see herds feeding along the side of roads in the evening and at night.
            I’m looking forward to cars with IR sensors so they can see warm bodies behind bushes.

            I suspect it will be cheaper to outfit cars than roads. Like the king who wanted to do something good for his people and decided to cover the roads with leather. A sage pointed out that it would be cheaper to cover the bottom of people’s feet.

            And, thus, sandals were born…. ;o)

          • Joe Viocoe

            Made a similar suggestion when people were talking about solar roadways and their ability to spot obstacles and deer crossing the road. It is much easier to outfit vehicles with heads-up displays with infrared overlays.

          • dogphlap dogphlap

            The Tesla is set up to ignore stuff in the upper visual field of the front camera. If it were not, it would false trigger on overpass bridges and above-road signage. In fact there is an overpass that used to cause my Tesla to decelerate every time I drove that stretch of slightly uphill road before that overpass. That annoyance only happened with an earlier software version, so I assume the ignore line has moved down a little lower with the current version. The fatal Tesla was driving on a slight down slope, so couple that with the fact the centre section of the trailer was high and the camera almost certainly saw it but chose to ignore it, since it was set up to assume anything that high was an overpass or overhead highway signage. We know from the Tesla hitting the trailer during a Summon event that the car will not stop for stuff just below roof height; approaching a trailer from a downhill slope would just make things worse.
            I don’t buy the poor lighting story. The car was coming from the west, going east, which means at 3-4pm the sun would be behind the Tesla, ideal lighting conditions really to see even a white trailer against the sky, and the Tesla was on a slight down-slope so there is some possibility the trailer would be set against the darker background of the road surface. Either way I’d bet money the camera saw the trailer but just ignored it as being too high to present a danger.
            Side impact guards would go a long way to reducing those 250 deaths a year from vehicles going under trailers and give the semi-autonomous vehicles a close to 100% chance of realising that that is a stationary vehicle ahead and not an overhead sign.

        • neroden

          Sideguards save the lives of bicyclists in truck-bicycle crashes. The bicycle is never going to have automatic collision avoidance. The truck… well, let’s just say collision avoidance tech isn’t good enough to stop all crashes yet.

          So sideguards should be required. Period. They’re a cheap fix which saves a lot of lives.

  • Outcast_Searcher

    This is only somewhat worse than what I saw in various YouTube videos of Tesla owners driving on crowded, dangerous, undivided roads with stop lights, pedestrians, curbs, curves, etc., as far as paying attention to the road, or keeping their hands and feet ready to react when Autopilot stumbled. And yes, Autopilot stumbled several to many times per hour of such video I watched. (They were making demo videos of what Autopilot is like.)

    Hopefully regulators will notice how people actually behave and strictly ban such testing on public roads for the general public.

    And don’t get me wrong, I’m no luddite. I welcome this technology and hope to have a safe fully automated car within 20 years, so I can maintain my mobility as I age.

    OTOH, this technology seems to provoke behavior like texting on steroids, so I don’t want it used by untrained/inattentive people on public roads.

    • Joe Viocoe

      I’d prefer if people like this never got their license… but in the US, we are more afraid of technology than we are of aggressive teenagers.

      Hopefully regulators would suspend more human drivers’ licenses so they cannot even get behind the wheel of any car.

      • neroden

        I agree. The person who got himself killed in this crash had a record of speeding, including a ticket for driving 64 mph in a 35 mph zone. He should have had his license permanently revoked some years ago.

        • dogphlap dogphlap

          Interesting.

  • Bob_Wallace

    Is there any indication that the car attempted to stop itself?

    A tractor trailer does not make a left turn at high speed.

    • ZMANMD

      That remains to be seen. If the system is a typical traffic-following system like all the rest (Subaru, Honda, Mercedes, etc.), it saw the tractor pass across and assumed a vehicle had crossed the lanes and continued. It may have seen the trailer wheels 50 feet behind the tractor and assumed it was a different vehicle. If the car was travelling as fast as the witness indicated (85+ mph), it would need close to 300 feet to stop once an object is detected. Assuming it saw the tractor pass across while over 500 feet away and the wheels not yet in the road, it should have continued on. At 10 mph, the trailer needed nearly four seconds to clear the road after the tractor had passed. Since the trailer wheels never entered its lane, the tractor must have been moving slowly across the road while the car was covering roughly 125 feet a second, or about 500 feet in those 4 seconds.
      The auto-pilot system apparently cannot detect objects above the road, only objects on the road. This is further indicated by the recent accident where a man summoned his Tesla and it rolled forward into a semi-trailer that was parked in front of it, striking its roof on the trailer hitch area. These two incidents combined indicate that objects above the road but not in contact with it are not captured by the camera nor the radar/laser scanners. Clearly this scenario is beyond the capability of all current collision avoidance systems, but the driver should have seen the trailer over the road for over 5 seconds and slammed on the brakes.
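
      For readers who want to check the arithmetic above, here is a quick back-of-the-envelope calculation (Python; the deceleration and crossing distance are assumed round numbers, not figures from any crash report):

          MPH_TO_FTPS = 5280 / 3600        # 1 mph ≈ 1.467 ft/s

          speed_mph = 85
          v = speed_mph * MPH_TO_FTPS      # ≈ 125 ft/s

          # braking distance assuming a hard ~0.8 g stop on dry pavement (assumed figure)
          g = 32.2                              # ft/s^2
          braking_ft = v ** 2 / (2 * 0.8 * g)   # ≈ 300 ft

          # time for the rig to clear the lanes, assuming ~60 ft of roadway crossed at 10 mph
          trailer_v = 10 * MPH_TO_FTPS          # ≈ 14.7 ft/s
          clear_s = 60 / trailer_v              # ≈ 4.1 s

          print(f"{speed_mph} mph ≈ {v:.0f} ft/s, braking distance ≈ {braking_ft:.0f} ft")
          print(f"trailer clearing time ≈ {clear_s:.1f} s, car travels ≈ {v * clear_s:.0f} ft in that time")

      The numbers land close to the figures in the comment: roughly 125 ft/s, about 300 ft to stop, and around 500 ft covered while the trailer crosses.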

      • Bob_Wallace

        If that’s so then the car’s radar can’t see objects higher than the hood.
        Inadequate.

        • Bob_Wallace

          Camera sees objects higher than the hood.

          It reads highway signs. At least speed signs.

  • flashrob

    this AUTOPILOT stuff… is completely RIDICULOUS…

    ever notice “computer glitches” build up IN YOUR LAPTOP… eventually you have to defrag…reboot…

    the “bottom line”: NO SOFTWARE or computer IS GOING TO RUN FOREVER WITHOUT “MALFUNCTIONING” at some point…AND THAT’S NOT COUNTING… “THE USER CAUSING THE GLITCH OR PROBLEM”…

    this is a “whole bunch of hype nonsense” …FOR THE ENTIRE “AUTO INDUSTRY AND COMPUTER SOFTWARE INDUSTRY” TO EVEN START DOING THIS “BECAUSE OF PROFIT-POTENTIAL DRIVING THOSE TWO BIG INDUSTRIES…

    all you need is “sunspots” …some other big “electromagnetic hotspot”…and GLITCH goes the computer…

    it’s one thing if the “computerized processors” in a car that regulate things like “ignition” malfunction temporarily or get fried…

    but something LIKE “AUTOPILOT” DRIVING YOUR CAR AT SPEED…

    THIS IS ALL “SO DUMB”…cruise control IS BAD ENOUGH… I USED IT TWICE IN TEN YEARS…and it proved more of a distraction…

    plus the “litigation” figuring who/what/how much…the driver or the system was at fault…

    look at “plane crashes” like the Eastern Airlines L1011 crash in Florida in 1972… because the crew was “so engaged with a warning light” THEY FAILED TO NOTICE THE “AUTOPILOT DISENGAGED”

    SO…ALL THIS “MAKE IT EASIER STUFF” …is NOT A SOLUTION or the WAY TO GO…in all situations…

    especially…if the “average person” can’t drive …without AUTOPILOT… you gonna let them because they got autopilot… ARE THERE “continuous feedback road warning sensors” on our highways???

    when I first heard about AUTOPILOT technology…the thinking was… highways would have “sensors, monitors, feedback and CONTROL systems INTEGRATED with “some kind of AUTOPILOT” TECH in the car…

    all we have today IS SOME PRETTY SOPHISTICATED SOFTWARE… that is bound to fail… because “driver is going down a ONE WAY ST… the wrong way… DOES IT COVER THAT… can sensors read “someone running a red light”… or a child dashing out right in front of you FROM BETWEEN PARKED CARS…

    I can see “added safety features” like RANGE AUTO BRAKING…but THIS KIND OF STUFF… MEANS

    ALL THIS “AUTOPILOT STUFF” IS “ONLY” A “BACKUP” …not to be depended upon…

    all this “autopilot stuff” is ONLY THAT …AND WILL BE LIKELY THAT FOR DECADES YET…

    you don’t let CARS DRIVE THEMSELVES… even if you can “program one to do that” under IDEAL CONDITIONS…
    …one glitch in the software… one burned out circuit…or hotspot electromagnetic pulse…

    AND “WILD CARD” with or without driver…

    and HOW DO YOU LITIGATE SOMETHING LIKE THAT…could be in the courts for YEARS…in each case…

    so if the GUY GOT AUTO-BRAKING and the car shuts off when his hands release the steering wheel… like an “old person” having a stroke or heart attack…THAT’S FINE…

    IT’S AN “ADDITIONAL SAFETY FEATURE THAT CAN BE USEFUL”… BUT PEOPLE THINK THEY CAN PUT THE CAR ON “AUTOPILOT” AND TAKE A NAP…while behind the wheel…

    give me a break… ARE YOU PEOPLE GOING “LOONEY-TUNES!”

    • Bob_Wallace

      You might want to get your keyboard repaired and try retyping your rant….

  • ZMANMD

    Wow! A “driver” engages speed control software with lane keeping assist and collision “mitigation”. The software is engaged at well above the speed limit and he stops looking at the road to watch a movie. So somehow the car (and its manufacturer) is responsible for the resulting crash and death of said driver. A 40-year-old dying to see Harry Potter is bad enough. Making everyone and everything else responsible for his extreme lack of judgement is the definition of “ludicrous”. I am a licensed pilot and I watch the gauges and outside while autopilot is engaged. There are too many things that can go wrong that the software was not designed to account for. So how can this car, and by extension its manufacturer, be all-knowing and account for the truck turning across all lanes in the scenario mentioned above? The most likely scenario is the car was going too fast and the truck too close for even a human to prevent a crash. Also, everyone is assuming the auto-pilot was still engaged after the crash, but it was not. The proof is in the car veering off the road into a field, hitting a fence and a pole. Auto-pilot would have kept the car on the road in its lane. So either the driver hit the brakes just before impact and disengaged the auto-pilot, or the auto-pilot disengaged itself just before impact. Once it was disengaged and the crash occurred, the now-dead “driver” was back in charge, thus the car randomly coasting across a field and hitting a pole. Sure, a lawyer can argue the definition of “is” in court, but this is a clear case of driver error.

  • mkight

    I keep seeing the 130 million mile number that Tesla touts. Could someone tell me how many miles were driven at night? In heavy fog? Heavy rain? Snow? Ice storms? In rush hour traffic? How many of the 130 million miles were driven on safe driving courses during the middle of the day with near-perfect driving conditions?

    I love technology, but I had to restart my 9-month-old phone twice in the last week. I have had to do hard resets of my 11-month-old Windows 10 laptop 3 times in the past 2 weeks. Many people in this thread seem to think humans shouldn’t be driving cars because they aren’t reliable, yet these same people are willing to trust an autonomous driving car programmed by the same people.

    Lastly, given that this was Tesla’s fault, will the dead man’s family and the truck driver, as well as the man whose yard the car ended up in, be able to sue Tesla?

    • Joe Viocoe

      It is nowhere near “given” that this was Tesla’s fault. The driver is the first person responsible, for violating the terms. Next in line is the truck driver for failing to yield.

      Tesla DOES have the statistics of “How many miles”… Guess what, it is the SAME amount of miles a typical sedan driver would drive in fog, rain, snow, ice, and traffic.
      And ZERO MILES on some “safe driving course”.
      These are real world, typical miles driven.

      Congratulations on your phone. And your Windows laptop.
      Personal computers/smartphones aren’t designed with reliability as the top priority. They are designed for interoperability with applications from many other companies. You can install dozens of apps, and even have 3rd-party bloatware already installed. All this causes instability.
      In critical systems… they usually run straight Linux or Unix, don’t allow any 3rd-party software, and only perform a set task. These systems run missile defense, satellites, aircraft, etc. These are computers that don’t need to be rebooted for years sometimes, unless scheduled for updates.

      • Bob_Wallace

        The truck driver may or may not have made an illegal turn.

        The Tesla failed to stop, it apparently didn’t even attempt to stop. The Tesla collision avoidance system failed to detect the truck and respond. Let’s try to not find excuses for the collision avoidance system. It’s got at least one hole that needs to be fixed.

        • Joe Viocoe

          It certainly does need to be updated. But Tesla knows that it isn’t perfect, so it requires a driver to pay attention. Which is why Tesla is third in line of responsibility.
          It was posted elsewhere that Florida law requires vehicles to yield when making a left turn, putting the driver of the truck on the hook for a moving violation at the very least.

          • Bob_Wallace

            I don’t know if Tesla would be found partially responsible in court. They clearly inform users that they must keep an eye on the road.

        • Ken

          It says right in the instructions that the Tesla system may not stop for, or even see, certain objects in front of it, so the system worked exactly as stated.

          The main fault is the driver watching a movie while also speeding, both of which the Autopilot instructions clearly state the system was not designed for.

          • Bob_Wallace

            There are two issues here (for me).

            1) Who is responsible. Based on what I know right now I give both drivers a share of responsibility.

            2) How well does the Tesla collision avoidance system work. Not well enough.

          • Ken

            There are certain lighting situations where no system will see an object. The conditions are stated in the Tesla instructions. This, sadly, was one of them.

            There are many, many warnings. Here are just a few:

            “Warning: Do not depend on Traffic-Aware Cruise Control to adequately and appropriately slow down Model S. Always watch the road in front of you and stay prepared to brake at all times. Traffic-Aware Cruise Control does not eliminate the need to apply the brakes as needed, even at slow speeds.”

            “Warning: Traffic-Aware Cruise Control can not detect all objects and may not detect a stationary vehicle or other object in the lane of travel. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.”

            “Warning: Many unforeseen circumstances can impair the operation of Traffic-Aware Cruise Control. Always keep this in mind and remember that as a result, Traffic-Aware Cruise Control may not slow down or may brake or accelerate Model S inappropriately. Always drive attentively and be prepared to take immediate action.”

            There are many more but the above basically predicts this sad scenario if not followed. It’s very simple, you still have to always be ready to take over.

            I agree that the truck was in the wrong too but the Tesla driver was speeding and, most egregious, not even watching the road but a movie instead, so I think he gets most of the blame.

          • Outcast_Searcher

            So given all those warnings and limitations and human nature, why in the world did Tesla put a system clearly so far short of ready for safe autonomous driving:

            1) Into the hands of the public in BETA mode?

            2) Under the name “Autopilot”, which is a misleading marketing term, until it is certified safe?

            It will be interesting to see what happens when regulations come out about automated driving, and how much such things are limited. Oh, and Tesla hurrying to do this before regulations are even produced is a way to try to jump ahead of the competition, but a very bad way to demonstrate true concern for public safety.

          • Ken

            Wrong. It is extremely easy to follow the instructions, and auto-pilot is a vastly superior driving experience that has already proven to be twice as safe as a human driver.

            The amount of lives saved and fatalities already avoided has proven it was an excellent choice to roll this out.

            It is your idea that is very bad, because it would have resulted in more death and injury.

            Try and catch up.

          • Bob Smith

            Twice as safe as a human driver? B.S. Probably twice as safe as you driving. I say the claim of 130 MILLION miles without an accident is bullshit.

          • Bob_Wallace

            The data is there which makes your claim bullshit.

          • Bob Smith

            Show it. I bet that the majority of miles was under 30 mph. That does not prove that it is ready for use. To hang your hat on “130 million miles UNDER 30mph” makes you the fool.

          • Bob_Wallace

            I can’t show you the data. Tesla has the data.

            Tesla has reported that this is the first fatality after 130 million miles of autopilot. You are free to believe that or not.

            You are not free to call other people names on this site.

          • Bob_Wallace

            The data is there. 130 million miles without a fatal accident.

            IIRC there was a single minor fender bender that has been reported. A bus made an unexpected turn and the Tesla fender was damaged.

          • Ken

            Saying BS to actual data while showing none of your own proves you both ignorant and dishonest.

            Also, no one ever said that Teslas went 130 million miles without an accident, proving your reading comprehension is extremely low.

            You need to stop being so angry and panicked about new, superior tech.

          • Joe Viocoe

            “Not well enough.”

            Can you quantify how much would be enough?
            Most people cannot, and assume that anything less than perfect isn’t enough.

          • Bob_Wallace

            I understand data. I’ll take a system that’s 50% better than the option. But that’s not how a lot of people operate.

            “The Tesla smacked into a tractor trailer rig sitting in the middle of the highway! Ain’t no way I’m that bad a driver. Biggest thing I’ve hit lately is a mule.”

            Tesla’s full auto-driving system is not likely to be accepted until we see no more of this type of crash.

            A safe falls from an upper story, lands 5 feet in front of a Tesla, and the Tesla hits it.

            That sort of incident wouldn’t put a lot of people off the system. But a tractor trailer apparently not seen at all. (No indication that brakes were ever applied.)

          • Joe Viocoe

            Pragmatic. 50% is a fair number. I was at 30% myself.

            The “ain’t no way I’m that bad” is the problem with human ego. They never think it could happen to them, until it does.

            “Tesla’s full auto-driving system is not likely to be accepted until we see no more of this type of crash.”
            I disagree and agree.
            The media frenzy around Tesla’s 1st ____ will pass, as it always does. Then we won’t “see” this type of crash again, even though it will happen.

            I myself am holding on to pictures of a burned up Leaf that I saw towed in my neighborhood. Which is a remnant of the time when people thought EVs were more prone to fire. This too, shall pass.

          • nitpicker357

            That autopilot failed to recognize a hazard which would be obvious to any qualified, alert, human driver doesn’t say much about whether it is a more or less safe driver than an unaided human. It does indicate that it isn’t a perfectly safe driver, but I knew that. Didn’t you?
            Remember the old Apple “Think different” ad campaign? One thing to watch out for is that computer programs often think very differently than people (even their programmers).
            I read … somewhere … that the radar picked up the truck, but misclassified it as a sign above the road. Perhaps it relied on the cameras to calculate its actual height, and the cameras couldn’t find it at all?
            If Autopilot can be shown to be safer than the average driver (hint: it can’t, yet), then it is time to start requiring autonomous capability on all new vehicles, especially if autonomous capability is improving. It doesn’t matter if the mistakes made by autonomous vehicles aren’t the same mistakes a human driver would make.
            What I’ve read suggests that there is some low-hanging fruit for improving Autopilot, but that is based on a LOT of supposition.

          • Bob_Wallace

            “If Autopilot can be shown to be safer than the average driver, (hint: it can’t, yet)”

            We don’t know that. We’ve got one single death which is not enough data points to determine the safety level of the system.

            We could look at non-fatal accidents and get a better idea but we don’t have that data.

            Musk has said that the system was twice as good as humans (something like that) but I’m not sure what data he’s using.

            Fact is, this hole in the system might be the only hole, period. Fix this one and there might never be another fatality while the system was driving. Or there could be lots of yet to be discovered holes.

            There’s a whole bunch of stuff we don’t yet know. We’re (some of us anyway) giving the truck driver partial blame for making the turn across traffic. But suppose the Tesla had been in the far right lane and coming up very fast behind a larger vehicle traveling at a lower speed. The truck driver starts his turn based on his (correct) judgement that he’s leaving plenty of room for the vehicle he can see. He gets part way through the turn, the Tesla driver flicks the turn signal, goes back to his movie, and the Tesla quickly swings around the large, slower vehicle and on into the truck.
            Not saying that happened. Have no idea. Just suggesting that we don’t know enough to really figure out what happened.

          • nitpicker357

            “We’ve got one single death which is not enough data points to determine the safety level of the system.”
            That was my main point. Are we agreeing? If Autopilot had logged 10 times the average miles/fatality with only one fatality, it would be highly unlikely that it was as bad as the average driver.
            I don’t think Autopilot has the same use profile as a typical driver; it certainly doesn’t if people have been following the instructions in the manual. That may warp the statistics.
            My secondary point, restated, is that Autopilot’s accident profile may be very different from a human driver’s accident profile. It could handle situations it assesses correctly with machine-like perfection, but seem utterly blind in situations it assesses incorrectly. This sad incident seems a good example of that blindness.

          • WuestenBlitz

            And the collision avoidance system inherent in my grey matter will also fail, whether it be my toe against a random chair or curb, or (hopefully not) me vs. a semi truck again (once was enough for me!). Autonomy while driving is about a system that is safer than my own. Currently the BETA for Tesla is supposed to be a supplement to the driver, an assist mechanism just as cruise control is. I, the driver, am the final failsafe in the vehicle. Both machines in the accident were guided there by their drivers, and both drivers made a mistake that cost a life. It was not the hardware or the software in their vehicles that caused each of them to abuse the intended purpose of their vehicle and how it should safely be driven.

          • Bob_Wallace

            The Tesla collision avoidance system did not detect the tractor trailer truck that was turning in front of it.

            That has to be fixed.

          • WuestenBlitz

            Yep, you can be assured that Tesla is working on that. Point is that our extremely spatially aware and overly paranoid brains in all their glory still make mistakes. The Tesla sensors missed the truck, but so did the sensors on the exponentially more powerful human brain. There will always be crazy situations that are just bug house nutz, one-in-a-trillion happenings. There will be failure, so to set the expectation of perfection on yourself, others, or any software is just ludicrous! (BA da Tsch!)

          • Bob_Wallace

            Human brain sensors seem to have been aimed elsewhere.

            You really think a person would not have seen a tractor trailer across the road under those visibility conditions?

          • WuestenBlitz

            Yes, because that’s exactly what the data is showing. It is speculation that he was watching a movie and not the road. The fact is neither the Tesla nor the driver reacted to the semi pulling in front of the vehicle. What will be important is to know the speed of the Tesla, the sun’s location in relation to the car, and the truck’s visible location to the driver. I have had vehicles and pedestrians suddenly pop out from behind pillar blind spots before because their speed and orientation to mine kept them hidden until the last moment. As for Autopilot, I would bet that the speed of the Tesla was just right for the radar and camera to “not see” the cab of the semi as it crossed the road, nor see the rear tires of the trailer either. It will be interesting to find out just how perfect the timing had to have been to fool the driver and Autopilot. It’s just the law of large numbers. The more time you have, the more likely the unlikely becomes.

          • Bob Smith

            Your brain might make mistakes, mine doesn’t. You should recuse yourself from further driving. Also, please stop texting while driving.

          • Bob Smith

            Then whoever offered this for commercial use is responsible.

          • Bob_Wallace

            No. Tesla was very clear that this system is not an autonomous driving system. Drivers are clearly instructed to keep their eyes on the road. There is no guarantee that the system will detect every potential danger.

          • Ken

            Wrong. All cars’ intended uses are flagrantly violated by owners. Speeding, driving drunk. The car makers are not held responsible for idiot behavior.

            Try and catch up.

      • nitpicker357

        Do you have a citation for that? It is surprising, given the emphasis Tesla gives to the notion that Autopilot is only supposed to be used on highways and the like.

    • Ken

      Wrong. This wasn’t in any way Tesla’s fault just like idiot behavior and flagrantly violating instructions is not the fault of other car companies.

      • Bob Smith

        You are confused. He was relying on “self driving software” that was designed to purposely ignore the speed limit which somehow appears on GPS, yet was ignored by the software designed to pilot the car. I would say that this goes way past negligence.

        • Bob_Wallace

          You don’t have your facts straight. That makes your opinion invalid.

        • Ken

          Completely wrong. I am confused about nothing. The driver completely chose and controlled the maximum speed – not the car. That makes the driver at fault.

          The driver also flagrantly violated instructions by not watching the road but a movie instead. That makes the driver at fault.

          Try and catch up.

    • neroden

      I have to do hard reboots of my Tesla Model S roughly weekly.

  • Richard Frank

    They should not be allowed to use the term “auto-pilot”. That term means to most people what it sounds like: go to sleep and let the car take over. Even if they put up a warning, people will treat that like the software license agreements no one ever reads. Stop calling it “auto-pilot”. It’s not a commercial jet.

    • Ken

      Wrong. Auto-pilots on jets require constant oversight exactly like the Tesla so “auto-pilot” is a completely accurate term.

      • Richard Frank

        Okay, then public perception (like mine) is that that is what it means, and since most people aren’t trained pilots they won’t understand the specific limitations. Hence it should be called something else.

        • Outcast_Searcher

          Obviously. Frank is clearly showing Tesla Fanboi bias when you look at his posts overall.

        • Ken

          Wrong. It is up to any member of the public to learn what words actually mean. It is also required for people to follow instructions and warnings clearly and repeatedly given by the car itself and the manual.

          There will always be stupid people doing stupid things. That will never change.

        • Joe Viocoe

          Um… the distinction is made when the driver tries to enable the system… and is told how it works. And is told not to let their attention wander.

          Calling it something different isn’t going to fix stupid.

      • Bob Smith

        Aircraft don’t operate in close proximity to other aircraft and are under constant surveillance by ATC. Don’t compare apples with oranges.

        • Ken

          Wrong. Since aircraft travel at much higher speeds, the reaction times for problems need to be very similar.

          The Tesla auto-pilot follows the definition of auto-pilot proving you are wrong – not Tesla.

          Stop trying to talk about things you clearly have zero understanding of and clearly also a huge, panicked fear of.

  • Jakob Stagg

    What’s scary is the predictions that autonomous cars will be forced on us. Then they can’t decide if we will be allowed to drive them or not. All I can say is the whole friggen thing should blow up so the victim won’t have to suffer when something goes wrong. Imagine being trapped in this rolling coffin as you see that you are about to die. Fun!

    • Joe Viocoe

      Horrible comment… and just a “they’re coming for us” conspiracy theory.

      • Shane 2

        Hussein is coming for our guns and is going to make us live in hobbit homes before he reveals himself to the anti-Christ!

      • Jakob Stagg

        I am sharing thoughts from the mindless patter of those who are talking about autonomous vehicles. As far as conspiracy goes, each area where this development is occurring is predicting different ideas. All are sufficient to show there is no thought going on in the process. The problems in technology, law, and insurance are good indicators of how bad the idea is.

        • Ken

          It is you that has the “mindless patter”.

          Humans are terrible drivers. There is not a single credible source anywhere saying self-driving cars will not be better and safer.

          There are currently 33,000 deaths a year from traffic accidents. There is no credible source saying that self-driving cars will not cut that number to a tiny fraction of that.

          Try and catch up.

          • Jakob Stagg

            I agree humans are terrible drivers. The greatest problem is humans are poor learners. They would rather die than think or learn skills.

            Auto-related deaths have increased in recent years. Vehicles have become distraction devices.

    • Ken

      You are already much more likely to die from human drivers than this auto-pilot system.

      • Jakob Stagg

        Not really. The only way anyone will get me in one of those things is if it arrives to haul my remains to the crematorium. Nobody will be listening to my objections then.

        • Joe Viocoe

          Please… nobody is “coming for you”. Meanwhile, you have no regard for the 30,000 families that are affected by humans killing and dying each year. You need to get over your technophobia and realize that the problem is us.

          • Jakob Stagg

            Coming for me? Why would they? The government enables carnage; it does not provide solutions to prevent it.

            Technophobia? I only embrace things that actually work.

        • Ken

          Just saying “not really” is a total fail.

          The data proves you completely wrong.

          Stop panicking and be so afraid of something that is much safer than humans ever will be.

          • Jakob Stagg

            Data proves nothing. There is not enough to analyze. Plus, when honest statistical analysis actually occurs, its conclusions will be shaped by the people who paid for it.

      • Patrick Killian

        There really isn’t enough data to make this claim, especially since the numbers on Autopilot (which is mainly used on divided highways) are compared against the numbers for all driving scenarios; the samples aren’t representative of the same thing, so comparing the numbers is fairly meaningless.

        • Ken

          Wrong. There is already much more data on auto-pilot than insurance companies use to accurately predict the safety of new cars and systems every day.

          There was actually much less data used to predict the vast superiority of Teslas with regard to fire danger years ago and that data proved completely true.

          • Patrick Killian

            Safety of new cars in crashes is far different than evaluating the abilities of Autopilot. I believe Autopilot is likely safer than a manual driver under many conditions, but there is no statistically significant data to support this, no matter how much you repeat the same quote about it being “more data than insurance companies use”. The fact of the matter is Tesla is comparing its data to data that represents something different, so they can’t be accurately compared. That’s just plain and simple math, and you can’t ignore it with a vague statement about the data being more than insurance companies use and therefore somehow relevant.

          • Ken

            Wrong. 130 million miles actual and almost a billion miles passive is more than enough data to make the claim Tesla is making.

            Tesla is a credible source.

            You are making claims with zero evidence and clearly are not a credible source so no matter how many times you simply try to make things up, you will fail.

          • Bob_Wallace

            No, Ken. The one fatality per 130 million miles for Tesla is a single data point. We need many more data points in order to determine a real mean.
            And these 130 million miles were likely mostly divided highway miles (those are Tesla’s instructions as to when to deploy). The one fatality per 90 million miles for other cars includes a lot of miles driven on roads other than divided highways.
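
            A back-of-the-envelope Poisson check makes the point concrete (a sketch only, treating fatalities as rare independent events and taking the thread’s one-per-90-million-mile figure at face value):

                import math

                autopilot_miles = 130e6
                baseline_rate = 1 / 90e6   # overall US fatality rate per mile, as cited in this thread

                expected = autopilot_miles * baseline_rate            # ≈ 1.44 expected fatalities
                p_at_most_one = math.exp(-expected) * (1 + expected)  # Poisson P(X <= 1)

                print(f"expected fatalities at the baseline rate: {expected:.2f}")
                print(f"chance of seeing 0 or 1 fatality anyway:  {p_at_most_one:.2f}")  # ≈ 0.58

            Seeing a single fatality in 130 million miles is something that would happen more than half the time even if Autopilot were no safer than the overall average, which is why one data point cannot settle the comparison either way.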

          • Ken

            Tesla and the entire insurance industry disagrees with you. They are the experts here.

            The amount of data already compiled by Tesla is much, much more than the insurance industry uses to assess risk for new cars and systems, and the insurance industry is very successful at this and, hence, one of the most profitable businesses in the world.

            Much less data was used to prove the Tesla’s vastly superior fire safety over ICE cars years ago and it has shown to be completely accurate even though I went through many arguments that “there wasn’t enough data”.

            As Musk has said, there is no question here if you do the math.

            “If anyone bothered to do the math (obviously, you did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”

            I’m gonna have to go with experts on this one.

          • Bob_Wallace

            Ken, before telling me that the insurance industry disagrees with me find some data. What we have right now is Ken disagreeing with me and I suspect there are things he doesn’t know about statistics.

            Do this, Ken, find the number of fatalities per million miles on divided highways that are not due to driver drinking, texting, etc. and then we can start to compare the effectiveness of Tesla’s collision avoidance system vs. unimpaired human drivers.

            It makes no sense to compare one database that includes driving on more dangerous roads and impaired humans with another database that includes no impaired humans and only the safest of roads, if we’re trying to answer the question of whether the Tesla system is best at spotting problems like trailers across the lane.

          • Ken

            Since you are the one disagreeing with Tesla, I think you should find the data you think proves them wrong or dishonest.

            That would be a pretty big deal if they were, don’t you think?

            They are not saying anything that “makes no sense”.

            Also, if you read the Tesla forums, you will see many people use auto-pilot on other than divided roads – including me.

            You should also understand that one death in 130 million miles is not a single data point. There were many possibly fatal accidents avoided within that 130 million miles which were also data points.

            The statement is that auto-pilot, as is being used in the real world right now, is safer than a human driver and that is exactly what the data proves.

            In fact, if you eliminate flagrant misuse of the system, there have been exactly zero deaths involving auto-pilot.

            As for the insurance industry, they set insurance rates on Teslas with much, much less than 130 million miles of data. That is a simple fact. They would not be able to do that unless they could make accurate predictions with many fewer miles driven. It is done all the time.

          • Bob_Wallace

            I’m not disagreeing with Tesla, Ken. I’m pointing out that you are misusing Tesla’s data.

            ” many possibly fatal accidents avoided within that 130 million miles which were also data points”

            We don’t have that data. We also don’t know how many close calls human drivers kept from turning into fatalities.

            “As for the insurance industry, they set insurance rates on Teslas with much, much less than 130 million miles of data.”

            Is there a cheaper rate for Teslas that have the autopilot system? I’ve never heard of an autopilot discount.

          • Ken

            If you are not disagreeing with Tesla, then you are not disagreeing with me.

            Tesla and Musk have clearly said that auto-pilot has already proved to be safer than a human driver, exactly as I have been saying.

          • Bob_Wallace

            Ken, you are not grasping the issues.

            But let’s do this. Let’s declare you the winner of this contest you’ve created so that I can quit trying to explain the subtleties of the discussion to you. You’re wasting my time.

          • Ken

            I don’t consider it a contest. We agree on most things and I enjoy talking with you. Your responses are well thought out and I understand what you are trying to say.

            I am just pointing out, that when I post about auto-pilot, I am simply agreeing with Tesla.

            No worries.

          • Patrick

            Bob I did the same- Ken doesn’t seem to understand that the claim that Tesla is making about auto pilot being safer simply isn’t backed up by statistically significant data and believes that because “Musk said it” that makes it true. I tried my best to point out the logical fallacies but got told instead that I don’t understand and “Tesla said it” so unless I can prove they are lying I am obviously the one in the wrong. I give up =P

          • Ken

            Wrong. It was actually you who has the logical fallacies.

            You are completely wrong about how Tesla uses passive data. It shows what auto-pilot would or wouldn’t do in all driving situations. Try doing some research before making statements about things you don’t know about or understand.

            You are also wrong that it is because Musk said it that makes it true, it’s because I hold multiple degrees in physics and economics and have dealt with insurance probabilities on new system designs.

            I guess you need to contact Musk and explain he doesn’t understand his own data, if you can catch him in between space launches.

          • Bob_Wallace

            Ken, can you explain how one can meaningfully compare these two numbers? Or can you tell us why they can’t be compared?

            1) Tesla has recorded one fatality in 130 million miles of autopilot assisted driving.

            2) Non-Tesla cars in the US average one fatality in 90 million miles of driving.

          • Ken

            Again, I point to Musk’s own statement:

            Musk: “If anyone bothered to do the math (obviously, Fortune did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”

            If you think he is wrong or dishonestly misrepresenting something you should show how.

            Since Musk has seen close to 1 billion miles of Tesla auto-pilot data both active and passive, I happen to agree with him.

            Musk is able to compare accident and injury rates between Teslas using auto-pilot and Teslas not using auto-pilot.

            And because this death was caused by not using the system correctly, it was not even the system’s fault.

            Again, Musk made the same claims about Teslas being much safer than gasoline cars in terms of fire danger when many (wrongly) claimed that there was not enough data – and Musk was proved right, though actually quite conservative, in the Tesla’s superiority.

          • Bob_Wallace

            Thanks, Ken.

            Now that I’m certain you don’t understand the issue I can ignore you and move on.

          • Ken

            There is no need to be insulting, Bob.

            You are accusing Elon Musk of not understanding the issue about his own car – not just me.

            I and many people happen to think Musk is quite intelligent so it could very well be you that doesn’t understand.

            You have yet to show a convincing argument that shows Musk is wrong.

          • Patrick Killian

            Tesla has plenty of data on its own cars. The issue is that the comparison to non-autopiloted cars is not an equal comparison. Just because you don’t understand basic logic and statistics doesn’t make me wrong. Passive data gives us zero info about the safety of Autopilot because it isn’t in use, so that isn’t useful in this comparison. Tesla says from their Autopilot data, they’ve seen 1 fatality. However, Autopilot is primarily used in situations where the chance of fatality is fairly low. Tesla then compares their Autopilot data against data from ALL miles driven in all types of situations (bad weather, unlit back roads, etc.). For a fair and meaningful comparison, the data has to represent the same thing. Tesla’s comparison would be valid if they had statistics for average cars under the same conditions (driving on divided highways in good weather). However, since the 2 data sets are from vastly different situations, the comparison is meaningless. It would be like saying that a shotgun is “more accurate” than a handgun because it hits its target more often: since shotguns are often used to fire birdshot, they are more likely to hit the intended target, so the comparison doesn’t make sense. Same situation when Tesla compares its Autopilot data to data from the NHTSA (which encompasses all weather conditions, all types of roads, etc.).
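
            A small numerical illustration of that road-mix problem (all of the rates and the highway share below are made-up numbers, chosen only to show the effect):

                # hypothetical human fatality rates, per mile
                highway_rate = 1 / 180e6     # divided highways (assumed to be safer)
                other_rate   = 1 / 60e6      # everything else
                highway_share = 0.30         # assumed share of all human miles on divided highways

                blended = highway_share * highway_rate + (1 - highway_share) * other_rate
                print(f"human rate, all roads blended:     1 per {1 / blended / 1e6:.0f} million miles")
                print(f"human rate, divided highways only: 1 per {1 / highway_rate / 1e6:.0f} million miles")

            With these invented numbers the blended figure comes out near one per 75 million miles while the highway-only figure is one per 180 million miles, so a system used almost exclusively on divided highways could beat the blended average without being any better than a human on the roads where it actually runs.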

          • Ken

            Wrong. It is you that doesn’t understand basic logic and statistics as Musk points out.

            He has some good advice for you: “If anyone bothered to do the math (obviously, Fortune did not) they would realize that of the over 1M auto deaths per year worldwide, approximately half a million people would have been saved if the Tesla autopilot was universally available. Please, take 5 mins and do the bloody math before you write an article that misleads the public.”

            Who to believe? Not really a contest, is it?

    • Outcast_Searcher

      Hint: No way will they be forced on anyone until they have an established safety record that makes humans look like maniacs. Clearly there is a lot of work to do in both the hardware and software before such systems are anything approaching such safety.

      Credible estimates are talking decades before such cars are a significant minority of those on the road — much less being forced on all of us.

      • Jakob Stagg

        Hint: Government brings about change in only two ways: Threats or force. The pressure is on. There’s nothing but opportunities to make money, tax things, or infinitely regulate and control.

        • Joe Viocoe

          Please shut up with your anti-government rhetoric. The Internet is for people who realize the government helped to build it.

          • Jakob Stagg

            Governments do not help anything or anyone. They have hindered progress for millennia.

          • Bob_Wallace

            Jakob, please take your stupid statements elsewhere.

          • Joe Viocoe

            Please return to your Trump rally.

  • Shane 2

    “The reports below indicate that he wasn’t working, but rather watching Harry Potter. ”
    Given the level of destruction, how would anyone know this sort of detail? My BS detector is sounding.

    • Joe Viocoe

      Read the article before you consult your BS detector. The movie was still playing when the truck driver went to check if the Tesla driver was alive.

    • Ken

      Try reading more carefully. The movie was still playing after the accident.

      • dogphlap dogphlap

        No, the truck driver assumed that from the audio he claims to have heard. It is known the Tesla driver listened to podcasts and audio books; it could have been one of those and no more dangerous than listening to the radio (or it could have been a full video-plus-audio DVD he was watching, we don’t really know). Fact is we don’t know all the facts yet. Still, my guess is that whatever the Tesla driver was doing, he was not adequately concentrated on the view through the windshield, and that sadly is the reason he died. I don’t care what he was up to except that he was not monitoring his vehicle’s progress.
        I have a Model S and it is true that at first you do tend to be lulled into overconfidence in auto-pilot. But auto-pilot is subject to error; for instance, when cresting a rise with the sun not too far above the horizon it can do strange things like dropping the speed of the car as the traffic-aware cruise control drops out. Once (only once, that was enough to change my behaviour when it comes to trusting auto-pilot) the car took a violent lunge towards the oncoming traffic, but as luck would have it I had my hands on the wheel, so I violently turned the wheel away from the approaching vehicles. Of course, if I had observed Tesla’s instructions I would not have used auto-steer on a road without a centre divider in the first place. Anyway, I now make sure not to take my hands off the wheel when using auto-steer and I make sure to stay alert for problems. There is nothing like a near-death experience to wake up your ideas.

        • Ken

          No, they found the player and he was watching a movie.

          I have auto-pilot too and it is clearly safer than a human driver when used as intended and as long as you follow the instructions.

          And, yes, it is not a good idea to drive with on-coming traffic right next to you at this point.

          • dogphlap dogphlap

            I know they found the player; where is your source that he was watching a movie? Not that it matters, since he does not appear to have been watching the road.

        • J999

          Maybe the autopilot should be programmed to give the driver a near death experience every so often, just to keep them alert? I’m kind of half joking half serious here. In any situation where you’re just checking what someone else is doing or has done, which is basically the case if you’re driving with autopilot, it’s almost impossible to stay alert for very long if the person you’re checking appears not to be making any mistakes. For example, if checking over someone’s arithmetic, if they’ve done 1,000 calculations without error, you’ll find it almost impossible not to fall half asleep as you keep checking, whereas if they had introduced a deliberate error into say about every 50th calculation, that would keep you alert.

          • Mike

            One of the things we used to deal with (when I was in the RCAF until 10 years ago), known as “Human Performance factors in Military Aviation” (HPMA), was how poorly humans monitor automatic systems.
            Certain routine hourly tasks were injected into the cockpit procedures to deal with this weakness.

  • Josh Marquez

    What kind of tech-savvy guy uses a DVD player? Even if everything else may be plausible, the DVD bit… he might as well have been looking at his pager.

    • Bob_Wallace

      Perhaps it belonged to his kid? Perhaps it was actually a laptop rather than a dedicated DVD player. Facts are not well known at this point in time.

    • John Edward Azhderian

      I prefer VCR’s and vinyl records!

    • Bob Smith

      What kind of “tech savvy” person watches Harry Potter?

      • Bob_Wallace

        One who enjoys the movie?

  • Jenny Sommer

    It’s a little strange that the car was speeding autonomously.
    Even basic navigation devices warn you when you exceed the speed limit.

  • mikgigs

    How could a person be “apparently speeding” in Autopilot mode? Is that some kind of joke? Autopilot is over-speeding??!? Has the Tesla company allowed such an apparent bug? Bullshit; otherwise this company is dead from this day on.

    • Bob_Wallace

      Julian has said that Tesla’s Autopilot doesn’t operate if the car’s speed is more than five miles per hour above the speed limit.

      Another driver reported that the Tesla passed them right before the wreck while they were driving 85.

      There’s some doubt about the facts here. Bad report by the witness? Auto-pilot system hacked?

      • mikgigs

        You can deliberately hack autopilot software to allow over-speeding? Nah… people still have the instinct of self-preservation. Something is fishy about this publication; it sounds like preliminary gossip delivered with pseudo-expert confidence.

        • Bob_Wallace

          Apparently the autopilot system will allow speeding on highways.

        • Ken

          Wrong, you are just ignorant.

          Tesla restricts the speed to just 5 miles above the limit on non-divided roads but it is not restricted on divided highways like the one in question.

      • Ken

        Here is the actual info: Tesla restricts the speed to just 5 miles above the limit on non-divided roads but it is not restricted on divided highways like the one in question.

        • Bob Smith

          Then tesla consciously created a system that violates the law.

          • Bob_Wallace

            A system that does not prevent drivers from violating the law.

            The Tesla system did not decide on the car’s speed.

          • Ken

            Completely wrong again. It is the user that chooses to violate the law, not the company. The user must input the speed exactly the same way that a user chooses to violate the law by pushing the accelerator down, causing the car to speed. They are just two different ways of controlling the speed but both are 100% chosen by the driver.

            Auto-pilot must be able to drive at the same speed as the flow of traffic, which is often 10-20 miles above a posted speed limit. If it can’t, the slow-driving car becomes a dangerous hazard. Cars traveling too slowly have been proven to be as dangerous as speeding cars. So Tesla designed the system exactly right.

            All car companies design cars that can violate the law if the user chooses.

            Try and catch up and stop panicking about tech you clearly don’t understand.

      • Outcast_Searcher

        Read the thread please. It has been repeatedly stated by Tesla drivers that Autopilot does NOT enforce a speed limit on a divided highway. (I think this is a very bad policy by Tesla).

        • Bob_Wallace

          Read the thread please. Look at the time stamp on the comment you reference and look at my later comments which acknowledge the no-speed limit on highway bit.

        • Ken

          It is you that has not read clearly. Try and catch up.

    • mikgigs

      Moreover, there are unregulated crossing left turns allowed in the US on roads where the speed is 85 mph??!??! Is that another joke?

      • Bob_Wallace

        I seriously doubt the speed limit was 85. The legal maximum in Florida is 70 MPH.

      • Jeremy Friesner

        Florida is a kind of joke, yes. (To be fair, the landscape there is really flat, so at least you have good visibility most of the time. Still, making a left-hand turn across a freeway is always going to be a hazardous thing to do)

      • Kevin McKinney

        Of course not. The witness said she was doing 85, but nobody is claiming that that was under the speed limit.

        As to the existence of such roads, yes, they are ‘a thing.’ They aren’t even all that uncommon; there are several fairly near Atlanta, where I live. The speed limit is generally 55, but most drivers (I would estimate) travel at somewhere between 60-80.

    • Ken

      No, it is not a joke. You can set the speed above the limit on auto-pilot on divided roads because people regularly travel above the speed limit on these roads, and it would actually be frustrating and even dangerous not to be able to keep up with the flow of traffic.

      The Tesla system has already proven to be safer than a human driver and people love auto-pilot so the company is not “dead” in any way.

      • nitpicker357

        No, it has NOT proven to be safer than a human driver. It may or may not be safer than the average human driver.

        • Joe Viocoe

          Certainly not “proven”. But the trend so far is promising, and the potential is certainly there.
          It isn’t hard to beat 30,000 deaths per year in the US.
          But yeah, we need more real-world data to officially prove it.
          Which is why I favor keeping the beta. Machine learning is the fastest way to get there.

        • Ken

          Wrong. Elon Musk clearly said the data showed exactly what I said.

          Unless you can show data that shows differently, you’re just making things up.

      • Bob Smith

        Show me the amount of time the system has been used at over 30 mph.

        • Ken

          The system is mostly used on divided highways with high speed limits because that is where it is specifically recommended for use.

          You keep showing yourself to be completely ignorant of this tech.

          You should stop talking about it and actually do some research first, if you ever hope to have any credibility.

  • tychoa

    Human response time is too slow to respond in an emergency. In one test it took humans up to 16 seconds to refocus attention, evaluate, and take action. Autopilot malfunctions in airplanes are too often tragic because the transition from machine control to human control is slow and subject to errors, even though pilots are continuously training.

    • Jenny Sommer

      But a human driver could have seen the white truck. Apparently you really need LIDAR and more advanced sensors.
      That’s what I have heard from a friend implementing Mobileye/Nvidia hardware for Audi.
      Such accidents are what German automakers fear.

      I don’t know if the backlash would be even greater if the car involved was a non-US brand…

      • Bob_Wallace

        I read one comment that the car’s radar was aimed too low to pick up the trailer and the system deferred to the camera.

        We need more info…..

        Actually I suspect the backlash will be greatest because it’s Tesla. Between the people who are brand loyal and anti-EV there are lots who will not hesitate to take a swipe at Tesla.

        • Jenny Sommer

          But imagine it was an Audi… a German car killing a former US Navy SEAL.

          • Bob_Wallace

            How about taking your “pride of place” out of your comments.

          • Jenny Sommer

            I am merely repeating the sentiment of the German press.

          • Shane 2

            Tesla gets plenty of hate from the right wing and gasoline lovers in the US. That includes US media organisations like Fox news.

      • Anti Lord Kelvin

        Well, but in Germany, like in all of Europe, trailers have lateral metallic protection under the trailer to protect cars from going under it (effective up to 45 mph if I remember accurately). In this case it would not have saved the life of the driver, but it would have helped the radar of the Tesla detect the trailer instead of detecting nothing. As always in transport accidents, whether planes, trains, or cars, fatal accidents are not the result of a single thing. Here we have a perfect storm: a truck turning and crossing a highway (we don’t know for sure whether the truck driver didn’t see the car because it was coming fast, or saw it but was impatient and counted on the car slowing down, making the crossing the way I have seen some trucks do in front of me and as we can see in lots of YouTube videos, mostly from Russia thanks to dash cams being mandated by insurers, of trucks deciding to cross even with a car already in the intersection). The perfect storm continues with a possibly distracted car driver, a speed perhaps above the limit, Autopilot not recognising the trailer as an obstacle, the trailer not having the lateral protective metal bars, and so on and so on.

        • Bob_Wallace

          Radar doesn’t see colors. It bounces off a surface.

          In this case the radar seems to have not returned a signal from an object that was hood-height. Or the camera saw nothing and the radar info was overridden.

  • Cole Hyntermeister

    I don’t get why people are so up in arms about this. Oh no, one person died within 130 THOUSAND miles of driving and use. Insensitive? Sure. But statistically driving with autopilot on is safer than manual driving. Autopilot isn’t over, it’s not “the end of automated driving” because people die in cars MUCH more often when they’re in control. It wasn’t even the driver’s fault, it was the semi’s fault. The autopilot still malfunctioned and didn’t sense it coming, but if the semi hadn’t pulled out where it did he would still be alive. Including Google’s self driving cars, the number of crashes that are the autopilot’s fault is still less than 10. Every other crash is another human’s fault for running into it.

    Again, as insensitive as it sounds… One death per 130k miles is pretty dang good. This is (I believe) the first death involving a self driving car. I don’t get why people are so up in arms about this.

    • If people died for every 130,000 miles of driving, that would be really really bad. Like I have escaped statistical odds of death several times already.

      I believe your statistics are off by several orders of magnitude.

      There have been 130 million miles of autonomous driving in Teslas, with so far just 1 fatality.

      • Bob_Wallace

        I think the fatality rate for US drivers is one per 90 million miles driven.

        There is not enough data for us to tell if the Tesla system is better or worse than humans. We might need 50 billion miles of Tesla Autopilot driving to make a statistically meaningful statement.
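        A rough way to see why a single data point settles so little (an editorial sketch in Python, assuming SciPy is available; the 130 million mile figure is the one quoted in this thread): the exact Poisson 95% confidence interval around one observed fatality is enormously wide.

          # A minimal sketch of why one fatality in 130 million miles is not enough data:
          # the exact (Garwood) Poisson 95% confidence interval around a single event
          # spans a huge range of underlying fatality rates.
          from scipy.stats import chi2

          miles = 130e6          # Autopilot miles reported at the time
          fatalities = 1         # observed events

          # Exact 95% CI for a Poisson count of k events
          lower = 0.5 * chi2.ppf(0.025, 2 * fatalities)
          upper = 0.5 * chi2.ppf(0.975, 2 * (fatalities + 1))

          # Convert to "one fatality per N miles"
          print(f"best estimate : one per {miles / fatalities:,.0f} miles")
          print(f"95% interval  : one per {miles / upper:,.0f} to one per {miles / lower:,.0f} miles")

        The interval runs from roughly one per 23 million miles to one per 5 billion miles, so it comfortably contains the human baseline of about one per 90 million; on this data alone, better and worse are both plausible.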

        • Julian Cox

          Not really. Autopilot in background mode has logged a lot of accidents it could have prevented. It has also logged a lot of interventions that saved the day while activated.

          • Bob_Wallace

            One per 90 and one per 130 is not sufficient data, especially since we have only one “130” data point.

            You are now bringing in a different data set.

        • Jenny Sommer

          Also you would only compare the rate of accidents in highway driving.
          At least where I live people die on small roads hitting trees more often than on the Autobahn/Freeway (don’t know if this is the right translation for Schnellstraße…one below Autobahn)

          • neroden

            This crash was not on an expressway. This is a four-lane divided road, but it has intersections every hundred feet or so. Really dangerous road design.

        • Ken

          I would like to see where you are getting those numbers. The insurance industry accurately predicts the safety of cars with much less data than that.

          This is very similar to Tesla fires. The early data set from very few miles and very few cars proved quite predictive of the future. (of course Tesla improved the cars based on rare occurrences and the same will happen here)

          • Bob_Wallace

            “According to the National Highway Traffic Safety Administration, more than 90% of automobile crashes are caused by human errors such as driving too fast, as well as alcohol impairment, distraction and fatigue.”

            http://www.greencarcongress.com/2016/04/20160413-rand.html

          • Ken

            Thanks but I was actually asking about how you came to the conclusion that much more data is needed (50 billion miles) when insurance companies very successfully set rates with much less data.

          • Bob_Wallace

            We’ve got billions of miles of driver data. The rate of driver caused fatalities (aside from drunk/drowsy/texting) is about 10% of the total number of fatalities. 10% of one per 90 million miles. One per roughly 900 million miles.

            We’ve got one data point for Tesla. Was there a long honeymoon period for the Tesla system in which drivers were not trusting the system and paying more attention? We need more time and more miles before we have a reliable sample. 50 billion is just a way of saying “big number”.

          • Ken

            New cars like Tesla and new models from other car makers don’t get billions of miles of data before insurance companies can make quite accurate predictions about safety.

            I see what you are saying about people starting to over-trust the system but this actually also works as a wake-up call to those drivers as well as new ones.

            And Tesla will not be standing still in improving the system.

          • neroden

            This automobile crash also seems to have been caused by driving too fast, and distraction.

    • Julian Cox

      That was 130 MILLION miles of driving – and the driver died because a truck pulled directly across a highway in front of him.

      • Bob_Wallace

        (Cut the all-caps. If you want to emphasize, use strong and /strong in between.)

        There’s not enough data to make a meaningful statement. Let’s say we had 50 billion miles of data that finds one Tesla fatality per 130 million miles. That could actually mean that the Tesla system is much worse than human drivers.

        The fatality rate for human drivers is one per 90 million. But 90% of human fatalities are due to drinking, distracted driving sorts of stuff. That means that the fatalities for other reasons (failure to see, bad judgement) might be in the one in 900 million mile range. At one per 130 million, Tesla’s system could be a massive failure.
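        The arithmetic behind that comparison, sketched in Python (the 90% share is the figure quoted in this thread, not an audited statistic):

          human_rate = 1 / 90e6          # all human-driver fatalities, per mile
          non_impaired_share = 0.10      # share not attributed to drunk/drowsy/texting drivers
          non_impaired_rate = human_rate * non_impaired_share

          print(f"one per {1 / non_impaired_rate:,.0f} miles")   # ~ one per 900 million miles
          print((1 / 130e6) > non_impaired_rate)                 # True: one per 130M is a higher rate than that baseline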

        • Ken

          Extremely unlikely. Since this was caused by someone flagrantly misusing the system, the system was not even at fault here. It operated exactly as set forth in the manual.

          • Bob_Wallace

            The system failed to detect a target that it should have detected.

            Tesla is not at fault, IMO, as they instructed drivers to continually observe the road.

            (There may be a small case for a bit of guilt for Tesla based on whether or not they should have released the system knowing that it was likely to be abused.)

          • Ken

            That is always a problem because people abuse virtually every system. Though, hopefully, not a lot of people.

            One of the very few positives about this very sad story is that it may serve as a wake-up call to auto-pilot users that are not following instructions.

          • Bob_Wallace

            There are a whole bunch of issues floating around here.

            I’d like to see data for all reported accidents, not just fatalities. Fatalities are too infrequent. It takes a massive amount of miles driven to make meaningful comparisons.

          • Ken

            Musk has already stated that, overall, the Tesla autopilot system has proven to be twice as safe as a human driver.

          • neroden

            We know from the history of automated trains that the system will not be accepted until it is *10* times as safe as a human driver. At least. Maybe 100 times as safe. That’s human psychology for you…

          • Bob Smith

            Sorry, the system was designed to purposely ignore speed laws. The system and its originator are at fault.

          • Bob_Wallace

            No, the driver made the decision to exceed the speed limit. It was his decision alone.

          • Ken

            Wrong again. It is the user that chooses to violate the law, not the system. The user must input the speed exactly the same way that a user chooses to violate the law by pushing the accelerator down, causing the car to speed. They are just two different ways of controlling the speed but both are 100% chosen by the driver.

            Auto-pilot must be able to drive at the same speed as the flow of traffic, which is often 10-20 mph above the posted speed limit. If it can’t, the slow-driving car becomes a dangerous hazard. Cars traveling too slowly have been proven to be as dangerous as speeding cars. So Tesla designed the system exactly right.

            All car companies design cars with systems that can violate the law if the user chooses.

            Try and catch up and stop panicking about tech you clearly don’t understand.

      • Cole Hyntermeister

        My point exactly.

    • neroden

      Based on the evidence which has come out so far, this was 100% the Tesla driver’s fault. He was speeding, and he was not paying attention to the road.

      Autopilot or no autopilot, this is no good.

      • Cole Hyntermeister

        I mean that it wasn’t the autopilot’s fault. It wasn’t like the autopilot turned or merged or did something incorrectly; the one thing it failed to do was sense the idiot semi merging and running into him. Speeding, sure, and obviously not paying attention is an issue. But as someone stated above, it’s one death in 130 million miles. That’s a pretty good track record, statistically speaking.

        • Bob_Wallace

          Excuse me.

          The autopilot (the collision avoidance system) failed to notice a tractor trailer moving across the highway and failed to slow/stop the vehicle.
          Some of the fault may lie with the truck driver if he did make an unsafe turn. (We don’t know if the Tesla was clearly visible.)

          Some of the fault clearly lies with the Tesla driver whose job it was to watch the road in the event that the collision avoidance system failed to detect a problem.

          Not sensing a very large truck in front of the car was a major failure for the collision avoidance system.

          • Bob Smith

            Perhaps the accident was caused solely because of excessive speed.

          • Bob_Wallace

            The accident occurred because brakes were not applied in time.

            Tesla’s collision avoidance system did not stop the car.

            The driver did not stop the car. The driver was responsible for continually monitoring the collision avoidance system performance.

          • Bob_Wallace

            No, there’s a decent chance the truck driver made an unsafe turn. We’ll have to wait to see.

            The driver apparently was not watching the road. Even at high speed one should see a tractor trailer crossing the road in front of you and stop.

            It’s not like there wasn’t some warning time. The tractor part of the rig had (apparently) already cleared the lane. 18-wheelers don’t have rocket speed acceleration.

            Based on what is known at this time I think the Tesla driver owns most of the responsibility. It seems that he decided to speed and did not keep his eyes on the road.

            Truck driver may be partly to blame. Tesla driver may be mostly to blame. Tesla collision avoidance system needs improvement.

  • Stevemj

    I have a Model S and have used the self-steering feature. It was provided to every owner on a one-month trial. In my car it would not self-drive more than 5 over the speed limit. The system knows when you don’t have your hands on the wheel and reminds you. During the short time I fooled with it, it would have driven into the ditch once and run down one pedestrian if I had been watching Harry Potter. It was a $3,000 option. It seemed to me that it was good enough to be dangerous.

  • lsochia

    Humans are lazy. The only reason I bike everyday is because I don’t own a car. I’m sure if I had an option of letting the bike or car take over I’d do it.

    • Outcast_Searcher

      Even if there were all kinds of warnings plastered in the car and multiple places in the manual about how unsafe that was? Even when you’re doing something potentially very dangerous — not just to you, but to others?

      I remember refusing to buy and use a can of “fix-a-flat” decades ago, after reading the warnings on the can and deciding it would be easier for them to just have a big skull and crossbones and a big “DO NOT USE THIS HORRIBLY UNSAFE PRODUCT”.

      After seeing all the warnings about Autopilot, I don’t see the point. To use it SAFELY, you may as well be driving yourself. (Not that this won’t change — but it will almost certainly take years).

      • Ken

        Wrong. Using autopilot is vastly superior and safer than driving yourself. It is easy and takes away lots of stress. It is easy to take over if you need to. That’s why it has been so popular.

        You need to try it and then you will comprehend why. Until then you probably shouldn’t be talking about something you clearly know virtually nothing about.

        • nitpicker357

          There is insufficient evidence to conclude that using autopilot is safer than driving yourself.

          Have you tried hemorrhagic fevers? They are so awesome! You can’t comprehend why until you’ve tried each of them. Until then you probably shouldn’t be doing argumentative writing, something you clearly know virtually nothing about.

  • Dylan Wentworth

    Perhaps Tesla just needs to develop self driving semis because those idiot truck drivers pull out in front of fast approaching cars all the time.

    Knowing now that the account of the Harry Potter movie playing came from the driver of the truck makes that kinda hard to believe. Even if he really did hear it playing, the impact of the crash could have turned it on. I was in a low-speed wreck once that somehow made my wristwatch come off my arm.

  • John Clark

    Someone decided a while ago to dry their wet cat in an oven. A microwave oven, in fact. They sued the oven maker after their pet exploded. Guess what, they won.

    • Bob_Wallace

      I doubt the decision would have held up in appeal.

    • Shane 2

      I’m going to try this with my pet mosquito. Oh damn, my mosquito died. I loved that mosquito. I’m suing Toshiba.

    • Joe Viocoe

      That is more a problem with our legal system rather than anything to do with automotive.

  • gregben

    Perhaps Tesla should consider geo-fencing to disable autopilot in hazardous areas. Clearly a four-lane divided highway with uncontrolled access (no traffic lights, fencing, or on/off-ramps) like FL-500 in Williston, FL is hazardous for both travelers on the main road and those crossing it. See the picture of Holy Family Catholic Church immediately next to the roadway.

    • freonpsandoz

      Why, exactly, does EVERYONE seem to assume that a human driver would have performed better in this circumstance? I think people just want to believe that they are superior to mechanical systems, even if there is no evidence to support that belief.

      • Bob_Wallace

        At the minimum a human who was looking out the windshield would have stepped on the brake pedal.

      • John Edward Azhderian

        The difference is that if the human driver crashed, he would be at fault. If the self-driving car crashed, the manufacturer would be at fault and would likely be sued for a ridiculous amount of money (deep pockets).

        • Joe Viocoe

          That is what insurance is for… to handle that kind of litigation.

          • John Edward Azhderian

            The insurance premiums for the car industry would be sky high.

          • Joe Viocoe

            It depends on how often this happens. Eventually, with enough miles proven on the system, the statistics will show the insurance companies a lower premium.

      • Richard Frank

      There is no evidence – other than claims by Mr. Musk of a 50% lower chance of airbag inflation. How they get that number is unclear to me. Looking up what Wikipedia has to say about autopilot, fwiw, I notice a few things: high reliance on redundant but different software systems (up to 5 for spacecraft) and still limited true autopilot on commercial aircraft. The software is hard to create. I’m a software engineer, and have used Polyspace Code Prover (google it) on embedded medical device software. It is also used in the automotive field.

    • Outcast_Searcher

      What I was floored by, via investigating YouTube after reading about this incident, is that there are a lot of people driving on autopilot in areas FAR more dangerous than this — and not being very careful (or careful at all in some cases).

      I’m talking busy 4 and 6 lane undivided highways, curbs, pedestrians, traffic lights, curves, merging and exits, etc. I’m talking drivers turning to look BACK at the camera mounted in the rear of the car, while the car drives itself — on a crowded highway. I’m talking hands in laps or waving around, feet away from the pedals, and 90%+ of the drivers’ attention on the video they’re making instead of the road, because they feel “safe” with Autopilot at that time.

      If Tesla doesn’t investigate what is going on (which is easy enough to find) and take this out of the beta testing mode, or do some serious geo-fencing as per your good suggestion, they are asking for some serious problems until the system is truly fully certified road ready for full autonomy.

      • Joe Viocoe

        With all the people doing stupid stunts like this… and after 130 million miles of driving… it seems like autopilot is proving itself then.

  • Julian Cox

    Tesla Autopilot is limited to 5mph above the posted speed limit.

    The truck driver pulled out in front of a moving car going within 5mph of the legal speed limit.

    If the side of the truck was too bright for the camera, this means that the truck driver was blinded by direct sunlight in the direction of the Tesla.

    This is how the Truck Driver caused this fatal accident. It has nothing to do with Autopilot.

    • Facebook User

      I was told not to drink and drive… and to watch the road… this driver failed the second; his eyes were elsewhere. Maybe the truck driver made an error, but not driving your car yourself is not just an error, it’s illegal and fatal.

    • Bob_Wallace

      It depends on the amount of time between the truck turning and the impact.

      I’m guessing that the Tesla’s collision avoidance system did not detect the truck, the car continued to drive at high speed after having its top sheared off. Had the system detected the trailer then the brakes would have come on and stopped the car.

      The truck driver is apparently (I’m waiting for official information) partially responsible for the accident. He seems to have made an illegal turn across the lane. (But perhaps he didn’t. We don’t actually know.)

      But just because someone does something illegal does not give you the right to crash into them. The driver of the blocked car should have taken evasive action. Smashed his foot down on the brake pedal, at the very least. Had the Tesla driver done his job there would be skid marks leading up to the point of impact. Stepping on the brake would have turned off cruise control and the car would have slowed to a stop rather than continuing to drive well past the point of impact.

      A very large part of the blame seems to belong to the Tesla driver. He apparently did not do his job of observing the road. It looks like he didn’t realize there was a problem until the windshield caved in.

      Does Tesla deserve some of the blame? I haven’t been able to decide that for myself.

      Tesla released an incomplete (beta) system, but they warned owners that it was not ready to drive the car all by itself and that drivers should continue to be alert and ready to take back control. Thinking that through, I assign Tesla no blame.

      But Tesla should have known, especially after seeing people do dumb stuff like sit in the back seat while their Tesla ‘auto-piloted’ that the potential for abuse was significant. Perhaps at that point they should have clawed back the system. Perhaps limiting it to stop and go, traffic jam driving conditions and gradually allowing higher speed driving as they gathered more data on their collision avoidance system. So then I lay some of the blame at Tesla’s feet.

      Here’s what I think Tesla should do right now. Limit lane keeping to low (very low probability of serious injury) speeds. Collect a hell of a lot more data on the collision avoidance system.

    • freonpsandoz

      “Tesla Autopilot is limited to 5mph above the posted speed limit.” If that’s true, the driver likely found a way around it because he had multiple speeding tickets even though he reportedly loved the Autopilot feature and used it extensively while driving.

      • Julian Cox

        You mean what if the driver defeated the Autopilot safety features somehow?

        The only possibility I can think of is deliberate failure to update the software to the latest versions that limit Autopilot speed to 5mph over.

      • Alex Walker

        Well, one, he probably never updated the software to v7.1, as many Model S owners said they would not update the software; and two, it says in the release notes that the speed limiter will disengage on a two-lane highway.

    • Kevin McKinney

      “Tesla Autopilot is limited to 5mph above the posted speed limit.

      “The truck driver pulled out in front of a moving car going within 5mph of the legal speed limit.”

      If that is correct, then why did the witness in the other car describe the Tesla as passing her when she was doing 85, just before the accident?

      • Jenny Sommer

        What is the speed limit on that road anyways?

        • neroden

          Looks like it’s the state speed limit of 65. Way too high for the road design (really ought to have a speed limit no higher than 50), but that’s Florida for you.

      • Joe Viocoe

        I’d take software code limitations over a human witness reliant on memory, any day.

        • Kevin McKinney

          See the note above, to the effect that the speed limit function isn’t active during highway driving.

        • Outcast_Searcher

          Try reading the thread and getting the facts before you assume you know what the software code limitations are for a divided highway, since there are apparently none.

          • Joe Viocoe

            I’ve been reading far longer than you have. Check the timestamps and welcome.

      • Outcast_Searcher

        It’s not correct. For a divided highway, there is no speed limit for the AP, per multiple posts from Tesla owners in this thread.

        (I don’t agree with no speed limit under ANY circumstances any more than I agree with beta testing any driving system on public roadways with (untrained) customers).

        If someone has an emergency and really needs excessive speed for some reason, they can da*n well disengage the autopilot and do that (emergency) driving themselves. I’ve needed to do this on two lane highways, when the idiot I was passing decided to speed up a lot, for example.

    • Ken

      Actually the Tesla auto-pilot speed is only limited to 5 mph above the limit on non-divided roads. This was a divided highway.

    • neroden

      The Tesla driver who was speeding at greater than 85 mph caused the crash. As far as we can tell from the current evidence, the truck driver probably did everything right. If the Tesla had been going at the speed limit of 65 mph, it would probably not have collided with the truck at all.

  • Bob_Wallace

    This was a failure of the collision avoidance system. Seems to me that Tesla should turn off the lane-keeping feature until they sort out the solution for detecting “white trailers”.

    And they should consider limiting the use of lane-keeping to speeds at or under the speed limit.

    • freonpsandoz

      The last part is most important. As to the first, I haven’t yet seen a shred of evidence to support the unstated assumption almost everyone is making about this accident: that a human driver traveling at the same speed would have avoided the collision.

      • Bob_Wallace

        We’re operating without all the facts, but let’s continue…..

        I’ve heard one report (speculation?) that the car’s camera could not distinguish the white trailer from the bright sky. That could be true if the exposure value of the camera was set too high.

        Would the same have happened had a human been driving? I don’t think so. The human would have seen the tractor, at least the black wheels, cross the lane and the wheels of the trailer and hit the brakes.

        Would a human seeing the truck and hitting the brakes avoid the crash? We don’t know because we don’t know the length of time between the front end of the tractor entering the lane and the moment of impact. But at least the car would have slowed some and cruise control would have been turned off.

        • Jeremy Friesner

          The autopilot system also has a forward-facing radar, but it is tuned to ignore inputs above a certain height so as to avoid false positives from road signs, etc. It seems to me that they might want to tune that some more, so that it can detect any overhead object at the height of the car’s roof or lower (since, be it road sign or tractor-trailer, anything at that height or lower is going to be a problem if the car tries to pass under it).

          • Dragon

            From a great distance, I don’t think radar can tell exactly how high an obstacle is. Maybe it can with enough radar beams in place and enough resolution, but the system obviously isn’t yet that capable. I suspect the car is basically casting radar straight out the nose and perfectly level to the ground and only responds with braking when it sees something significantly close because otherwise a road that curves upwards would be seen as a distant obstacle worthy of braking for.

          • Bob_Wallace

            From a great distance the difference in angle would be small and difficult to use to make a judgement. But from 300 feet away the difference in angles for bottom of trailer (4 feet) and bottom of sign (17 feet) might be sufficient.

            From 200 feet away it should be obvious that there is something in front of the car and brakes should have been applied. That might not have been soon enough to save the driver, but the car should not have continued on a long distance past the crash.
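            A quick check of that geometry in Python (an illustrative sketch; the 2 ft sensor height is an assumption, while the 4 ft and 17 ft figures come from the comment above):

              import math

              sensor_height = 2.0    # ft, assumed radar mounting height
              trailer_bottom = 4.0   # ft, bottom edge of the trailer
              sign_bottom = 17.0     # ft, bottom edge of an overhead sign

              for distance in (300.0, 200.0):
                  angle_trailer = math.degrees(math.atan2(trailer_bottom - sensor_height, distance))
                  angle_sign = math.degrees(math.atan2(sign_bottom - sensor_height, distance))
                  print(f"{distance:.0f} ft: trailer at {angle_trailer:.2f} deg, sign at {angle_sign:.2f} deg")

              # At 300 ft the two targets differ by only a couple of degrees of elevation,
              # which is why a coarse radar beam can confuse a low trailer with an overhead
              # sign; at 200 ft the gap grows, but the reaction time shrinks.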

          • Dragon

            As I said, the car can and should have radar that can look up and judge the distance and height of objects, but the current hardware clearly doesn’t have that capability. We’ve now seen two cases where an autopilot car ran its windshield into a low object (the second case being while auto park was engaged), and the manual even warns that the car may not see things hanging from a ceiling during auto park. I’m sure this lack of hardware is what Elon was hinting at when he said they would need hardware upgrades to allow fully autonomous driving without human intervention. Unfortunately, some drivers are clearly treating this iteration of the system as fully autonomous and not paying attention to the road after they’ve grown to trust the car. This death will be a wake-up call to autopilot users for a while, but I suspect we’ll see this happen again eventually if Tesla doesn’t somehow track whether a driver is paying attention to the road and disable autopilot if not.

        • Dragon

          According to the previous Cleantechnica article http://cleantechnica.com/2016/07/02/tesla-autopilot-fatality-timeline-facts/ “Tesla indicated in subsequent remarks that the high and wide side of the truck was interpreted by the system as an overhead sign.” So it saw the truck, but interpreted it as an overhead sign.

      • Outcast_Searcher

        But would (an alert) human have failed to engage the brakes at all, and travel hundreds of yards beyond the accident (if it weren’t fatal, of course), crashing through fences, etc., and STILL not have been trying to stop?

        Something appears to have gone WAY off the rails here.

        • Ken

          Actually no. The driver was killed instantly so he would not have been trying to brake at all after he hit the truck. The distance the car went afterward is consistent with a car gliding to a stop from 90 miles an hour.

          The only thing that went off the rails here was a driver watching a movie instead of the road. The Tesla manual specifically states the system is not meant for that and that the TACC will not detect all objects in the road, so you have to be watching and ready to take over.

          • Dragon

            I wonder the same thing. This article says the car remained on auto pilot after going off road, auto pilot swerved to avoid some trees, then failed to swerve when it hit a power pole… All that suggests a catastrophic failure of autopilot logic, but I don’t know what evidence there is that autopilot remained active after the roof was torn off. The camera for auto pilot is located where the mirror attaches to the windshield glass. That camera would have been torn off in the crash, so I can’t imagine auto pilot could have remained active at that point.

            Some more info about autopilot tech and theories about radar blindness above a foot or two off the ground: http://jalopnik.com/does-teslas-semi-autonomous-driving-system-suffer-from-1782935594

      • Dragon

        Here’s a shred of evidence via logical deduction:
        The car struck around the middle of the semi trailer. Those trucks move very slowly while making a turn or while starting from a full stop. It would take at least 5 seconds (probably much more) for the truck to get far enough across the highway that you would hit the middle of the trailer. 5 seconds is a long time for an attentive driver to notice something gigantic moving across the highway. Therefore, no driver paying any amount of attention to the road would have had this accident. Certainly not without trying to brake or trying to swerve to avoid the trailer. At 3:40pm with the sun behind the driver, there was no way the truck was not highly visible to human vision, “bright” sky or not.
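        A back-of-the-envelope check of that timing argument, in Python (a sketch; the 65 and 85 mph figures and the 5-second crossing come from this thread, not from official findings):

          MPH_TO_FPS = 5280 / 3600   # feet per second per mph

          for speed_mph in (65, 85):
              speed_fps = speed_mph * MPH_TO_FPS
              for crossing_seconds in (5, 10):
                  distance_ft = speed_fps * crossing_seconds
                  print(f"{speed_mph} mph, truck crossing for {crossing_seconds} s "
                        f"-> car was about {distance_ft:,.0f} ft away when the turn began")

          # Even at 85 mph, a 5-second crossing puts the car over 600 ft away when the
          # truck starts its turn - a long time for an attentive driver to brake or swerve.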

    • Julian Cox

      Bob – just while you are monitoring this topic. Note that Autopilot is limited to 5 mph above the posted speed limit. The car wasn’t speeding. Not unless it was under full manual control – and apparently it was operating under Autopilot. That rules out speeding.

      While it might seem attractive to make this topic all about Autopilot, what actually happened is that a truck pulled out directly in front of a car that was operating legally and had priority over the truck. The truck was crossing a main road and was obliged to wait for oncoming traffic to pass.

      Ordinarily there was plenty of visibility for the truck driver to see the Tesla and to wait for it to pass before pulling out – except we are told that the truck driver was staring into sunlight sufficient that the reflection from the side of his truck was as bright as the sky.

      Yes the family should sue. They should sue the truck driver and or his insurer.

      There is nothing here to suggest that Autopilot is at fault. It might be possible to upgrade Autopilot to save the day in future. Gut feel – even if the radar had identified this obstruction as too close to the ground to pass under, there would not have been time enough to avoid impact, given the truck driver pulling directly across a highway in front of a car travelling at highway speeds.

      When this story finally settles, I suspect the simple truth is that the truck driver was blinded by sunlight and took a chance that nothing was coming – and got it fatally wrong.

      • Bob_Wallace

        I saw your previous statement about lane-keeping not operating if the speed is 5 mph higher than the limit. I didn’t know that earlier.

        We do not know how much time elapsed between the truck entering the lane and the impact. Since the cab had cleared the lane there would have been adequate time for a human driver to take some sort of preventative action. (Not saying that there was enough time to avoid some sort of contact, we don’t have enough information.)

        Think this through. You’re driving down the road with your cruise control set no more than 5 mph above the speed limit.

        A tractor trailer leaves the left hand turn lane on the opposite side, goes through the median, and pulls into your lane. The Sun is behind you (that was the case?). An 18-wheeler tractor is big. If you were paying attention you would see it. If you weren’t brain dead you would at least slow down (tap the brake and turn off the cruise control). And if you realized that you were likely to hit the trailer then you’d slam on the brake.

        The collision avoidance system seems to have done nothing.

        • Julian Cox

          If you saw a truck pull to the median, you have a reasonable (and legal) expectation that it will stop at the median and let you pass before proceeding across your lane.

          I think it is backwards to criticise the Autopilot. This accident was the full fault of the truck driver. The truck driver was negligent and broke the law and the Tesla had the full right to pass unmolested and without concern or need to take evasive action.

          The fact that some tweak to Autopilot might be able to save the day in future is a bonus, not a cause.

          Driver distraction by Autopilot is also nonsense. I have travelled with Americans on long (1,000-mile-plus) journeys watching DVDs in a Ford F-150, and the world is full of people texting from the driver’s seat. The fact that autopilot can prevent most of these getting into trouble is basically vital.

          • Bob_Wallace

            If you see a truck pull into the median you should experience heightened alert and start observing the truck to see whether it will stop. I would expect a collision avoidance system to identify and track potential dangers.

            If you see the truck starting to move into your lane then you would slow or stop (or attempt to stop) as soon as the front bumper entered your lane. You wouldn’t wait for the tractor to drive across the lane before jamming on the brakes.

            I really don’t care whether the truck driver broke the law or not. Whichever, the truck was blocking the lane.

            If I’m going to trust my life to an autonomous car then I want the car’s system to protect me from those doing illegal things and not just legal things.

            Yesterday afternoon I had another driver abruptly pull into my lane before the rear door of their car had cleared my front wheel. I turned and hit the brake pedal hard and missed hitting the car by, literally, a couple of inches.

            (Massive horn use followed.)

            The driver was 100% at fault. I don’t want to ride in a car where it’s OK to hit that car because the other driver is doing something wrong.

          • Anti Lord Kelvin

            Sorry, but hadn’t this same driver who died been “saved” from an identical problem to the one you are describing some weeks before this fatal accident?

          • Bob_Wallace

            I don’t know. And even if so I don’t see the relevance.

            Tesla’s vehicle collision system did not detect something it should have detected.

            I understand that self-driving systems may never be 100% perfect. That by letting cars do the driving might mean something like no longer playing Russian roulette with four chambers loaded and now loading only one.

            But in this case – a tractor trailer truck across the lane. That’s a target that should be detected.

          • Mike

            “I would expect a collision avoidance system to identify and track potential dangers.”

            100% agree. Akin to traffic on a cloverleaf on-ramp: one tracks it while figuring out whether to move from the right lane to the center lane, or out of the center lane into the left lane so the chap in the right lane can move over and allow the new vehicle to enter the road… yada yada yada.

      • neroden

        Julian, the evidence is coming out that the Tesla driver was speeding massively. The truck driver could have made the very reasonable assumption that the oncoming Tesla was driving at the speed limit (65 mph) and not at the 85+ mph which he appears to have been going at.

        This makes it the Tesla driver’s fault 100%.

  • Bob_Wallace

    There’s too little data to make the “30%” statement.

  • eveee

    This isn’t a blow to Tesla’s reputation. It isn’t always about Tesla.
    The same thing has happened, believe it or not, when people thought cruise control was an autopilot that meant you could go to sleep or do something else.
    The point is, we are relying on tech to solve the problem, but it’s not completely a tech problem. It is well known that users must drive normally, controlling the car, and that the features are an assist. But the technology is too tempting to use in ways it has not yet been legally sanctioned for. If it’s there, people will use it.
    The technology has the potential to save lives, but it can be abused. Autopilot is used by airline pilots every day, but we still have pilots; they still control the plane on takeoff and landing, and they must be in control during level flight at all times. Technology can do many things, but it cannot fix human behavior.
    Ultimately, we are responsible. IMO, during the adoption phase, we cannot allow autonomous driving with beta-mode software. This is the car industry, not PCs. The stakes are too high. Even if the software is blameless, in automotive the design standard is to have deep levels of safety backup and conservative design.
    We also cannot rely on simple hands-on-wheel detectors that won’t stop users from taking their eyes off the road. Seat belts were despised by some when new, but are now accepted daily behavior, and the same struggle goes on trying to get people to use them. Technology can’t fix everything. Will we make cars drive the speed limit? Will we have them detect impaired driving? These things have potential, but in the end we have to adapt to technology as much as it adapts to our habits and behaviors. The real answer is that we must explore and examine ourselves much more. A tragedy like this is a time for us to pause, give condolences, and consider our lives and those around us.

  • Tammo

    I read that the family of the deceased is considering a lawsuit. This reminds me of the McDonald’s hot coffee lawsuit. I have to ask: when is the victim responsible for their own actions?

    • Brent Jatko

      I’d say “wait until the facts come out,” but that’s apparently too long for lawyers.

    • Yeah. I don’t really think there’s a case given Tesla’s strong disclaimers to stay alert and always be ready to take over the wheel.

      • eveee

        This is the auto industry and the legal world. You have to prevent thugs that don’t make sense, and in our litigious world, costs matter. Car design is an impossible job.
        The perception is that cars are safe and can prevent all danger.

        • freonpsandoz

          Thugs that don’t make sense? You mean like Lil Wayne?

          • Shane 2

            Thugs that don’t make sense? You mean like Dick Cheney?

          • eveee

            Typo

          • eveee

            See what I mean? That was auto spelling. It failed. I corrected it. Not the computer. That’s why I don’t completely trust autopilot. When computers are so good that that kind of thing never happens, then, and only then, can we trust autonomous driving implicitly. You have to be able to override the system and you must be able to control it at all times. It should be an assist only, not a substitute for the driver. But how do you do that so that no one defeats the system or relies on it too much?

          • Outcast_Searcher

            Seriously — the only way to do that is to release it to the general public ONLY once it’s fully capable of driving more safely than people, on its own.

            Having it tested on the public roads (only upon achieving a certain safety record in off-road testing) with testing employees who are paid and trained to take the risk and pay strict attention is risky enough for the motorists and pedestrians using public roads.

      • Tammo

        In the case of the McDonald’s lawsuit the claimant was originally awarded a few million dollars. It was mentioned that the person would have received a larger award but the jury felt that the victim was partially responsible. Based on this outcome the Tesla victim could have been holding a wild party in the car back seat and the family could still receive damages under our legal system.

        • Bob_Wallace

          Sometimes juries make bad decisions. Appeal courts have the ability to reverse outcomes.

          • Tammo

            Many times courts make bad decisions, as evidenced by the number of people wrongfully convicted of murder, where the level of proof has to be the highest.

          • Bob_Wallace

            Yes. Our system is not perfect.

            Now, let’s stay on topic.

        • eveee

          It’s also cheaper to settle and there is the matter of good will. Anyone building a vehicle is bound to be sad that their work resulted in a tragedy such as this.

        • WuestenBlitz

          Wrong. The jury awarded the woman the equivalent of 2 days’ worth of coffee sales at McDonald’s, equalling 2.7 million dollars. They did so because McDonald’s had already had 700 people burned by their 195-degree coffee, even after their own safety officer suggested that 155-degree coffee would stop people from getting the 3rd-degree burns that 180+ degree coffee causes within 2-7 seconds. It took 700 people being scarred and disfigured before McDonald’s was forced to serve their coffee at a cooler and consumable temperature. Look it up. Google the images of her burn scars and tell me if her lawsuit was frivolous.

          • Tammo

            I would suggest you reread my post, because nothing I said was “wrong.” In addition, you should read the whole Wikipedia article; based on it, McDonald’s is still serving coffee at the same temperature but has made the warning more noticeable. Please note that I am sorry the woman got burned.

      • What about, if it’s a fact, that Tesla’s auto-pilot allowed him to speed? I’m no lawyer but I could see Tesla liable for that.

        • Joe Viocoe

          It didn’t allow that. We don’t know how fast he was going.

          The autopilot doesn’t allow exceeding the limit by anything more than 5 mph. Most likely, the reports of 85 mph aren’t accurate.

          • Bob_Wallace

            Looks like speed isn’t limited on highways.

          • Joe Viocoe

            thanks

        • eveee

          Yup. I am sure that is going to be debated. Unfortunately for us, we live in a society where “society made me do it” is an excuse. And the legal system judges percentage of liability, and everyone gets some. That’s the way it is. That’s why rear-view mirrors say objects are closer than they appear. And you wondered.
          Ever see Repo Man?

        • Richard Frank

          What if you have to “speed” to override the tesla making a mistake? “Sorry Dave, I’m afraid I can’t do that.”

          • Then you are no longer in auto-pilot and Tesla would be off the hook.

      • Outcast_Searcher

        OTOH, calling the system something like “driver’s assistant” instead of “Autopilot” would be a lot more in line with the actual stated limits of the current technology.

        In the real world, how likely is it that all (or even most) drivers are going to carefully review the entire owner’s manual, and are going to catch every single warning about Autopilot?

        Since the fatal accident, I’ve watched a number of online videos of drivers using autopilot, not looking at the road, not keeping their hands anywhere near the wheel, not keeping their feet near the pedals, and acting as though the Autopilot was completely reliable.

        I saw MANY spots in several videos where the Autopilot got confused and would have caused a wreck if the driver hadn’t intervened. (Most of these were with drivers who were well aware of the system’s limitations, and were explaining that the driver had to stay alert and ready to take over).

        I still have a big problem with releasing a system in BETA to the general public driving on public roads. Driving isn’t a video game.

        • Richard Frank

          We’ve gotten used to being beta testers for everything

        • Ken

          You have been proven wrong since the auto-pilot system has already saved lives and avoided injuries.

          Luckily not everyone is as frightened about new tech as you are.

        • Alex Walker

          You forgot the video of the driver sitting in the back seat while autopilot is on, or the guy caught sleeping in traffic while his Tesla drove. This autopilot system needs to go away; it’s only going to make bad drivers even worse and encourage stupid things, like watching movies when they should be paying attention and driving their vehicle.

          • Dragon

            There’s speculation that the guy caught sleeping was staged to get Youtube hits. Normally if you nod off at the wheel accidentally your head falls forward and you wake up. If he intentionally wanted to sleep he would have leaned the seat back. Instead, his head was at a pretty uncomfortable angle which made for a good photo, but not a realistic position to nod off accidentally. I could believe a guy going into the back seat was a similarly staged situation. If not, he deserves a Darwin award.

    • freonpsandoz

      My son is an attorney, and the McDonald’s case is studied in law school as a classic example of how public perceptions about a case fail to match the facts. McDonald’s had many incidents that served as warnings, and its deliberate decision to risk customers’ safety was well-documented.

      • Tammo

        McDonald’s had many warnings about their coffee being hot. Shouldn’t the customer be the judge of whether the coffee is too hot, by going to another establishment when they feel it is too hot or by asking for ice in the coffee? What determines whether coffee is too hot? I think the reason they study the McDonald’s case in law school is as an example that if they can get a few million dollars for this, then anything is possible.

        • Bob_Wallace

          Let’s not get way off topic here.

          The company had a record of mistakes which they had apparently not acted on. That increased their exposure.

          • Tammo

            If we are still talking about hot coffee, for most people that would not be a mistake.

        • Terence Clark

          The coffee was hot enough to cause rather severe burns to the individual in question. Much, much hotter than one would want coffee. Ever spilled your own coffee? Most people have at one time or another. Most people, to my knowledge, don’t suffer second-degree burns from it.

  • David

    Nominate the driver for a Darwin Award

    • freonpsandoz

      I nominate BOTH drivers.

      • Kevin McKinney

        If the Tesla driver was speeding to the extent the story suggests, there may not be much blame attached to the truck driver–who, I must point out, is ineligible for a Darwin award anyway, since he was uninjured in the crash.

        • Julian Cox

          How could the Tesla be speeding and in Autopilot at the same time? That is not how Autopilot works.

          • Kevin McKinney

            I don’t know. Had no idea that that was in the software, not being a Tesla owner myself. Yet that is what one witness said–and she copped to doing 85 herself, just before the accident.

            So, is the software infallible in this regard? How does Autopilot ‘know’ what the speed limit is? Something like Google maps, where they keep collecting and updating information on current speed limits, then run it via GPS? Is there a way to defeat or disable this feature?

            You tell me–you’re the Tesla expert. But we do seem to have a contradiction here.

          • tim clemans

            “According to the release notes for the v7.1 update, the speed limit function is only activated when the car is on “Residential roads or roads without a center divider”. So it appears that drivers will still be able to drive the car as fast as they’d like on the highway.”

          • Julian Cox

            Thanks – if that is correct and it sounds credible then of course it is my error to suggest it was impossible to exceed 5mph over on a highway and be in AP simultaneously.

        • Jonathan Cable

          Not reported here is that the truck driver has been ticketed 4 times in the last year: once for failing to adhere to a traffic control device and once for an improper lane change. If you look at the police-released diagram you will see that the truck driver turned in front of the car. I think both the dead Tesla driver and the truck driver hold responsibility. I see truckers doing this all the time; it has happened to me 3 times so far this year. They think, “Well, I am in a large truck, the cars will stop for me.” But when you get a speeding person not paying attention, there is a big potential for death.

  • John Cagnetta Jr

    How about this option? You drive the car yourself

    • John Clark

      How about you have this option even now?

    • freonpsandoz

      Right on. 30,000 dead motorists a year can’t be wrong.

    • leaverus

      how about this option? you stop driving altogether, grandpa

    • Bob_Wallace

      90% of all vehicle accidents are caused by drinking, speeding or distracted drivers (texting, looking at something along the road type stuff).

      It’s pretty clear that in the future it will be best if we don’t drive ourselves. The issue right now is how ready are the autonomous driving systems.

      • John Edward Azhderian

        Just take the bus and let responsible people drive their own cars. If you’re too stupid to drive, stay home!

        • Nick

          You are smart….I can tell.

        • Joe Viocoe

          Every idiot driver thinks he or she is a responsible driver. Everyone thinks they are above average.

          • Brent Jatko

            They must be from Lake Wobegon, then.

          • Bob_Wallace

            They grew up there as children….

        • Bob_Wallace

          Having been almost killed twice by drunk drivers and hit from behind once while walking on the gravel shoulder of the road I’m going to dismiss the foolishness of your comment.

          • John Edward Azhderian

            You distrust humans too much.

          • Bob_Wallace

            With the history I’ve had I can assure you that humans have earned their reputation.

            I’ve had two almost-headons from people texting and drifting into my lane. Luckily speeds weren’t high and I had room to get on the shoulder.

            I’ve had a couple of times where a car stopped right in front of me, in moving traffic, while the owner answered their cell phone.

            I had to do an emergency swerve and brake last Friday when a car in the left lane cut in front of me before their rear door had cleared my front wheel.

            I’ve been on the shoulder several times and in the ditch a couple from people driving in the middle of a narrow two lane road while coming around a curve.

            I’ve done panic braking because someone was messing with their smart phone and pulled out in front of me after they had stopped at a stop sign.

            I trust humans to fuck up.

        • neroden

          To keep incompetent drivers at home, we will have to massively improve the driving test (make it a lot harder), and make people retake it every 5 years, and revoke licenses for people who are caught driving recklessly.

          We don’t do that right now. For some reason. If you can figure out what that reason is and how to change it, please do tell.

          • Brent Jatko

            ..as well as vastly improving mass transit.

      • Alex Walker

        Now you can add autopilot to the list, Bob.

        • Bob_Wallace

          No, Alex. Troll elsewhere.

      • Frank

        Just for completeness, driving while sleepy is also very dangerous. Impaired is a nice catch all word.

  • JamesWimberley

    One option is to build speed limits into autonomous driving software. If you want to go faster than say 120 kph / 75 mph, which is the national speed limit in Spain, you are on your own. Perhaps you could allow the software to give audible warnings, but it would not be operating the engine, steering or brakes.

    • Brent Jatko

      Good suggestion.

    • freonpsandoz

      In the US, GPS navigation software knows the speed limit on most roads. Autopilot software in vehicles should have the same data and should use it. Speeds of 5-10 MPH above the limit could be permitted if explicitly authorized by the driver, but if the driver wants to go faster than that, disabling autopilot should be required.
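      A sketch of what that rule could look like in software (purely illustrative Python with made-up names and thresholds, not Tesla’s actual code or API):

        OVERRIDE_MARGIN_MPH = 10   # 5-10 mph over, allowed only with explicit driver authorization

        def autopilot_allowed(set_speed_mph, posted_limit_mph, driver_authorized_excess):
            """Return True if the assist system may keep controlling speed and steering."""
            excess = set_speed_mph - posted_limit_mph
            if excess <= 0:
                return True                       # at or below the posted limit
            if excess <= OVERRIDE_MARGIN_MPH:
                return driver_authorized_excess   # modest excess only if the driver opted in
            return False                          # any faster and autopilot must be disabled

        # Example: a 75 mph setpoint in a 65 zone engages only with explicit authorization.
        print(autopilot_allowed(75, 65, driver_authorized_excess=False))  # False
        print(autopilot_allowed(75, 65, driver_authorized_excess=True))   # True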

      • John Edward Azhderian

        GPS in the US has issues and will on occasion lead you to the wrong location. You cannot rely on technology to do all your thinking.

      • nakedChimp

        Teslas can read the speed limit signs.

      • ManOnTheHill

        And the authorization to exceed the limit is an explicit assumption of liability for any tickets or accidents proven to be caused by excessive speed (no “The autopilot did it, not me” arguments).

    • Julian Cox

      This speed limit already exists. 5mph over the posted limit is the max.

      • tim clemans

        That limit doesn’t apply in this collision case “According to the release notes for the v7.1 update, the speed limit function is only activated when the car is on “Residential roads or roads without a center divider”. So it appears that drivers will still be able to drive the car as fast as they’d like on the highway.”

        • Bob_Wallace

          Thanks, that seems to clear up some of the confusion. It would be possible for the Tesla to have passed another car that was doing 85 if the v7.1 note is correct.

          • neroden

            In the wake of this crash, there is one change Tesla should make to Autopilot. Tesla should not allow it to deliberately exceed the speed limit. If the driver wants to exceed the speed limit (or thinks that Autopilot has misread the speed limit), the driver should have to turn Autopilot off.

    • Ivor O’Connor

      F that.

    • r121

      While in Autopilot mode, I can lever the joystick up to 90 mph, and in fact anything faster than 80 mph starts to look scary to me.

    • Alex Walker

      How about getting rid of autopilot altogether? Not everything in real life has to follow science fiction movies. If you’re too lazy to drive your car like a normal person, then cut up your license and call a cab.

      • Frank

        Autonomous electric cars will become the cheapest most efficient cabs ever, so much so, that many won’t bother with owning their own car, and they won’t speed.
