
Published on June 30th, 2016 | by Zachary Shahan


1st Tesla Autopilot Fatality … After 130 Million Miles (Updates)


Update #1: The fatal accident occurred in Florida. It took the life of an Ohio man whom Autopilot had previously helped avoid what was nearly a serious accident. He had published dozens of videos of Autopilot in action, and just seemed like one of those genuinely nice people who make your day a bit brighter. Our condolences to the family and friends.

Update #2: Unfortunately, many of the headlines on the major sites quickly covering this are making it seem like Tesla’s in trouble with the NHTSA because of this. Ugh. As the Tesla blog post makes clear, Tesla contacted the NHTSA. And the situation is just one of those where it seems the ending was inevitable…. Very sad. And yet again, condolences to the loved ones.

Update #3: Here is Joshua Brown’s obituary.

It is very sad to find out from Tesla Motors that the first Tesla Autopilot fatality has been logged. It comes after 130 million miles, and Tesla noted in its press release that a fatality occurs, on average, every 94 million miles in the United States, and every 60 million miles worldwide.

That puts Tesla Autopilot in a good light, but it doesn’t really lighten the mood for me in such a sad case.

It should be highlighted as well, however, that the fatality resulted from a tractor trailer crossing a divided highway directly into the path of the Model S.

Bob Wallace, who just notified me of the news, adds some useful commentary:

This is an avoidable….

The important point is, IMO, once a problem like this appears, systems can be redesigned to deal with the gap and that problem should never again appear. Can’t do that with human drivers.

He also noted a point we should definitely keep in mind here: “the event frequency is so low that it would take a huge amount of driving to determine which is best. One per 130/94 million miles is not enough data to allow a meaningful statement.” And, obviously, this just looks like one of those unfortunate cases where the driver had essentially no hope anyway due to the other vehicle’s approach and specifics of the situation.
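
To see just how little one event pins down, here’s a rough sketch (assuming fatalities arrive as a Poisson process; illustrative, not a formal analysis):

```python
# One fatality in 130M miles constrains the true rate only very loosely:
# an exact 95% Poisson confidence interval for k = 1 observed event.
from scipy.stats import chi2

miles = 130e6  # Autopilot miles at the time of the crash
k = 1          # observed fatalities

lower = chi2.ppf(0.025, 2 * k) / 2        # ~0.025 expected events
upper = chi2.ppf(0.975, 2 * (k + 1)) / 2  # ~5.57 expected events

print(f"one fatality per {miles / upper / 1e6:.0f}M"
      f" to {miles / lower / 1e6:.0f}M miles")
# -> roughly one per ~23M miles up to one per ~5,100M miles,
#    far too wide a range to compare against the 94M-mile US average
```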

The full Tesla press release is below.


A Tragic Loss

We learned yesterday evening that NHTSA is opening a preliminary evaluation into the performance of Autopilot during a recent fatal crash that occurred in a Model S. This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.

Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred. What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.

It is important to note that Tesla disables Autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled. When drivers activate Autopilot, the acknowledgment box explains, among other things, that Autopilot “is an assist feature that requires you to keep your hands on the steering wheel at all times,” and that “you need to maintain control and responsibility for your vehicle” while using it. Additionally, every time that Autopilot is engaged, the car reminds the driver to “Always keep your hands on the wheel. Be prepared to take over at any time.” The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again.

We do this to ensure that every time the feature is used, it is used as safely as possible. As more real-world miles accumulate and the software logic accounts for increasingly rare events, the probability of injury will keep decreasing. Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert. Nonetheless, when used in conjunction with driver oversight, the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety when compared to purely manual driving.

The customer who died in this crash had a loving family and we are beyond saddened by their loss. He was a friend to Tesla and the broader EV community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission. We would like to extend our deepest sympathies to his family and friends.




About the Author

Zachary Shahan is tryin' to help society help itself (and other species) with the power of the typed word. He spends most of his time here on CleanTechnica as its director and chief editor, but he's also the president of Important Media and the director/founder of EV Obsession, Solar Love, and Bikocity. Zach is recognized globally as a solar energy, electric car, and energy storage expert. Zach has long-term investments in TSLA, FSLR, SPWR, SEDG, & ABB — after years of covering solar and EVs, he simply has a lot of faith in these particular companies and feels like they are good cleantech companies to invest in.



  • MrL0g1c

    All semi-autonomous vehicles should have proper driver attention systems like the Lexus has had since 2006. [LINK] Driver Monitoring System

  • Steven F

    Apparently the truck driver has been cited before for traffic and safety violations:
    http://abcnews.go.com/Technology/wireStory/latest-tesla-crash-harbinger-industry-fears-40271084

    “Federal Motor Carrier Safety Administration records don’t identify drivers by name, but they show that the driver for the trucking company Okemah Express was ordered off the road in January after being cited by a Virginia state inspector for being on duty more than the legal limit of 14 hours in one day.

    Okemah’s driver was also cited for failing to obey a traffic control device in March and an improper lane change in December. And an inspection last year found the truck’s tires were going bald.

    Sixty-two-year-old Frank Baressi of Palm Harbor, Florida, is the owner of Okemah Express. The company has one truck and one driver, Baressi himself. Authorities say he was at the wheel in May when the truck collided with a Tesla Model S vehicle in “autopilot” mode.”

    • Jenny Sommer

      And the driver was watching Harry Potter…

  • Neel

    This is a tragic situation, RIP. This is the 2nd story I have read where Autopilot did not detect a taller vehicle; in the other, a Tesla rear-ended a tall truck. The sonar/detection unit needs to be adjusted for higher visibility so it can detect taller trucks that have a huge air gap underneath. If it had the higher detection, it would have definitely applied the brakes in this situation, not just with Tesla but with other vehicles that now have collision avoidance systems, such as Infiniti/Nissan, Hyundai, etc.

  • ROBwithaB

    The Electrek story has a bird’s-eye view of the intersection, as well as some interesting new “information”:
    http://electrek.co/2016/07/01/truck-driver-fatal-tesla-autopilot-crash-watching-movie/

  • ROBwithaB

    Perhaps we need to give some more thought to the car’s sensory system, to how a car “sees” the world?

    We, as hunting, social mammals, are genetically predisposed to experience the world primarily in terms of the visible light spectrum, and we are remarkably adept at analysing such information and responding accordingly, often without even thinking about it. Our binocular vision gives us accurate depth perception and our visual cortex can process incredible amounts of data at incredible speeds.

    We think about things in a mostly visual way, our memories are composed primarily of “movies” and we even dream in pictures.
    Our ability to recognise familiar images, in very subtle detail, is probably almost unique (think facial recognition.)
    Sure, we also respond well to specific sounds (which tend to trigger, yes, images) and can even recognise familiar smells with some accuracy.
    But we are primarily slaves to our eyes.

    The car has no such limitations. It has access to an (almost) infinitely bigger slice of the electromagnetic spectrum, for a start.
    But it has none of our advantages in terms of the rapid visual processing that we take for granted, to filter out important information (snake) from “noise” (grass).

    So why are we even asking the car to “see” things?
    Even if it is using radar, why are we trying to teach it to recognise things in a visually-defined way? (Like shapes).

    Would it not be instructive to ask “What is the most effective way for a large truck to identify itself as a large truck, to a machine that has a large number of options for receiving that information?”
    If we do not limit the machine to a particular way of “seeing” things, what would be the easiest way to communicate directly with the “brain” of the machine, over a distance of perhaps 50-200m? And also over a distance of perhaps 2-50m? And what is the appropriate “language” in each case, and the cheapest and most secure way to generate and transmit the information?

    If I’m a cyclist, I don’t know if the car can see me. I want to be able to shout “I’m a cyclist” in such a way that I know that the car will notice me and respond accordingly. Obviously, shouting won’t work particularly well. But neither will relying on my silhouette (visual or radar or lidar or whatever?). Especially if that silhouette is constantly changing shape as I rotate the pedals, and my angle of view changes. The car would need a detailed 3-D picture of a bicycle in its memory, and would need a way to account for different sizes and designs of bicycle, and different riders. A fat dude on a recumbent bicycle has a very different silhouette to a kid on a tricycle.
    And a kid on a bicycle riding in a bike lane, parallel to the car only a foot away, is a very different type of risk to the same kid 20 feet away, on a side street, barrelling straight into the path of the vehicle.
    And the car would need to be able to do all this against a moving background of pedestrians, windblown grass and trees, other vehicles, advertising banners, neon signs, etc.
    All these situations are dealt with by my visual brain as a matter of course, and I’m able to take appropriate action with hardly any attention or thought.
    But I have a few million years of evolution on my side. The car cannot really rely on that approach, because “evolution”, in the sense of “survival of the fittest”, implies the non-survival of a large number of individuals (cars or people). Basically, you only get improvement if stuff regularly goes wrong, as the current incident with the unfortunate Mr Brown attests. And society will probably not accept that particular Faustian bargain.
    We need to find a better way to communicate with cars.
    I first thought of this when Robert Llewellyn posted a video where another (blind) YouTuber acted as a guinea pig to determine the detectability of (silent) EVs. We can’t rely on blind people (or tractor trailers) to get out of the way. And we can’t rely on our cars to “see” blind people (or dogs or drunk people or tractor trailers or fat kids on bicycles) with 100% accuracy, based only on silhouettes.

    So how about… THIS?
    The blind guy has some sort of RFID (or similar device) embedded in his cane, that communicates one simple message, constantly, to any approaching vehicle: “I am a blind person”. (Call it code 1). A kid’s schoolbag would have something similar sewn into the lining, constantly shouting “Look out, there’s a kid here.” (code 2). A dog’s collar would have something similar (code 3), etc. Every bicycle, and skateboard, and car and truck and bulldozer and motorcycle and roadworker and traffic cop. Small, cheap, simple. One simple unequivocal message for each, in the “language” that is easiest for the car to understand.
    As long as the vehicle has two or more identical sensors, it can constantly triangulate the position and trajectory of the potential threat. If the car computes that “there is a child moving towards me from the left on a trajectory that is likely to cause impact within two seconds if I continue at this particular speed and direction,” the car can brake, swerve, or accelerate to avoid the collision.
    As long as all the car-makers can agree on a common standard, it would be very easy to mass produce huge quantities of such small “communicators”.

    Literally billions of them. Within a year or two, every car will be able to recognise every potential risk. And while we’re about it, why not let inanimate objects communicate with the car too? Road signs do a reasonable job of alerting (attentive) drivers to impending risks, even when it’s wet and dark. But this requires very expensive retro-reflective vinyls, and these eventually degrade. Why not simply have a little “emitter” embedded on the sign that tells the car “from this point onwards, the recommended speed is 80km/h” or “At this point, there is a line on the road, and you are obliged to stop behind it”, or even fairly complex stuff like “yesterday, and thus according to your database, this road was 5m wide, but we are performing roadworks at the moment that restrict the width of the lane, on the left side, by a meter, for a distance of 3km. The recommended speed is 50km/h”.

    If every single relevant thing on the road identifies itself clearly and unequivocally, it would also provide a huge mentoring opportunity to refine the AI system on the car, especially with networked fleet learning. The entire fleet is trained on an ongoing basis, with clear feedback between the visual signature and the dependable digital code. Cars get much smarter much quicker, without the need for anyone to die.

    Do I know exactly HOW to do this? No, otherwise I might already have patented the device. Do I understand enough to know that it is probably doable? Yes.
    Are there smart people who might be able to bring this idea to fruition? Probably, and this is probably a good place to find them. Or at least find people who know people who might be able to bring it to fruition.
    Or, I know the Tesla folks occasionally trawl these boards.
    So fire away….
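
    To make the “communicator” idea concrete, here is a toy Python sketch of the receiving side: two bumper-mounted sensors each report a bearing to a beacon, and the car intersects the two rays for a position fix. Every beacon code, name, and number below is hypothetical, purely an illustration of the triangulation step described above:

    ```python
    import math

    # Hypothetical one-message beacon codes, per the scheme above.
    BEACON_CODES = {1: "blind pedestrian", 2: "child", 3: "dog"}

    def triangulate(p1, bearing1, p2, bearing2):
        """Estimate a beacon's position by intersecting the bearing rays
        seen from two sensors at known positions p1 and p2 (angles in
        radians, in the car's coordinate frame)."""
        d1 = (math.cos(bearing1), math.sin(bearing1))
        d2 = (math.cos(bearing2), math.sin(bearing2))
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        if abs(denom) < 1e-9:
            return None  # rays are parallel: no fix this cycle
        t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / denom
        return (p1[0] + t * d1[0], p1[1] + t * d1[1])

    def crosses_path(fix_old, fix_new, dt, horizon_s=2.0, radius_m=2.0):
        """Two fixes dt seconds apart give a velocity; return True if the
        beacon is projected to pass within radius_m of the car's origin
        inside horizon_s seconds."""
        vx = (fix_new[0] - fix_old[0]) / dt
        vy = (fix_new[1] - fix_old[1]) / dt
        for step in range(int(horizon_s * 10)):
            t = step / 10.0
            x, y = fix_new[0] + vx * t, fix_new[1] + vy * t
            if math.hypot(x, y) < radius_m:
                return True
        return False
    ```

    With sensors, say, 1.5m apart on the bumper, each new pair of bearings refreshes the fix, and the beacon’s code tells the car what kind of road user it is tracking.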

    • john

      Your comment about fitting “communicators” matches my first thoughts exactly.
      These would have a range of at least 500 meters and would give the onboard system knowledge of the state of vehicles around a blind corner.
      Scanning laser radar input also comes to mind, with a minimum range of 200 meters.
      My thoughts are with the family; my sympathy is sincerely expressed.

  • KublaConn

    Calling this incident an “autopilot fatality” is not a very accurate statement, which gives your headline more than a bit of a sensationalist, click-baity feel. There is a big difference between “a car that happened to have autopilot engaged was involved in a fatal accident” and “autopilot CAUSED a fatal accident” and the phrase you use in your headline “autopilot fatality” implies that the accident was CAUSED by autopilot. If you look at the details of this accident, a huge truck cut across the driver’s lane right in front of him. Plowing into the truck was pretty much inevitable due to the actions of the truck driver, autopilot or no autopilot.

    • Bob_Wallace

      Disagree. Had the system or driver seen the truck then brakes would have been applied and the trailer would not have been hit at full speed.

      In a case like this the police would likely assign responsibility for the accident to both the truck and car drivers. Both were partially responsible.

      (The two drunks who almost killed me in separate accidents both were judged 100% responsible because the police decided that I had no ability to avoid being hit.)

      • KublaConn

        If I have my cruise control engaged and then end up dying because I plowed my car into the side of a truck, no one would call it a “cruise control fatality” and imply that the accident was caused by me having cruise control engaged. Tesla’s autopilot is nothing more than beefed up cruise control with some lane detection functionality added; it is driver assistance, not fully autonomous driving functionality. I know that you are already aware of this, but it seems to need repeating. In the case of this collision, you can talk about mitigating the situation, such as you have done, but you can’t really say that the collision was avoidable. When a large semi-trailer cuts across your path on a high speed freeway, there aren’t a whole lot of options available to you. To me, it is ridiculous to try to mete out percentages of “blame” between the two drivers when there would have been no accident if the ONE driver hadn’t made the huge mistake of making a left turn where he had no right-of-way and no room to safely do so. To say that the dead driver is partially to blame because he could’ve slowed down more before striking the truck is more than a bit ridiculous. If the actions of one driver cause an unavoidable collision, then only one driver carries 100% of the blame. “He could’ve slowed down more after the truck cut across his path” or “He could’ve swerved to the left more” are meaningless statements in this case. Sure, you can try to argue if he might have escaped with major injury instead of death, but, bottom line, this is not an “autopilot fatality”, it is simply an “auto fatality”, caused by an extremely bad decision on the part of the truck driver.

        • harisA

          Cruise control is designed to keep the car at a constant velocity. If it suddenly accelerates uncontrollably and as a result kills someone, it would be considered responsible.

          It may be true that the truck driver made a bad decision. Apparently, the Autopilot was not aware of the tractor trailer’s presence, and that is a failure mode.

          • KublaConn

            Let me repeat, AUTOPILOT IS NOT FULLY AUTONOMOUS DRIVING, it is only driving assistance, just like cruise control and lane detection are on other vehicles. It is frustrating that so many people seem to skip right over this fact. There are no fully autonomous vehicles, nor are they even legal on U.S. roads. You make my point for me, actually. In order to accurately call this an “autopilot fatality” the accident would have to be CAUSED BY AUTOPILOT. Such as the vehicle suddenly accelerating on its own or swerving into another vehicle or obstacle and not giving up control of the steering wheel when the driver attempts to take control. Did any of that happen in this case? I have found no claims to that effect. The one point that seems common among most commenters is that autopilot should have automatically applied the brakes, and that would be a fair point, but that still doesn’t equate to autopilot CAUSING the accident. At worst, it equates to the driver foolishly depending too much on autopilot to operate his vehicle for him, which it is not designed to do, and when you mention the braking, you’re still just talking about mitigating the outcome of the collision, not avoiding the collision altogether.

          • Bob_Wallace

            Quit all-caps shouting.
            —-

            A driver, driving defensively, should have seen the tractor trailer beginning the move out of the turning lane and across their lane. And would have slowed enough to allow for braking.

            This looks like 1) driver failure and 2) a second instance of the Tesla system not anticipating a problem. Previously (IIRC) there was a fender bender when a bus did something that the Tesla system was not programmed to anticipate.

          • KublaConn

            It’s not “shouting”, it’s “emphasizing”, and I wouldn’t have to keep emphasizing the point if you didn’t keep completely glossing over the issue and ignoring it as the salient point of logic that it is. When operating a motor vehicle, it is your responsibility to do so safely, and if you fail to do so, you are responsible for the consequences. If a vehicle is approaching you, and there is not enough time & space for you to safely complete a left hand turn in front of that vehicle, but you attempt the turn anyway, then you are responsible for the consequences of that bad decision.

          • Bob_Wallace

            Open your eye-ears.

            People disagree with you. You will not shout people down on this site.

          • KublaConn

            Condescending much? Yes, we can debate & argue opinions all we want, but what we can’t debate & argue are facts. The fact is that, according to all of the information that is available to us, this collision wasn’t caused by autopilot performing some kind of dangerous maneuver or failing to relinquish control of the vehicle when necessary; therefore, it isn’t an “autopilot fatality”, which would support my original premise that the headline for this article is misleading, which then leads me to form the opinion that the headline borders on outright sensationalism that rises to the level of “click bait”. You can argue with my assessment of the headline, but there is little room for blaming autopilot for causing this collision.

          • Bob_Wallace

            Hopefully enough.

            Where does the title state that the autopilot caused the crash?

          • KublaConn

            Seriously? The headlines reads “autopilot fatality”, hence, the implication that the accident was caused by the car’s autopilot. This isn’t all that complicated, it’s fairly basic English we’re dealing with here. It’s like the headline reads “Red-headed woman robs convenience store” and, after reviewing the facts of the case, I point out that the woman was actually blonde, and you then ask “Where does the title state that she was a red-head?”

          • GCO

            I see Google’s “eggs” almost daily. As far as I know, they operate legally, at least in California.

            I agree that the driver was overconfident in the car, but for that I blame Tesla. Their description of the $2,500/$3,000 Autopilot option reads:

            Dramatically improve the convenience and enjoyment of road trips and your commute by automatically piloting Model S at highway speeds and in stop-and-go traffic. …

            (my emphasis)

            A major cause of this accident was the car’s inability to live up to the expectations set by its manufacturer.

        • Bob_Wallace

          It may make no sense to you to apportion blame, but it does to others.

          Had the driver been doing what is expected when using the autopilot feature, watching the road, there is an excellent chance this accident would not have occurred.

          Clearly a human would have had time to react and probably avoid the crash. The Tesla hit the trailer, not the cab. It took some time for the cab to clear the Tesla’s lane. Brakes applied when the tractor entered the lane would have, at the minimum, slowed the impact.

          (And the human may well have laid down in the seat.)

          While it might have been difficult to see a white trailer against a bright sky there should have been no difficulty seeing the tractor, including its black wheels. And the black wheels of the trailer.

          Do want to argue that drivers don’t see tractor trailer rigs pulling out into their lane ahead of them? A human paying attention sees this sort of large object on the move.

          A human, driving defensively, watches vehicles that might enter their lane.

          There probably needs to be object identification built into self-driving systems.

          “That is a vehicle sitting at an angle where it could come into my lane. Track any slight movement.”

          “That is a tree. Track the vehicle.”

          • KublaConn

            And where, exactly, is the truck driver in this situation that you propose? You make absolutely no reference to his responsibility or culpability. You claim the driver could’ve avoided the collision by braking, but how do you know that the truck driver didn’t fail to give him the proper time to brake by trying to squeeze the turn in where there wasn’t adequate time or space to do so? If you make a maneuver in your vehicle that is dependent upon another driver having to hit his brakes in order to avoid colliding with your vehicle, you have made a shitty, shitty maneuver and are 100% responsible for any collision that might occur. Yes, driving defensively is always smart and recommended, but failing to anticipate another driver making a shitty maneuver does not equate to blame for the collision that results from said shitty maneuver. In your comment, you said, “Had the driver been doing what is expected when using the autopilot feature, watching the road, there is an excellent chance this accident would not have occurred,” which only supports my original statement that this wasn’t an “autopilot fatality”. If the driver was inattentive and misused the feature, then this is simply another “automobile fatality” caused by poor driving habits. In order to call it an “autopilot fatality” it would have to be CAUSED by autopilot actually doing something dangerous. I know I’ve made that point before, but it seems that I need to keep bringing it up, because everyone keeps ignoring this basic logical point. This is not semantics, it is the crux of what the headline claims.

          • Bob_Wallace

            “And where, exactly, is the truck driver in this situation that you propose? You make absolutely no reference to his responsibility or cupability.”

            Oh, bullshit. Do you not understand what proportioned responsibility means?

            Now let’s talk real world reality.

            1) An 18 wheeler does not go from a stop in a turn lane to a 90° turn in seconds. An alert driver or a properly configured crash avoidance system would have detected the cab beginning to enter the Tesla’s lane. Brakes would have been applied.

            2) Driving a large truck is difficult. In a situation like this, were trucks to wait for a long break in traffic, turns would never be made (a bit of an exaggeration). Truck drivers have to be a bit aggressive at times. The truck driver would have had no way of knowing that the Tesla was on autopilot and the driver was watching a movie rather than the road.

            Tesla driver and truck driver both contributed to the crash.

            Now let’s address your bloomer bunching over the article title –

            “1st Tesla Autopilot Fatality … After 130 Million Miles”

            The car was a Tesla.

            The Tesla was in Autopilot mode.

            There was a fatality.

            First one, apparently, in 130 million miles of autopilot assisted driving.

            Hype? Hard to see any.

            Creative reading on your part? Possible.

          • KublaConn

            Wow, you’re really bending over backwards to try and find a way to absolve the truck driver of the consequences for his bad decision, aren’t you? What you’re actually doing, though, is making my case for me. How do we know that the truck driver bothered to stop? It is just as likely that the truck driver pulled the all-too-common trucker maneuver of trying to pull off a running turn in order to avoid the annoying wait that you brought up, and kind of stalled out a little as he entered the target lot, as often happens with such a maneuver. So, what you’re proposing is that it is acceptable for a truck driver to make an unsafe left turn if it means he won’t have to wait? That his “aggressive” driving somehow absolves him of culpability because it saved him a little bit of time? Tell me, does this collision even happen at all if the truck driver has the patience to wait until there is adequate time & space to make his left turn? As I said, you’re making my point for me, thanks. Yes, driving a semi is difficult, that’s why there are schools devoted to training people how to do it properly and safely and a special test and license issued before you can legally get behind the wheel of a semi. I have relatives that earn their living driving semis and I’ve ridden along a few times, I’m not ignorant of the logistics involved in the operation of a big rig. I also know there’s a difference between a “truck driver” and a “guy who happens to be driving a truck”. As for the headline, creative reading on my part? Hardly, simply me applying the well-accepted rules of usage of the English language. If someone is killed with a gun, we call it a “gun fatality”, but if someone has a gun in their hand while walking along the road and then is killed by an automobile, even if the guy fired the gun up into the air while being hit, that doesn’t mean it’s a “gun fatality”, it is an “automobile-pedestrian fatality”. I am not being overly “creative” or “stretching” when I say that, when someone says “xxxx fatality”, the “xxxx” is supposed to be what caused the fatality. The headline says “autopilot fatality”; the collision wasn’t caused by autopilot, hence, I call bullshit on the headline for being inaccurate, and seeing as autopilot is a bit of a controversial feature to many people and Tesla is also a bit of a hot-button topic, I suggest that the author is engaging in a bit of sensationalism in order to increase the clicks on his article. You know, how he or this site possibly earns revenue?

          • Bob_Wallace

            “Wow, you’re really bending over backwards to try and find a way to absolve the truck driver of the consequences for his bad decision, aren’t you?”

            No, I most certainly am not. The truck driver seems to have turned across a lane which he could not clear before the approaching traffic arrived.

            Proportional. Look up the word. The truck driver owns part of the crash.

            The Tesla driver owns the other part of the crash.

  • ROBwithaB

    Interacted with Josh via his YouTube channel.
    An officer and a gentleman.
    Condolences to his friends and family.

  • hybridbear

    He died May 7th… How did this take so long to become news??

    • ROBwithaB

      Because, unfortunately, Tesla only saw fit to make a blog post (and Elon a tweet) once the NHTSA announced a formal investigation.

      IMHO this is bad manners and bad corporate PR. Just bad humanity all round.
      A man died, and he was a good man, by all accounts. He was also an ardent fan of the company, the vehicle, and the technology. Josh was very active on YouTube, and (rarely for YouTube) was scrupulously polite in the comments, despite the trolls and provocateurs.
      If his death is a tragedy now, and his family deserving of some solace, it was more so seven weeks ago. Keeping quiet about it for almost two months comes across like a cover-up. Turning the blog post into a defense of the technology also comes across as a bit defensive and legalistic.
      They need a better communication specialist.

      I really hope they reached out to Mr Brown’s family at the time. Legalistic concerns should not be allowed to get in the way of common decency and humanity.

      • Bob_Wallace

        From the piece Zach copied in …

        “Following our standard practice, Tesla informed NHTSA about the incident immediately after it occurred.”

        Apparently these sorts of accident investigations are lower priority for the NHTSA and they only now started their investigation.

      • eveee

        It’s a sad affair. The family might have requested privacy. I would. I don’t know the details.

        • ROBwithaB

          Sure.
          The frenzy of speculation is a little unsettling. And I’m sure Tesla would not have wanted to contribute to it.
          But a brief statement of condolence at the time would have been nice.

      • hybridbear

        I’m surprised that no media outlet covered this. Our local news always reports on automobile crash deaths & almost always includes the make/model of the car. I’m surprised no one caught that someone died driving the safest car on Earth to make the story blow up.

  • Important info, sensitively presented. Thanks, Zach.

    Based on the info presented in this article, it appears that the driver was not watching the road. Drivers have been warned of overconfidence in this nascent technology.

    • Bob_Wallace

      The NPR report this morning said the driver was watching a movie on the console screen.

      • Thanks, Bob. The driver then was not following the instructions for use of this technology. Tragic. Instructive.

        • Bob_Wallace

          People have to treat current Tesla autopilot like we have to treat “dumb” cruise control.

          Cruise control will keep your car at a constant speed but it will not stop you from driving into the rear end of a slower car.

  • Karl the brewer

    Well written, thoughtful piece here on The Motley Fool –

    http://www.fool.com/investing/2016/06/30/tesla-motors-confirms-nhtsa-investigation.aspx

    Doesn’t add to what is already known but refreshingly neutral nonetheless.

    • GCO

      “Refreshingly neutral”? The author is a Tesla shareholder.

      • Karl the brewer

        I meant ‘neutral’ as in not trying to apportion blame to any of the involved parties. But yes you are correct, ‘financially’ not very neutral at all.

  • Jenny Sommer

    Now Tesla will have to recall all cars with driving assist features and retrofit them with sensors and an annoying voice to tell you to put back your hands on the wheel or the car will slow down and park.

    • Mmmm, no.

    • Matt

      Even if they determine that they want to turn the beta off for a while, it will simply be an over-the-air update. And when you go to turn it on, you will hear, “Sorry, auto drive beta is not available.” No recall needed.

    • MrL0g1c

      I recall hearing about a competitors car having a system that can detect when the driver is not paying attention and reacts when this is the case.

      Some will no doubt say that Tesla is negligent for not having a good system like this.

      Quick google, it’s the Lexus which has an eye-tracking system.

      [LINK] Driver Monitoring System [LINK]

  • Brunel

    I suppose the cameras in Tesla cars could be of better quality.

    And could be used as dashcams.

  • neroden

    The street safety community refers to roads like this as “death roads”.

    Four-lane divided roads with intersections everywhere. (Also no sidewalks, crosswalks, or traffic lights in this case).

    This road seems to have a 65 mph speed limit. It is not safe to drive at 65 mph on this kind of road, thanks to the huge numbers of intersections. But it *looks deceptively safe*.

    I’m pretty much certain that the Tesla driver was driving too fast for road conditions. Unfortunately that’s probably normal on this road.

    It’s also possible that the truck driver made a turn when the Tesla was a bit too close. But he’s got to turn *somehow* and can’t wait for every car he sees out of the corner of his eye in the distance. When I have waited properly and cautiously to turn across a road like this, I’ve sometimes waited for 10-15 minutes to find a break in traffic.

    I hate roads like this. They’re dangerous by design.

    • JamesWimberley

      Yes. Drivers feel they are on a limited-access highway when they are not. The main N10 road south from Bordeaux to Biarritz and Spain used to be like this, with enormous logging trucks driving across from side roads. It’s been upgraded to proper limited access now.

    • ROBwithaB

      Yep.
      I have travelled a road like this back and forth to the office every day for the past decade and a half. (Except this one is probably a lot curvier and hillier than anything in Florida.)
      I have seen a lot of mangled steel, and many dead people.

      The “accidents” are entirely predictable and entirely preventable.
      Road design is the taboo subject when governments talk about traffic safety. Let’s just fine people for “speeding”…

      • MrL0g1c

        Road design is the taboo subject when governments talk about traffic safety.

        Doesn’t seem to be the case in the UK; the highways agency typically does a lot to design out dangerous roads*

        *For motor vehicles but not for cyclists, pedestrians etc.

    • newnodm

      I’ve seen semis turn on roads like this with the expectation of slowing/stopping traffic. If the Tesla hit the trailer well aft of the cab, a driver paying attention would have had time to react.
      Note the Tesla had daytime running lights, and the truck driver was apparently looking away from the sun. He likely saw the car.

      • JonathanMaddox

        He said he saw the car. He also said he saw the car change lanes, as though *following* the trailer not driving around the back of it. Sounds to me like the car’s sensors may have confused the cab and rear wheels for two separate vehicles.

  • mikgigs

    How could you be hit perpendicularly on a divided highway? Could somebody explain the trajectory of the crash?

    • neroden

      This road has intersections every couple of hundred feet.

      It’s dangerous road design.

      • Matt

        Yes, it is divided, so it limits head-on collisions. But you still have to watch for cross traffic. It is not a controlled-access road.

    • Steven F

      The truck driver pulled into a left turn lane. When he made his turn, the Tesla hit and plowed under the trailer. The top of the car was sheared off. In Europe and many other countries, trucks are required to have bumpers to prevent a car from going under the trailer. However, that is not the case in the US.

      • mikgigs

        But then it is not a divided highway with permitted high speed. From what I understand from you, it is a normal road where a left turn is allowed. Am I right? What is the permitted speed on such roads? Sounds like a really peculiar accident…

        • Bob_Wallace

          I watched a guy get killed on a highway like this. Four lanes divided by a wide grassy median. But cross roads with stop signs.

          He was directly in front of me. He stopped and, when traffic cleared, he drove to the median and stopped (as marked). But for some reason he started up and pulled out in front of a car coming from his passenger’s side. Was T-boned.

          • AaronD12

            I, too, have witnessed such an accident. A truck not paying attention to crossing traffic made a car and its passengers into convertibles. It’s not a sight I will ever forget.

          • ROBwithaB

            Sorry.
            I know the feeling.
            Some stuff cannot be unseen.

          • Adrian

            In “modern” cars with stout A-pillars, it is very easy to miss seeing cross-traffic vehicles approaching from the side. I have very nearly pulled out in front of motorcycles, cars, and even a semi truck, and had my wife shout “stop” when approaching a crosswalk once, when a pedestrian was completely hidden by the A-pillar. Solving one problem (poor rollover performance) has created another (bad outward visibility).

            I take a good long look from two angles before pulling out these days…

          • eveee

            I watched a motorcyclist slam into the side of an i3 that decided to leave a stopped left hand lane, cross the fast lane, and go over the median making a highly illegal U turn. I never want to see that again and I wish I could erase that from my memory forever, but I can’t. He couldn’t have been more than 20 ft away. Sickening.

          • Bob_Wallace

            My story is kind of long, this was my first day back at driving after almost being killed in my first serious drunk driver caused crash. Forty years later I have a vivid image of the driver being ejected out his door, flying in a prone position, and slamming into another vehicle. I’ll leave out the splatter details.

            I went into shock (literally). I remember being frozen in place, cars honking and pulling around me. Being escorted to the side of the road until I recovered.

          • eveee

            Yes. Accidents do that to people. Shock. Even if you are not the one. It’s like war. I am real careful now. Trying not to be so affected that it immobilizes me. It’s a reminder not to take unnecessary chances. And keep an eye out for people who are.

          • JonathanMaddox

            That’s the sort of comment one can’t upvote. Sorry this happened, and thank you for sharing the story.

        • Steven F

          In the US there is a distinction between an interstate and a highway. All interstates are divided roads without intersections, stop lights, or stop signs; there are only on-ramps and off-ramps, with cross traffic on overpasses or underpasses. Interstates travel east-west or north-south and typically cross multiple states. Speed limits are 55 and up to 70 or 75 in many places.

          A highway is allowed to have intersections, traffic lights, and left turns. It may be divided or not. Highways also travel generally north-south or east-west. Speed limits in towns are generally 25 to 45 MPH. In rural areas they may be as fast as an interstate. From what I have read elsewhere, that stretch of highway had a 65 MPH speed limit.

          Highways were built decades before the interstates.

      • Matt

        Even if the truck just had the air skirts (for improved gas mileage), either the Tesla or the driver might have “seen” it. The story doesn’t say who was faulted. If the truck was making a left and was hit at high speed, my guess (based on no real information) would be the truck driver.

        • harisA

          If the Tesla was going faster than the posted limit, then it is at fault, as the truck would have cleared the intersection.

          The airbag deployment sensor will have that information and it may become public.

          • KublaConn

            You are incorrect. The driver of the truck is responsible for accurately assessing the traffic around him and the speed they are actually going to determine if he has the time & space to safely perform his left turn. The posted speed limit has nothing to do with that judgement, and exceeding the speed limit doesn’t just magically absolve someone of any culpability for failing to properly assess the approach speed of oncoming vehicles. If any maneuver you make in your vehicle is dependent upon the other driver having to hit his brakes to avoid colliding with your vehicle, you have made a shitty, shitty maneuver.

          • Mike

            Agreed. But…..that truck driver has probably executed the same type of move where “the other guy will slow down because I’m a big truck” many times in the past with no ill effects. So he wouldn’t expect ill effects this time around either.

          • KublaConn

            So, in other words, the truck driver may have pulled off unsafe turns in the past, but this time, there were consequences for his bad decision. When you play Russian roulette, after seven clicks of the hammer with your brains still intact, it’s not the gun’s fault when the eighth click results in your brains being splattered all over the wall, it’s yours for being foolish enough to keep taking the risk.

          • Steven F

            ” If any maneuver you make in your vehicle is dependent upon the other driver having to hit his brakes to avoid colliding with your vehicle, you have made a shitty, shitty maneuver.”

            Sadly, I have seen this behavior in truck drivers a lot. I have seen trucks roll up to a left-hand-turn stop sign, quickly look both ways, and then continue coasting through. Other times they make the turn at 5 to 10 miles per hour. In some cases the light turned red while they tried to accelerate their overloaded truck. Other times I have seen a truck not stop on a yellow light and enter the intersection on a red light. I then caught up with him at the next light. And then at the third, a yellow, I stopped while he went through on the red.

            Some truck drivers do some very dangerous things at intersections because they don’t want to stop.

          • Steven F

            The Tesla autopilot reads road signs. When it is on, it always drives at the posted speed limit. The Tesla was not speeding.

          • harisA

            A valid point!

          • Bob_Wallace

            There are conflicting claims. Apparently on divided roads autopilot allows speeding up to about 90 MPH.

            Look through the comments here and on the other autopilot threads and you should find the copy from Tesla instructions.

      • ROBwithaB

        I’ve always wondered why side impact barriers for large trailers are not mandatory everywhere. And side skirts.
        So cheap and easy to do.

      • eveee

        Not exactly. Trailers are mandated to have bumpers that are lower, but only in the rear, not on the sides. It’s called the Jayne Mansfield bar because that’s how she tragically died.
        The car must have hit the trailer from underneath, as described. That’s the only way the top could have been sheared off in the collision.

    • Brunel

      Yep. We really need news channels to do CGI animations again to explain road accidents.

      I suppose this is why Tomo News is so popular.

  • Ninjaneerd

    Very sad news. My immediate reaction (without educating myself beyond this article) is that this accident wouldn’t have occurred if the tractor trailer had autopilot.

    • Bob_Wallace

      We should reach a point later on where all vehicles in an area communicate with each other. But it may take 20 to 30 years to flush out all the ‘dumb’ machines.

      • Ninjaneerd

        Yup! I really like the idea of cars talking to each other about what they see individually so as a sum they can react to situations a human on their own would never know.

        Getting rid of all meat based processors would be nice too.

        • ROBwithaB

          Some of the meat based processors still function pretty effectively. Evolution is a very effective process for refining biological systems.
          In my case, about a million km without an incident, let alone an injury. Touch wood 🙂

          • Ninjaneerd

            Totally agree! I’m not one to claim I’m always better than the average driver, like the majority of the population believes (which obviously can’t be true). I am certainly prone to distracted driving and occasionally sleepy driving. During those times especially, I would benefit from autonomous driving, and everyone on the road around me would as well.

            Evolution has given the modern human meat processor 200,000 years to hone the skills (hand-eye coordination, trajectory calculation, etc.) to drive. Unfortunately evolution doesn’t work fast enough to optimize it now and we kill 30-40 thousand people per year in the US alone. Unlike evolution, the entire Tesla fleet can learn from this almost instantly.

          • JonathanMaddox

            True, but it’s not evolution that sat us on top of wheels at 100km/h.

    • Yeah… that didn’t even cross my mind and I haven’t seen anyone else bring it up.

    • Or the trailer had sideskirts. These are common here to prevent cyclists from ending up under the wheels. Happens regularly when a truck takes a right turn and fails to see the cyclist on the adjacent bicycle lane.

      • ROBwithaB

        Yes.
        Side skirts.
        And side impact bars.
        Common in many parts of the world.

        Sometimes it is impossible to avoid an impact, for whatever reason. To have a fighting chance, you want to give the crumple zones a chance to do their work, rather than expecting the roof pillars to absorb all that energy.

      • Ninjaneerd

        Interesting, I don’t think I’ve ever seen a sideskirt (or noticed?). Are they rigid?

        • Robert Engle

          Semi-rigid. On highway driving they redirect airflow.

        • Steven F

          Many sideskirts are strictly to reduce air friction and are simply thin fiberglass that will not stop a car. In other cases they are designed to stop cars, like a bumper. In the US, side trailer bumpers are not required. As a result, thin, weak fiberglass panels are typically used.

          • nakedChimp

            Even these would have worked and reflected the radar of the Tesla..

  • JR250

    Very sad news indeed. Let’s hope that the incident is put in perspective: autopilot was not the cause of the accident, it just couldn’t avoid it.

    • Bob_Wallace

      Not the cause, but there is a hole in the system.

      Unfortunately there are some situations so strange that the system designers shouldn’t be expected to imagine them.

      I wonder why the system did not detect the cab and wheels. And I wonder what sensing systems the car uses. Is there no radar? (I suppose there’s even a chance of system failure. We’ll have to wait for the investigation.)

      • Steven F

        The car has cameras for lane detection and sign reading, a GPS, and a radar. The radar is located low on the front of the car. The radar might have seen the gap under the trailer and concluded it was safe. Or the radar signal may have bounced away from the car instead of bouncing back. If the radar signal doesn’t bounce back to the car, the radar doesn’t see anything.

        • JamesWimberley

          Google’s pilot self-driving cars have a very capable radar. But it’s much more expensive: IIRC around $10,000.

          • Bob_Wallace

            There seem to be much cheaper solutions.

            “Velodyne LiDAR introduced the ULTRA Puck VLP-32A, combining best-in-class 32-channel performance with a small form factor and high reliability, at the 2016 SAE World Congress in Detroit. The ULTRA Puck VLP-32A is the company’s most advanced LiDAR sensor to date, delivering high performance at a cost-effective price point of around $500 at automotive-scale production.

            The ULTRA Puck doubles the range and resolution (via number of laser channels) of its predecessor to 200 meters and 32 channels, providing enhanced resolution to identify objects easily. The 32 channels in the ULTRA Puck are deployed over a vertical field of view of 28° and are configured in a unique pattern to provide improved resolution in the horizon to be even more useful for automotive applications. By contrast, the earlier unit used equidistant, 2˚ spacing of the channels.

            In keeping with Velodyne’s patented 360° surround view, the ULTRA Puck is able to see all things all the time to provide real-time 3D data to enable autonomous vehicles. What separates the ULTRA Puck from its predecessors and other LiDAR sensors on the market is a design created expressly for the exacting requirements of the automotive industry.”

            http://www.greencarcongress.com/2016/04/velodyne-lidar-introduces-32-channel-ultra-puck-vlp-32a-high-definition-real-time-3d-lidar-.html

            http://www.greencarcongress.com/2016/04/20160413-rand.html

            “As part of its LiDAR sensor development, Ford has tested Fusion Hybrid autonomous research vehicles in complete darkness without headlights on desert roads, demonstrating the capability to perform beyond the limits of human drivers.

            Driving in pitch black at Ford Arizona Proving Ground marks the next step on the company’s efforts to delivering fully autonomous vehicles. The development shows that even without cameras, which rely on light, Ford’s LiDAR (units from Velodyne), working with the car’s virtual driver software, is robust enough to steer flawlessly around winding roads. While it’s ideal to have all three modes of sensors—radar, cameras and LiDAR—the latter can function independently on roads without stoplights.

            National Highway Traffic Safety Administration data has found the passenger vehicle occupant fatality rate during dark hours to be about three times higher than the daytime rate.”

            http://www.greencarcongress.com/2016/04/20160411-ford.html

          • Mike

            Thanks for these links.

        • Yeah, and the biggest issue is that the system has to be precise enough to distinguish between a (low) bridge and a (high) trailer. An autopilot slamming on the brakes for no good reason is a danger too.

          • Bob_Wallace

            It’s possible to build in object identification. Inexpensive digital cameras can automatically focus on faces in a scene and wait to shoot until the subject is smiling. (And eyes are open?)

            The Tesla system needs to be able to distinguish between objects capable of movement and not capable (extremely unlikely to move, like a building).

            A disagreement between visual/camera input and radar input should put the car into a defensive mode, slowing enough to allow emergency braking (plus a safety factor) and alerting the driver. If the driver fails to give an ‘all clear’ signal then the car should slow further or stop.

            Future self-driving cars should have two features:

            1) A panic button that would allow passengers to stop the car.

            2) An “I’m confused” mode in which the car stops on the side of the road until the confusion is cleared up.

            The confusion might be cleared up by info from surrounding cars or by a human who could do a visual inspection via multiple car cameras. Or eventually by a central computer that could grab info from all cars in the area and “look” at the problem from different angles. A central control computer could store huge numbers of object examples.
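
            A minimal sketch of that fallback logic (Python; the mode names and triggers are my own assumptions, not anything Tesla has described):

            ```python
            from enum import Enum, auto

            class Mode(Enum):
                NORMAL = auto()
                DEFENSIVE = auto()  # slow to a speed that allows emergency braking
                STOPPED = auto()    # "I'm confused": pull over until resolved

            def next_mode(current, camera_obstacle, radar_obstacle, driver_all_clear):
                """One step of the hypothetical policy: sensor disagreement drops
                the car into a defensive mode; if the driver never signals 'all
                clear', the car escalates to a roadside stop."""
                if camera_obstacle == radar_obstacle:
                    return Mode.NORMAL        # sensors agree: carry on
                if driver_all_clear:
                    return Mode.NORMAL        # human vouched for the scene
                if current is Mode.NORMAL:
                    return Mode.DEFENSIVE     # slow down and alert the driver
                return Mode.STOPPED           # still unresolved: stop the car
            ```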

      • Julian Cox

        I feel badly for Elon. It is quite a responsibility to have potentially been in a position to do something about this one incident, a burden that, oddly, does not fall so hard on those who, despite hundreds of thousands of deaths, have put themselves in a position to do nothing.

        Apparently the radar saw this truck but could not corroborate it with the camera, because the side of the truck was white and as bright as the sky.

        Because the road was clear underneath the obstacle, the radar took it for an overhead road sign. The system is designed to avoid triggering emergency braking for overhead road signs.

        The driver apparently did not respond defensively to any of what was going on outside the windscreen, and apparently the truck driver drove across the road in a way that required oncoming traffic to brake to survive. Autopilot is limited to within 5 mph of the road speed limit. Unless there was some other road signal that the car driver missed, this would appear to have been the fault of the truck driver – driving in front of a car travelling at road-legal speeds and forcing it to brake or else.

        It will not be a major adjustment to the radar system to eliminate this in future, simply by causing the radar to assess the height of overhead objects from the ground.
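
        In code terms, that adjustment could be as small as a clearance test on each radar return. A hypothetical sketch (the threshold and field names are invented for illustration):

        ```python
        from dataclasses import dataclass

        @dataclass
        class RadarReturn:
            range_m: float          # distance to the object ahead
            bottom_height_m: float  # estimated height of the object's lower edge

        def should_brake(ret: RadarReturn, min_clearance_m: float = 4.5) -> bool:
            """Treat a return as an ignorable overhead sign/overpass only if its
            lower edge clears the car by a safe margin. A trailer floor at
            ~1.2 m fails the test; a sign gantry at 5+ m passes."""
            return ret.bottom_height_m < min_clearance_m
        ```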

        As Bob pointed out, the ability to make some sense of this loss in terms of improved safety for future road users exists with Autopilot in a way that is not possible in the absence of Autopilot.

      • Bob_Wallace

        Here’s an article which discusses the problem of finding the potential “holes”.

        “According to the National Highway Traffic Safety Administration, more than 90% of automobile crashes are caused by human errors such as driving too fast, as well as alcohol impairment, distraction and fatigue. Autonomous vehicles are never drunk, distracted or tired; these factors are involved in 41%, 10% and 2.5% of all fatal crashes, respectively.”

        “In 2013, there were 2.3 million injuries reported, which is a failure rate of 77 injuries per 100 million miles driven. The related 32,719 fatalities correspond to a failure rate of about 1 fatality per 100 million miles driven.”
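
        (Those two quoted rates are mutually consistent; a quick back-of-envelope check in Python, using only the figures quoted above:)

        ```python
        # Back-of-envelope check of the quoted 2013 figures.
        injuries, fatalities = 2_300_000, 32_719
        miles = injuries / 77 * 100e6        # 77 injuries per 100M miles
        print(f"implied miles driven: {miles:.2e}")                        # ~2.99e12
        print(f"fatalities per 100M miles: {fatalities/miles*100e6:.2f}")  # ~1.10
        ```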

        Take away the drunk/speeding/distracted accidents and the rate of “other cause” accidents is very low.

        “Autonomous vehicles would have to be driven hundreds of millions of miles and, under some scenarios, hundreds of billions of miles to create enough data to clearly demonstrate their safety, according to a new report from RAND, a non-profit research organization.”

        http://www.greencarcongress.com/2016/04/20160413-rand.html

        Self-driving cars might never be 100% safe. But by eliminating the drunk/speeding/distracted stuff we should be able to drop the rate of accidents by 10x. That is an incredible improvement.

        Now the task is to whittle down the last 10% and approach zero.

  • phineasjw

    Horrible news. The man who died was a Navy SEAL and Tesla enthusiast who had previously posted a number of autopilot videos to YouTube.

    The accident happened in Florida, on May 7th. https://www.levyjournalonline.com/police-beat.html

    • Thanks. Yes, very sad. I linked to his YouTube channel and related stories as well in the update. Just seemed like a really nice guy.

      • Jenny Sommer

        Wasn’t that the same guy that posted the last video with the truck?

        It could also mean that the probability of getting killed when using the driving assist feature regularly is quite high.

        • Yep. You can go straight to his YouTube page via the link in my update. Published over a dozen Autopilot videos.
