U.S. safety board to probe self-driving shuttle crash in Las Vegas

The first public self-driving shuttle, launched as a pilot project sponsored by AAA and Keolis, is shown in downtown Las Vegas, Nevada, November 10, 2017. Courtesy of AAA/Keolis/Handout via REUTERS

November 10, 2017

By David Shepardson

WASHINGTON (Reuters) – Federal transportation safety officials headed to Las Vegas on Friday to investigate this week’s collision between a truck and a self-driving shuttle bus on its first day of service, a crash that has been blamed on human error.

The U.S. National Transportation Safety Board, which has the power to issue safety recommendations and determines probable causes of crashes, wants to learn more about “how self-driving vehicles interact with their environment and the other human-driven vehicles around them,” said NTSB spokesman Christopher O’Neil.

There have been other crashes involving self-driving vehicles but this was the first involving a self-driving vehicle operating in public service, O’Neil said. Four NTSB investigators were expected to arrive in Las Vegas on Friday.

The Navya Arma, an autonomous and electric vehicle operated by Keolis North America, went into service on Wednesday. A few hours later, a delivery truck backed into the stopped shuttle, according to a reporter on the shuttle and one of its sponsor companies.

Las Vegas police issued the truck driver a ticket, the city government said in a blog post. The shuttle sustained minor damage to its front end, including a crumpled fender, and resumed service on Thursday.

“The shuttle did what it was supposed to do, in that its sensors registered the truck and the shuttle stopped,” the city said.

The American Automobile Association (AAA) of Southern Nevada, one of the shuttle’s sponsors, said it would assist the safety board’s investigation.

“Working together and sharing information will ensure this new technology is safely implemented for the public, and that’s AAA’s top priority,” the organization said in a statement.

The shuttle is also sponsored by the city of Las Vegas, Keolis North America and the Regional Transportation Commission of Southern Nevada.

Reporter Jeff Zurschmeide, who was on the shuttle at the time of the crash, said the self-driving vehicle did what it was programmed to do but not everything a human driver might do.

“That’s a critical point,” Zurschmeide wrote on digitaltrends.com. “We had about 20 feet of empty street behind us (I looked), and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck. Or at least leaned on the horn and made our presence harder to miss.”

The crash follows a rising number of incidents involving human drivers behaving improperly or recklessly around self-driving cars. There have been 12 crashes in California alone since Sept. 8 involving General Motors Co’s self-driving unit, Cruise Automation. All were the fault of human drivers in other vehicles, GM told regulators.

The NTSB investigated a May 2016 crash of a Tesla Inc Model S that killed a driver using the vehicle’s semi-autonomous “Autopilot” system. In September, the board recommended that auto safety regulators and automakers take steps to ensure that semi-autonomous systems were not misused.

(Reporting by David Shepardson; Editing by Colleen Jenkins and Frances Kerry)

42 Comments on "U.S. safety board to probe self-driving shuttle crash in Las Vegas"

  1. Michael Hawk | November 15, 2017 at 7:49 am |

    Here is a thought: when someone gets hit or possibly killed, who do you hold LEGALLY accountable? Who gets sued?

  2. It is good that this is being investigated. I, like most, am skeptical about vehicles without human drivers. But it is wrong to assume that the computer should have done this or that. It is also wrong to automatically blame something that has no voice and cannot defend itself.

  3. Attritionist | November 12, 2017 at 7:32 pm |

    So it’s the SD car’s fault for getting hit? Let’s not get crazy, folks. SD cars are a stepping stone to future travel & automation. I for one am proud of the American ingenuity involved in blazing the trail on this new tech.

  4. I wonder how long it will be before we start to hear about deaths caused by these computer-driven machines.

  5. Humans cannot anticipate all the stupid things other humans can do. How can someone expect a computer to do it?

  6. Oh boy! The lawyers are going to have a field day with all the lawsuits that will result from self-driving vehicles. I can only say if this goes nationwide it’s going to be a madhouse, but maybe that’s what they want.

  7. SendThemPacking | November 12, 2017 at 7:39 am |

    “The crash follows a rising number of incidents involving human drivers behaving improperly or recklessly around self-driving cars. There have been 12 crashes in California alone since Sept. 8 involving General Motors Co’s self-driving unit, Cruise Automation. All were the fault of human drivers in other vehicles, GM told regulators.” And when this happens with driver vs. driver, what happens? I see this all the time, where the other driver avoids the “stupid” driver’s bad move. If the autonomous vehicles can do it “all,” why can’t they seem to drive defensively?

    • The insurance companies are behind this because for decades they have worked to decriminalize the result of reckless driving so they can raise premiums to astronomical levels and make more profit.

      Holding the stupid drivers accountable under the law will seriously lower these incidents.

    • 808Americans | November 12, 2017 at 2:54 pm |

      Interesting.

      Wondering how many scam artists have made self-driving cars their next “pinch.”

  8. Self-driving cars will eventually take our driving freedom away.

    • 808Americans | November 12, 2017 at 3:00 pm |

      A valid opinion.

      However, I think it will be the government(s) that takes those rights away.
      Humans cannot be trusted, you see. (sarc)

      (Granted this may be your point)

      It will start in the cities.
      Then the states.
      Then the last grab from the Feds.

      All the while, insurance companies will be fighting and struggling either to stop it outright or to lobby to be the providers of all those “new” driver-less policies.

      • Driving is not a right; it is a privilege. I remember something like that in the manual when I was learning to drive. I agree with both of you, and the slippery slope brought up by 808Americans scares me.

  9. We are being taught to drive like a computer. It is not going to work. You cannot intermingle with a computer.

  10. What if one of these “self-driving” cars ran a red light, drove in the wrong lane, or did something else that humans do all the time, and there is no “safety human” – who gets the citation and who signs it?

  11. landy fincannon | November 11, 2017 at 12:03 pm |

    Autonomous vehicles will put a lot of people out of work. They will seek revenge.

  12. The libtards are going to push to ban self driving vehicles because real humans do stupid acts and cause accidents.

  13. The only way I’ll trust a self driving car is when I’m too old and mentally gone to care.

  14. “The crash follows a rising number of incidents involving human drivers behaving improperly or recklessly around self-driving cars. There have been 12 crashes in California alone since Sept. 8 involving General Motors Co’s self-driving unit, Cruise Automation. All were the fault of human drivers in other vehicles, GM told regulators.”

    HAHAHHAHAHAHHAHAHAHAH……..12 crashes and ALL were the fault of the human drivers???
    My bloody bleeding arsehole…..what bullshit is GM trying to blow up the sphincter here???

  15. It Was Too Busy Looking For Peterbilt, Its Daddy!

  16. ...remain calm and return fire | November 10, 2017 at 3:31 pm |

    the geniuses claim human error has been eliminated from driver-less vehicles, but humans write the code that guides self-driving vehicles……smoke and mirrors…

    • intimeforthedime | November 10, 2017 at 7:09 pm |

      100% true words.
      Instead of the blame being on someone you can sue or get your damages from (the other driver),
      you will now have NO other driver to sue or get your damages from.
      The wreck and your broken neck is…….nobody’s fault. But you must keep your insurance up to date.
      THAT IS WHY AAA IS SPONSORING THIS THING!

    • That’s not the human error they are referring to but not many will truly understand that.

      • ...remain calm and return fire | November 11, 2017 at 5:24 am |

        these geniuses think they can solve all human error, and worse, they believe that humans are the error…except for themselves, of course…

    • The code writers are probably from another country.

    • They also write the code that will turn them into self-driving weapons of mass destruction.

  17. What is this thing going to do when an ambulance or hook and ladder comes at it with sirens blaring?
    Stop in its tracks, blocking the emergency vehicle, or continue on its course and heading? They should not be on any road in any town until they can answer all sorts of what-ifs! I don’t like seeing AAA’s logo on it either. Their insurance requires you to carry a speed and event monitoring device in your vehicle. They may have just lost this customer of 45 years!

  18. intimeforthedime | November 10, 2017 at 1:50 pm |

    The passenger on board during the wreck said it best: “most human drivers would have thrown the car into reverse and used some of that space to get away from the truck.”

    And there is our problem, folks: the program will NEVER be written to do anything that puts the auto car in a risky situation where the car does anything “incorrectly.” What was the risk? Going in reverse on the street is a huge no-no, so the programmers never let the car make that decision.

    So when an 18-wheeler is barreling down on that auto car, the program makes the decision to do nothing and take the hit like a frog in the headlights.

    Because in the end, all that matters is that it is not the auto car’s fault. You’re dead (due to a technicality), but the auto car manufacturer is safe from legal harm.

  19. Sergeant_rock | November 10, 2017 at 1:26 pm |

    Self-driving vehicles are simply “accidents” looking for a place to happen…
    Like the bumper sticker I saw today: “to err is human, to really screw it up takes a computer.”

    If you trust your computer, then a self-driving car may be right for you….
    Personally, I don’t trust computers; they screw up too much and can be hacked…

    • intimeforthedime | November 10, 2017 at 7:09 pm |

      I say it was the auto-car’s fault for not doing the prudent thing and backing the hell up to avoid the collision.

    • Methinks they’re just more of a way to control people and enlarge government with more “safety bureaus” et al.
      Imagine you’re a government and a good majority of your people are in vehicles you can control en masse. Suddenly you decide that all those people/sheep are finally going to act out against something you’re about to implement; what better way to get people to a destination to be handled than to lock the doors and set the vehicle’s navigation system to the destination of your choosing?
      Ever wonder why the FEMA camps are located where they are?

  20. wildfire1944 | November 10, 2017 at 1:08 pm |

    No driver… what could possibly go wrong? Oh wait…

    • 808Americans | November 13, 2017 at 9:34 am |

      I can see the first “hack” of the driver-less systems now.

      Tech is wonderful but too much reliance on it is a disaster.
      IMO.

  21. They *have* to blame every crash where an autonomous vehicle is involved on human error… after all, computers are infallible, right?
