WASHINGTON (Reuters) – Federal transportation safety officials headed to Las Vegas on Friday to investigate a collision this week between a truck and a self-driving shuttle bus on its first day of service, a crash blamed on human error.
The U.S. National Transportation Safety Board, which has the power to issue safety recommendations and determines probable causes of crashes, wants to learn more about “how self-driving vehicles interact with their environment and the other human-driven vehicles around them,” said NTSB spokesman Christopher O’Neil.
There have been other crashes involving self-driving vehicles, but this was the first involving one operating in public service, O’Neil said. Four NTSB investigators were expected to arrive in Las Vegas on Friday.
The Navya Arma, an autonomous and electric vehicle operated by Keolis North America, went into service on Wednesday. A few hours later, a delivery truck backed into the stopped shuttle, according to a reporter on the shuttle and one of its sponsor companies.
Las Vegas police issued the truck driver a ticket, the city government said in a blog post. The shuttle sustained minor damage, including a crumpled front fender, and resumed service on Thursday.
“The shuttle did what it was supposed to do, in that its sensors registered the truck and the shuttle stopped,” the city said.
The American Automobile Association (AAA) of Southern Nevada, one of the shuttle’s sponsors, said it would assist the safety board’s investigation.
“Working together and sharing information will ensure this new technology is safely implemented for the public, and that’s AAA’s top priority,” the organization said in a statement.
The shuttle is also sponsored by the city of Las Vegas, Keolis North America and the Regional Transportation Commission of Southern Nevada.
Reporter Jeff Zurschmeide, who was on the shuttle at the time of the crash, said the self-driving vehicle did what it was programmed to do, but not everything a human driver might do.
“That’s a critical point,” Zurschmeide wrote on digitaltrends.com. “We had about 20 feet of empty street behind us (I looked), and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck. Or at least leaned on the horn and made our presence harder to miss.”
The crash follows a rising number of incidents involving human drivers behaving improperly or recklessly around self-driving cars. There have been 12 crashes in California alone since Sept. 8 involving General Motors Co’s self-driving unit, Cruise Automation. All were the fault of human drivers in other vehicles, GM told regulators.
The NTSB investigated a May 2016 crash of a Tesla Inc Model S that killed a driver using the vehicle’s semi-autonomous “Autopilot” system. In September, the board recommended that auto safety regulators and automakers take steps to ensure that semi-autonomous systems were not misused.