Government officials said the main cause of a fatal crash involving one of Uber’s self-driving cars in Tempe, Arizona, was the vehicle operator who failed to monitor the road and was distracted by her cell phone. Contributing to the crash was Uber’s “inadequate safety culture,” said the National Transportation Safety Board.
The ruling, announced Tuesday, was a year and a half in the making. Investigators had been working since March 2018 to determine why one of Uber’s autonomous vehicles failed to detect a woman crossing the street.
Along with announcing probable cause of the crash, the NTSB also released 19 findings on the accident and a series of recommendations to make self-driving cars safer for city streets.
“Ultimately, it will be the public that accepts or rejects automated driving systems, and the testing of such systems on public roads,” said NTSB chair Robert Sumwalt. “Any company’s crash affects the public’s confidence. Anybody’s crash is everybody’s crash.”
Uber didn’t immediately respond to a request for comment.
This story is developing…