The US National Transportation Safety Board (NTSB) has outlined the events leading up to the fatal collision between an autonomous Uber vehicle and a pedestrian on 18 March.
The board states in its preliminary report, which does not give a probable cause of the incident, that the pedestrian was “dressed in dark clothing, did not look in the direction of the vehicle until just before impact and crossed the road in a section not directly illuminated by lighting”, as well as “the pedestrian was pushing a bicycle that did not have side reflectors, and the front and rear reflectors, along with the forward headlamp, were perpendicular to the path of the oncoming vehicle.”
Toxicology tests on the pedestrian returned positive results for methamphetamine and cannabis, and she crossed in an area where signage warned pedestrians to use a designated crossing 360 feet away from the crash site.
The Volvo XC90 test vehicle that struck the pedestrian was equipped with Uber’s autonomous driving systems, which disable the car’s factory-fit automatic emergency braking function when the car is under computer control, in order to reduce erratic behaviour while the systems are under test.
The pedestrian was detected by Uber’s systems six seconds before impact, and the car determined that emergency braking was required 1.3 seconds before impact. No faults or diagnostic messages were present at the time of the crash.
“The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator,” says the NTSB.
The operator moved the steering wheel less than one second before impact and braked less than one second after it. She told the NTSB that she had been monitoring the self-driving interface.
An Uber spokesman has been approached for comment.
Uber shut down its autonomous vehicle testing programme in the US state of Arizona two months after the accident. Following Arizona governor Doug Ducey's decision to suspend Uber's testing rights in the state, the company has now officially halted the programme. No timeframe has been set for when testing will resume; for now, Uber's focus will be on smaller programmes in Pennsylvania and California.
Uber has come under fire as the operator of the first autonomous vehicle to have been involved in a fatal incident. In a previous letter addressed to Uber CEO Dara Khosrowshahi, Ducey described a video of the incident, in which the self-driving car crashed into a 49-year-old woman, as "disturbing and alarming" and said that it "raises many questions about the ability to continue testing in Arizona".
"As governor, my top priority is public safety," Ducey wrote. "In the best interests of the people of my state, I have directed the Arizona Department of Transportation to suspend Uber's ability to test and operate autonomous vehicles on Arizona's public roadways."
The driver was monitoring the self-driving interface
Volvo and Uber have checked the footage independently. Uber admitted the Volvo systems had been disabled; Volvo ran simulations using the footage with their systems enabled, and it did detect and react to the woman. https://www.pcmag.com/news/361399/uber-car-involved-in-fatal-crash-had-braking-system-disabled
I'm pretty sure I heard that in the Uber case the woman appeared from behind something and walked straight out into the road, and whoever or whatever was in control would still have hit her. I wonder how many people got hit by cars that day that were driven by humans? I'd guess a lot
d79m wrote:
Well you heard a load of shit then. Instead of believing whatever idiots told you that bollocks, why didn't you have a look at the video for yourself before repeating it and making yourself look as big a buffoon as they are? You'd have seen that the woman who was killed had walked out of nothing other than a wide expanse of empty tarmac and was utterly blameless for the awful event which befell her.
That would have been a wiser choice than being the ignorant knobhead who insults the memory of an innocent dead woman.
bowsersheepdog wrote:
The report clearly states that the pedestrian walked onto the road where there were no traffic lights nor any pedestrian crossing, was intoxicated and, despite it being dark with no lighting, was not wearing reflective clothing... talk about ignorance!
By the way, I'm not in any way defending how the Uber car behaved, as it is dumbfounding that the auto brake was disabled in an autonomous test vehicle! However, a human driver would not have been able to brake either under similar circumstances, so I do not think this will play against the development of autonomous vehicles; au contraire, such unfortunate incidents will most probably advance the protocols for running autonomous vehicles and make them more dependable.
volvocu wrote:
It isn't an offence to cross the road where there are no traffic lights or other crossing facility. Nor is it against the law for somebody who has been drinking to cross the road. Neither is walking at night without reflective clothing an illegal act. So of what exactly am I allegedly ignorant? Are you suggesting that if she had been drinking there is an entitlement to run her over? My previous assertion that the woman was crossing a wide, clear piece of road in full view of the vehicle is entirely correct, and the onus was on the driver of the vehicle, human or otherwise, to ensure the safety of pedestrians legally crossing the road it was using.