NTSB Hearing Blames Humans, Software And Policy For Fatal Uber Robocar Crash – But Mostly Humans
The National Transportation Safety Board presented findings on the fatal crash involving pedestrian Elaine Herzberg and an Uber test robocar earlier this week. The board determined that the crash resulted from the safety driver watching a streaming video show and the impaired state of the pedestrian. At the time of the crash, Herzberg had more than ten times the medicinal dose of methamphetamine in her system, as well as marijuana residue. The safety driver, who had driven the stretch of road 73 times, had used her phone while driving to watch a video. The actions of both the automated driving system and the vehicle's human operator point to a deeper problem: the safety culture that existed at Uber at the time of the crash.
The board did not name the technology failure as the principal cause of the crash, since all self-driving vehicles under test have flaws that could lead to a crash with a negligent driver. According to video evidence, the safety driver had looked down at her phone for 34% of the trip, including the 5-second span from 6 to 1 seconds before impact. Earlier reports stated that Uber's system was unable to classify a moving object as a pedestrian outside of a crosswalk; however, this information is false. The system may instead have been unable to recognize Herzberg as a pedestrian because she was walking a bicycle. Uber ATG's safety culture has been controversial since the deadly crash: according to reports, the company had a poor framework for risk mitigation and weak oversight of vehicle operators. As of right now, Uber is performing limited testing restricted to a one-mile loop around its headquarters, with speeds limited to 25 mph.