Apparently Uber disabled Volvo's standard safety system in its autonomous test vehicles.
https://www.bloomberg.com/news/articles/2018-03-26/uber-disabled-volvo-suv-s-standard-safety-system-before-fatality
Watch the dash cam footage from this article at the moment the driver first looks up; the expression on his face is a good indicator that, had he been acting as the driver and not a distracted passenger, he would have seen the woman. Each of you can identify with his look by remembering how mentally taxing it is to ride in the passenger seat while another person drives. It does not turn off the automated muscle-memory reactions built up from years of paying attention behind the wheel. The film clips of the area after the accident show it was well lit enough that a "driver" would have seen the woman.
A whole lot of people have egg on their faces, and how this is being reported to the news outlets is CYA first. As more film comes out, to experienced drivers it looks like a situation where a person driving that car would not have hit the woman, since the available light at the accident scene is shown more accurately post-accident. There is enough ambient lighting that her pink bicycle frame would have caught a driver's eye, along with the reflective spots on her shoes that showed up brightly in the very first dash cam video. I wondered whether the dash cam's sensor was simply not capturing the full extent of the surrounding light. Ever notice how your smartphone's camera does not pick up as much nighttime ambient light or resolve objects in shadow as well as your eyes do? A $15,000 news camera will, because that is what the $15,000 pays for.
Wonder if the NTSB will require the state police to rerun this with a person at the wheel to determine how much a driver would really have seen versus that crappy dash cam video? All they would need is a person in a dark T-shirt and jeans holding that bike frame at the victim's starting spot on the left while an officer and an observer drive past at the same speed after dark. Bet no one wants that test to happen.