Uber unlikely to blame for self-driving car fatality, says police chief
Following the death late Sunday of a woman who was struck and killed by a self-driving Uber car in Arizona, we now know more about the circumstances of the tragedy, which could shake public trust in autonomous car technology.

The deceased has been identified as Elaine Herzberg, 49, and according to police at a press conference Monday, she was walking her bicycle outside of the crosswalk while crossing the street in Tempe, Ariz. According to a San Francisco Chronicle report Tuesday, the collision took place after Herzberg "abruptly walked from a center median into a lane of traffic."
Sylvia Moir, Tempe's police chief, told the Chronicle that, having viewed videos from the car's cameras, "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway."
Moir went on to predict that for that reason, Uber would be unlikely to be found at fault for the crash, although she wouldn't rule out charges against the car's test driver.
Uber, however, announced Monday that it was halting testing of its self-driving cars, effective immediately.
The tragedy is the first known pedestrian fatality involving a self-driving car, although a test driver using Tesla's Autopilot technology was killed in June 2016. Uber's self-driving cars had driven 2 million miles of testing as of December, and one million of those miles had been driven in the previous 100 days, Forbes reported at the time. So Uber's self-driving cars have now recorded one fatality in roughly 3 million miles, whereas the current U.S. rate is 1.16 deaths for every 100 million miles driven, according to figures reported by Wired.
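For context, the two figures above can be put on the same per-100-million-mile basis. This is a rough back-of-the-envelope sketch using only the numbers reported in the article; the mileage total is approximate, and a single data point says little statistically:

```python
# Rough comparison of the fatality rates cited in the article.
# Figures are approximate, as reported (Forbes mileage, Wired U.S. rate).

UBER_FATALITIES = 1
UBER_MILES = 3_000_000           # roughly 3 million test miles to date
US_RATE_PER_100M = 1.16          # U.S. deaths per 100 million miles driven

# Normalize Uber's record to deaths per 100 million miles.
uber_rate_per_100m = UBER_FATALITIES / UBER_MILES * 100_000_000

print(f"Uber: {uber_rate_per_100m:.1f} deaths per 100M miles")     # ~33.3
print(f"U.S. average: {US_RATE_PER_100M} deaths per 100M miles")
print(f"Ratio: {uber_rate_per_100m / US_RATE_PER_100M:.0f}x")      # ~29x
```

The caveat, of course, is that one fatality over a small mileage base makes the comparison statistically weak, but it illustrates why the raw numbers alarmed observers.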
Apple is ramping up its own self-driving car initiative, with up to 45 such cars on the road in California, the Financial Times reported Tuesday. It's unclear whether the fatality in Arizona will impact those plans in any way.

Comments
Autonomous driving is still in its early days, which is why every company that has released the feature to customers so far has been very clear about drivers remaining alert and vigilant to the environment, ready to take over control when needed.
As Colbert used to say on his old show, it has "truthiness". The masses are inclined to believe that self-driving cars (and robots) are bad and I believe the press will play this up in the forthcoming years because it's a populist message that's easy to understand.
It also doesn't matter that 37,000 people die in traditional car crashes in the U.S. each year and it won't matter if it's proven that self-driving cars result in fewer deaths. In the public's mind, every self-driving car crash is the equivalent of at least a thousand "regular" car deaths.
Wait until the first death from a completely un-manned car. I bet a mob destroys the car.
This attitude that the car has less culpability is dangerous and naive. People designed the car and its capabilities, people approved these cars for the road, and those people's errors may result in the death of another human. This car was driving, monitored or not, and if the human driver could have avoided the fatality then the car could have too.
The entire attitude about driving our culture has is dangerous. Even the term ‘accident’ is meant to hide what really happened—car collisions are usually the result of incompetence, negligence, or arrogance. Rarely are they an innocent ‘accident’.
"Apple is ramping up its own self-driving car initiative, with up to 45 such cars on the road in California, the Financial Times reported Tuesday. It's unclear whether the fatality in Arizona will impact those plans in any way."
I believe that it is perfectly legal in CA to "test" an autonomous car without a human driver in the vehicle. At the very least they bloody well can put a human watchdog on them... they sure as hell have an overabundance of money to do that.
Since this story of the fatality broke, I read there WAS a human in the vehicle. So was he distracted in some way, shape or form (I'd put money that 25k+ of the traffic fatalities that happen every year stem from distracted drivers)? OTOH, if the BEST, most focused driver was driving down the street and someone abruptly darted out, they could get hit. Far as I know, there were cameras all around this car, so we SHOULD be able to see a video of exactly what the human did see...
That city had about 50 pedestrian deaths a year, autonomous cars amount to a fraction of a fraction of the total number of cars driving those streets, and yet it managed to already kill someone.
I do realize there was a guy in the car, and he probably was not paying attention and became complacent. I have driven for hundreds of miles in cars with brake and lane assists, and you quickly start to rely on them and not pay attention. I can only imagine how much of that effect a totally autonomous car would have on a driver.
https://www.azcentral.com/story/news/local/tempe-breaking/2018/03/19/woman-dies-fatal-hit-strikes-self-driving-uber-crossing-road-tempe/438256002/
While many accidents can't be avoided, there are still a lot that are directly caused by human error. We are inherently bad drivers because we have far worse reaction times and a lesser ability to observe and process our surroundings. There will still be injuries and deaths, but the goal of this technology is to reduce that number.
* No autonomous vehicles were utilized.
I would like to see autonomous cars evolve to the point where they can react to situations that humans can not. That would be the real benefit of this technology.
I wonder about this, too. I'm not ready to say that it's a real (vs. a theoretical) problem, but I do wonder if there's any algorithm for weirdness. For example, when I see a driver weaving about the road, I no longer think it's safe to be near them, so I either slow down or pass if I believe it's safe. Same for a pedestrian: if I see someone who's drunk or obviously not watching where they're going, I'll slow down or change lanes even if they're currently not on the road. I wonder if the self-driving cars treat these as unpredictable events, whereas most drivers would notice the odd behavior and take preemptive action.