Uber unlikely to blame for self-driving car fatality, says police chief

Posted:
in General Discussion edited March 20
Following the death late Sunday of a woman who was struck and killed by a self-driving Uber car in Arizona, we now know more about the circumstances of the tragedy, which could shake public trust in autonomous car technology.

The deceased has been identified as Elaine Herzberg, 49, and according to police at a press conference Monday, she was walking her bicycle outside of the crosswalk while crossing the street in Tempe, Ariz. According to a San Francisco Chronicle report Tuesday, the collision took place after Herzberg "abruptly walked from a center median into a lane of traffic."

Sylvia Moir, Tempe's police chief, told the Chronicle that, having viewed videos from the car's cameras, "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway."

Moir went on to predict that for that reason, Uber would be unlikely to be found at fault for the crash, although she wouldn't rule out charges against the car's test driver.

Uber, however, announced Monday that it was suspending testing of its self-driving cars, effective immediately.

The tragedy is the first known pedestrian fatality involving a self-driving car, although a test driver for Tesla's Autopilot technology was killed in June of 2016. Uber's self-driving cars had logged 2 million miles of testing as of December, one million of them in the previous 100 days, Forbes reported at the time. So Uber's self-driving cars have now recorded one fatality in around 3 million miles, whereas the current U.S. rate is 1.16 deaths for every 100 million miles driven, according to figures reported by Wired.
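For scale, the rate comparison above works out roughly as follows. This is a back-of-the-envelope sketch only: the 3 million-mile figure is an extrapolation from the December numbers, and a sample of a single fatality is far too small to draw statistical conclusions from.

```python
# Back-of-the-envelope comparison of the fatality rates cited above.
uber_miles = 3_000_000      # approximate Uber autonomous test miles to date
uber_fatalities = 1
us_rate = 1.16              # U.S. fatalities per 100 million vehicle miles

# Normalize Uber's record to the same per-100M-mile basis.
uber_rate = uber_fatalities / uber_miles * 100_000_000
print(f"Uber autonomous: {uber_rate:.1f} fatalities per 100M miles")
print(f"U.S. average:    {us_rate:.2f} fatalities per 100M miles")
print(f"Ratio: roughly {uber_rate / us_rate:.0f}x the national rate")
```

The gap looks dramatic, but with one data point the confidence interval on Uber's rate is enormous; the calculation only illustrates why the mileage denominators matter.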

Apple is ramping up its own self-driving car initiative, with up to 45 such cars on the road in California, the Financial Times reported Tuesday. It's unclear whether the fatality in Arizona will impact those plans in any way.

Comments

  • Reply 1 of 47
    SoliSoli Posts: 7,847member
    As much as I despise this company, all the preliminary evidence seems not to put Uber at fault. There's plenty of data, and since this is the first pedestrian death I'm sure it will be analyzed exhaustively. Hopefully something good comes out of it.
  • Reply 2 of 47
    ciacia Posts: 43member
    The June 2016 Tesla death was not a "test driver", it was a customer using Tesla's Autopilot feature who was NOT paying attention to the road.  A semi was turning across the street he was driving down.  Anyone looking at the road would have seen this and taken over control, but for whatever reason, this driver did not and his car went under the trailer, killing the driver.

    Autonomous driving is still in its early days, which is why every company so far that has released the feature to customers has been very clear about drivers remaining alert and vigilant to the environment, ready to take over control when needed.
    edited March 20
  • Reply 3 of 47
    There is no compelling reason to let Uber, Apple, GM or anyone else put beta software on public streets and put lives at risk.
  • Reply 4 of 47
    zoetmbzoetmb Posts: 2,298member
    It doesn't matter what the reality was. It matters what the perception is, and the perception of the masses is that a self-driving car killed someone. I bet most people don't even realize that there was a person in the car.

    As Colbert used to say on his old show, it has "truthiness". The masses are inclined to believe that self-driving cars (and robots) are bad, and I believe the press will play this up in the coming years because it's a populist message that's easy to understand.

    It also doesn't matter that 37,000 people die in traditional car crashes in the U.S. each year, and it won't matter if it's proven that self-driving cars result in fewer deaths. In the public's mind, every self-driving car crash is the equivalent of at least a thousand "regular" car deaths.

    Wait until the first death from a completely unmanned car. I bet a mob destroys the car.
  • Reply 5 of 47
    georgie01georgie01 Posts: 163member
    Moir went on to predict that for that reason, Uber would be unlikely to be found at fault for the crash, although she wouldn't rule out charges against the car's test driver. 
    This disturbs me. Why would Uber be ruled out but not the test driver?

    This attitude that the car has less culpability is dangerous and naive. People made the car and its capabilities, people approved these cars for the road, and those people's errors may result in the death of another human. This car was driving, monitored or not, and if the human driver could have avoided the fatality then the car could have too.

    Our culture's entire attitude toward driving is dangerous. Even the term 'accident' is meant to hide what really happened: car collisions are usually the result of incompetence, negligence, or arrogance. Rarely are they an innocent 'accident'.
    edited March 20
  • Reply 6 of 47
    From the article:

    "Apple is ramping up its own self-driving car initiative, with up to 45 such cars on the road in California, the Financial Times reported Tuesday. It's unclear whether the fatality in Arizona will impact those plans in any way."

    I believe that it is perfectly legal in CA to "test" an autonomous car without a human driver in the vehicle. At the very least they bloody well can put a human watchdog on them... they sure as hell have an overabundance of money to do that.

    Since this story of the fatality broke, I read there WAS a human in the vehicle. So was he distracted in some way, shape or form (I'd put money that 25k+ of the traffic fatalities that happen every year stem from distracted drivers)? OTOH, if the BEST, most focused driver was driving down the street and someone abruptly darted out, they could still get hit. As far as I know, there were cameras all around this car, so we SHOULD be able to see video of exactly what the human driver saw...
  • Reply 7 of 47
    eightzeroeightzero Posts: 2,048member
    I would be very reluctant to accept a police chief's legal opinion on liability.
  • Reply 8 of 47
    bloggerblogbloggerblog Posts: 1,767member
    A human would've realized that there was someone dismounting the bike and getting ready to cross. If they see that the subject was not paying attention, they may honk the horn or start to slow down. Lidar is a machine that detects objects and runs algorithms, it has no sense of empathy or human sequence prediction. The technology will be ready, but it's not ready yet.

    That city had about 50 pedestrian deaths a year, autonomous cars amount to a fraction of a fraction of the total number of cars driving those streets, and yet it managed to already kill someone.

    I do realize there was a guy in the car, and he probably was not paying attention and became complacent. I have driven for hundreds of miles in cars with brake and lane assists, and you quickly start to rely on them and not pay attention. I can only imagine how much of that effect a totally autonomous car would have on a driver.
  • Reply 9 of 47
    AI_liasAI_lias Posts: 231member
    In my opinion there will definitely be plenty of fatalities due to autonomous cars, but the question is whether they will be just a fraction of the fatalities currently caused by human error. A lot of people die each year because of driver error. I suppose this will soon be the case for airplanes too: pilotless airplanes that will not be perfect, but far less limited than human pilots.
    edited March 20
  • Reply 10 of 47
    georgie01 said: This disturbs me. Why would Uber be ruled out but not the test driver?
    Yeah, that's kind of an odd statement. I did see a story that said the human occupant had a prior felony conviction, so maybe it's just the police not giving them the benefit of the doubt right away. 

    https://www.azcentral.com/story/news/local/tempe-breaking/2018/03/19/woman-dies-fatal-hit-strikes-self-driving-uber-crossing-road-tempe/438256002/
    edited March 20
  • Reply 11 of 47
    hexclockhexclock Posts: 400member
    A human would've realized that there was someone dismounting the bike and getting ready to cross. If they see that the subject was not paying attention, they may honk the horn or start to slow down. Lidar is a machine that detects objects and runs algorithms, it has no sense of empathy or human sequence prediction. The technology will be ready, but it's not ready yet.

    That city had about 50 pedestrian deaths a year, autonomous cars amount to a fraction of a fraction of the total number of cars driving those streets, and yet it managed to already kill someone.

    I do realize there was a guy in the car, and he probably was not paying attention and became complacent. I have driven for hundreds of miles in cars with brake and lane assists, and you quickly start to rely on them and not pay attention. I can only imagine how much of that effect a totally autonomous car would have on a driver.
    Great point. A human driver, having seen people suddenly walk into the street many times, as I have, MAY have been able to anticipate this and take some sort of preventative action. Sometimes you can see a person near the edge of the street and tell, probably subconsciously via their body language, that they intend to step off the curb. A computer would have an awfully hard time making that assessment.
  • Reply 12 of 47
    macxpressmacxpress Posts: 4,507member
    There is no compelling reason to let Uber, Apple, GM or anyone else put beta software on public streets and put lives at risk.
    There's a lot of people driving every day with beta software in their brain... I see it every day. You can't make this stuff better without real-world testing. Pedestrians die every day from being hit by cars driven by humans. It's an awful thing that happened, but in the end it could have happened just as easily with a human driver. Like I said, it happens every single day somewhere around the world.
    edited March 20
  • Reply 13 of 47
    cia said:
    The June 2016 Tesla death was not a "test driver", it was a customer using Tesla's Autopilot feature who was NOT paying attention to the road.  A semi was turning across the street he was driving down.  Anyone looking at the road would have seen this and taken over control, but for whatever reason, this driver did not and his car went under the trailer, killing the driver.

    Autonomous driving is still in it's early days, which is why every company so far that has released the feature to customers has been very clear about drivers remaining alert and vigilant to the environment, ready to take over control when needed.
    In the Tesla case, the customer was pushing the limits of the technology in YouTube videos, fancying himself a test pilot of some kind. He loved to drive as fast as possible and push the car to its limits. Would he have survived if he had been watching the road instead of a film? Who knows, but it was his absolute negligence that led to his death. In theory, a self-driving car would never violate the speed limit and would follow the law to the letter. Humans, on the other hand, are a different story, but we'll have to see how the tech plays out. Hopefully these cases won't hamper the advances achieved so far. :)
  • Reply 14 of 47
    SoliSoli Posts: 7,847member
    A human would've realized that there was someone dismounting the bike and getting ready to cross. If they see that the subject was not paying attention, they may honk the horn or start to slow down. Lidar is a machine that detects objects and runs algorithms, it has no sense of empathy or human sequence prediction. The technology will be ready, but it's not ready yet.

    That city had about 50 pedestrian deaths a year, autonomous cars amount to a fraction of a fraction of the total number of cars driving those streets, and yet it managed to already kill someone.

    I do realize there was a guy in the car, and he probably was not paying attention and became complacent. I have driven for hundreds of miles in cars with brake and lane assists, and you quickly start to rely on them and not pay attention. I can only imagine how much of that effect a totally autonomous car would have on a driver.
    You stated that a human would've, not may have, and then you stated that the human in the car probably wasn't paying attention. ¿Qué?

    While many accidents can't be avoided, there are still a lot that are directly caused by human error. We are inherently bad drivers because we have far worse reaction times and a lesser ability to observe and process our surroundings. There will be injuries and deaths, but the goal of this technology is to reduce that number.
    edited March 20
  • Reply 15 of 47
    SoliSoli Posts: 7,847member
    macxpress said:
    There is no compelling reason to let Uber, Apple, GM or anyone else put beta software on public streets and put lives at risk.
    There's a lot of people driving everyday with beta software in their brain....I see it everyday. You can't make this stuff better without real world testing. Pedestrians die everyday from being hit by cars driven by humans. It's an awful thing that happened, but in the end, it could have happened just as easily with a human driver. Like it said, it happens every single day somewhere around the world. 
    There's a famously poorly designed "city of the future"* that has much higher pedestrian deaths than normal.


    * No autonomous vehicles were utilized.

  • Reply 16 of 47
    NotsofastNotsofast Posts: 250member
    cia said:
    The June 2016 Tesla death was not a "test driver", it was a customer using Tesla's Autopilot feature who was NOT paying attention to the road.  A semi was turning across the street he was driving down.  Anyone looking at the road would have seen this and taken over control, but for whatever reason, this driver did not and his car went under the trailer, killing the driver.

    Autonomous driving is still in it's early days, which is why every company so far that has released the feature to customers has been very clear about drivers remaining alert and vigilant to the environment, ready to take over control when needed.
    No manufacturer has allowed consumers to use fully autonomous driving, as it's not legal yet and the liability would be huge. That driver was violating Tesla's "protocol", and to reinforce it Tesla now has it set up so that if you keep your hands off the wheel for too long, the car will slow down and stop.
  • Reply 17 of 47
    robin huberrobin huber Posts: 3,184member
    One data point. This one death in a few million miles could be an outlier. But if another one comes along in the next year, it will start to look bad.
  • Reply 18 of 47
    urashidurashid Posts: 47member
    A similar incident happened to a friend of mine some time ago. A 14-year-old kid riding his bike on the sidewalk suddenly swerved onto the street. My friend's car struck the bike, killing the kid. It was investigated by the LAPD and ruled no fault of my friend. When he recalled the incident to me, he said the whole thing happened in a fraction of a second, before he could even register it. Of course he was devastated and gave up driving for a while, but such things can happen to alert and attentive drivers.

    I would like to see autonomous cars evolve to the point where they can react to situations that humans can not.  That would be the real benefit of this technology.
    edited March 20
  • Reply 19 of 47
    dws-2dws-2 Posts: 182member
    A human would've realized that there was someone dismounting the bike and getting ready to cross. If they see that the subject was not paying attention, they may honk the horn or start to slow down. Lidar is a machine that detects objects and runs algorithms, it has no sense of empathy or human sequence prediction. The technology will be ready, but it's not ready yet.

    I wonder about this, too. I'm not ready to say that it's a real (vs. a theoretical) problem, but I do wonder if there's any algorithm for weirdness. For example, when I see a person weaving about the road, I no longer think it's safe to be near them, so I either slow down or pass if I believe it's safe. The same goes for someone on foot: if I see a person who's drunk or obviously not watching where they're going, I'll slow down or change lanes even if they're not currently on the road. I wonder if self-driving cars treat these as unpredictable events, whereas most drivers would notice the odd behavior and take preemptive action.
  • Reply 20 of 47
    bloggerblogbloggerblog Posts: 1,767member
    Soli said:
    A human would've realized that there was someone dismounting the bike and getting ready to cross. If they see that the subject was not paying attention, they may honk the horn or start to slow down. Lidar is a machine that detects objects and runs algorithms, it has no sense of empathy or human sequence prediction. The technology will be ready, but it's not ready yet.

    That city had about 50 pedestrian deaths a year, autonomous cars amount to a fraction of a fraction of the total number of cars driving those streets, and yet it managed to already kill someone.

    I do realize there was a guy in the car, and he probably was not paying attention and became complacent. I have driven for hundreds of miles in cars with brake and lane assists, and you quickly start to rely on them and not pay attention. I can only imagine how much of that effect a totally autonomous car would have on a driver.
    You stated that a human would've, not may have, and then you stated that the human in the car probably wasn't paying attention. ¿Qué?

    While many accidents can't be avoided, there are still a lot that are directly caused by human error. We are inherently bad drivers because we have far worse reaction times and a lesser ability to observe and process our surroundings. There will be injuries and deaths, but the goal of this technology is to reduce that number.
    My first sentence referred to a driver in the same situation but driving a standard non-autonomous car; the second referred to the driver who was in the driver's seat of this autonomous Uber.