Uber unlikely to blame for self-driving car fatality, says police chief

Comments

  • Reply 21 of 47
    hexclock Posts: 1,256 member
    urashid said:
    A similar incident happened with a friend of mine some time ago. A 14-year-old kid riding his bike on the sidewalk suddenly jumped onto the street. My friend's car struck the bike, killing the kid. It was investigated by the LAPD and ruled no fault of my friend. When he recalled the incident to me, he said the whole thing happened in a fraction of a second, before he could even register it. Of course he was devastated and gave up driving for a while, but such things can happen to alert and attentive drivers.

    I would like to see autonomous cars evolve to the point where they can react to situations that humans can not.  That would be the real benefit of this technology.
    Sorry to hear that about your friend and the kid. Even if the computer can react instantly, the vehicle still carries a great deal of momentum. The car may in fact swerve to avoid the kid and then roll over. Many factors to consider. 
  • Reply 22 of 47
    berndog Posts: 90 member
    Release the video already!
  • Reply 23 of 47
    SpamSandwich Posts: 33,407 member
    There is no compelling reason to let Uber, Apple, GM or anyone else put beta software on public streets and put lives at risk.
    Sure there is. The many companies now testing these systems remain liable for accidents or deaths they may cause and the benefits to drivers now and in the very near future will be immense. Insurance rates will eventually make it unaffordable to NOT have an autonomous system.
    edited March 2018
  • Reply 24 of 47
    tzeshan Posts: 2,351 member
    “So Uber's self-driving cars have now recorded one fatality in around 3 million miles, whereas the current U.S. rate is 1.16 people for every 100 million miles driven, according to figures reported by Wired.”

    I like these statistics. 
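    To put those two figures on the same footing, here is a rough back-of-the-envelope comparison (a minimal sketch; a single fatality in roughly 3 million test miles is of course far too small a sample to draw statistical conclusions from):

        # Rough per-100-million-mile comparison of the figures quoted above.
        # Assumes ~1 fatality in ~3 million Uber test miles vs. the reported
        # U.S. average of 1.16 fatalities per 100 million miles driven.
        uber_fatalities = 1
        uber_miles = 3_000_000
        us_rate_per_100m = 1.16

        uber_rate_per_100m = uber_fatalities / uber_miles * 100_000_000
        print(f"Uber:         {uber_rate_per_100m:.1f} per 100M miles")  # ~33.3
        print(f"U.S. average: {us_rate_per_100m:.2f} per 100M miles")    # 1.16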
  • Reply 25 of 47
    There is no compelling reason to let Uber, Apple, GM or anyone else put beta software on public streets and put lives at risk.
    So how do you test the software so it can come out of beta?  There's a REASON why you have to have a person in the car.  Waymo and Tesla have both done modifications to their self-driving vehicles to promote driver awareness.
  • Reply 26 of 47

    paulcons said:

    I believe that it is perfectly legal in CA to "test" an autonomous car without a human driver in the vehicle. At the very least they bloody well can put a human watchdog on them... they sure as hell have an overabundance of money to do that.

    You would be incorrect.  NO municipality allows vehicles on roads without a driver present in the vehicle.  Some vehicles may have abbreviated controls (some of the Waymo bubble-cars had no steering wheels, but were limited to 25 mph) but there is always a big red Kill Switch on the dashboard and there has to be a person in the car who is charged with monitoring the vehicle at all times while it's moving.
  • Reply 27 of 47

    georgie01 said:
    Moir went on to predict that for that reason, Uber would be unlikely to be found at fault for the crash, although she wouldn't rule out charges against the car's test driver. 
    This disturbs me. Why would Uber be ruled out but not the test driver?

    This attitude that the car has less culpability is dangerous and naive. People made the car and made the capabilities of the car, people approved these cars for the road, and these people’s errors may result in the death of another human. This car was driving, monitored or not, and if the human driver could have avoided the fatality then the car could have too.

    It's not a matter of culpability, it's a matter of responsibility. Aviation has the concept of the PIC, or Pilot in Command, and ships have a Captain, both of whom are the ultimate bearers of responsibility. In the case of vehicles, that person is the driver, regardless of the level of assist the vehicle provides to the driver. No autonomous driving vehicle takes AWAY the ability of the driver to control the vehicle and override the autonomous driving protocols.

    Even if Uber's software IS at fault, which doesn't appear to be the case, it is still possible that the DRIVER of the car could have avoided the situation and might bear liability for the crash.

    In any case, having lived with Apple, Google, and Waymo self-driving cars on the road for a while, I've learned to be somewhat wary when one of these vehicles (they tend to be pretty easily identifiable, as they all sport gear on the roof that makes the Ghostbusters Cadillac look streamlined) is driving, not because they don't drive well (they do a better job than most drivers on the road) but because, well, humans can screw up anything and I don't want my car to be sideswiped by a bug.
  • Reply 28 of 47
    davidw Posts: 2,053 member
    A human would've realized that there was someone dismounting the bike and getting ready to cross. If they saw that the person was not paying attention, they might honk the horn or start to slow down. Lidar is a machine that detects objects and runs algorithms; it has no sense of empathy or human sequence prediction. The technology will get there, but it's not ready yet.

    That city had about 50 pedestrian deaths a year, autonomous cars amount to a fraction of a fraction of the total number of cars driving those streets, and yet one of them has already managed to kill someone.

    I do realize there was a guy in the car, and he probably was not paying attention and became complacent. I have driven for hundreds of miles in cars with brake and lane assists, and you quickly start to rely on them and not pay attention. I can only imagine how much stronger that effect would be in a totally autonomous car.
    Of course you mean a human might have realized there was someone there (and thus maybe could have avoided the accident), provided the human driver at the time was not DUI, texting or talking on a cell phone, changing the radio station, looking at the GPS, momentarily blinded by headlights, trying to adjust the passenger-side mirror, lighting a cigarette, yelling at the kids in the back seat, drowsy, admiring the sports car in the next lane, trying to hit the high note of the song he's singing along with, looking for a street sign or address, looking over his shoulder while getting ready to change lanes, speeding up a little to make the green light ahead, still trying to make sense of the personalized license plate he just saw, looking at the woman in the red dress walking on the sidewalk, etc. All human distractions when driving a car, whether the car has any form of self-driving ability or not. Every year, hundreds of pedestrians are killed in marked crosswalks, even while crossing on a green light, due to some of these human distractions while driving.

     A car going 40 MPH is traveling about 58 feet per second. All it takes is half a second of distraction to travel 30 feet, plus another 10 to 15 feet the car will travel during the reaction time it takes to get your foot off the accelerator and onto the brake pedal. At the very least, a self-driving car will cut way down on that 10 to 15 feet a car travels due to human reaction time (at 40 MPH). And sometimes, that 10 to 15 feet is all it takes to save a life. In this case, it seems that the car didn't even have that 10 to 15 feet of reaction time, let alone the stopping distance it would need, to avoid the collision.  
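    A quick sanity check of those numbers (a minimal sketch; the 0.2 to 0.25 second pedal-transfer time is an assumption chosen to reproduce the 10 to 15 feet mentioned above, not a measured figure):

        # Distance arithmetic at 40 MPH, per the figures in the comment above.
        MPH_TO_FPS = 5280 / 3600           # 1 mph = ~1.47 ft/s

        speed_fps = 40 * MPH_TO_FPS        # ~58.7 ft/s, i.e. "about 58 feet per second"
        distraction_ft = 0.5 * speed_fps   # ~29 ft covered during a half-second distraction
        pedal_ft_low = 0.20 * speed_fps    # ~12 ft while moving the foot to the brake
        pedal_ft_high = 0.25 * speed_fps   # ~15 ft at the slower end of that movement

        print(round(speed_fps, 1), round(distraction_ft, 1),
              round(pedal_ft_low, 1), round(pedal_ft_high, 1))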

    It's stated that the self-driving car did not slow down before the collision, presumably because it didn't have time to apply the brakes. But I'm wondering if the self-driving car tried to at least avoid the collision by swerving into the next lane, if that lane was clear. When faced with something suddenly appearing in front of you while driving, sometimes applying the brakes is not the best option, because it limits your ability to avoid a collision by steering clear of it. (More so nowadays with most cars being front-wheel drive, and especially if the road is wet.) Or was it that the self-driving car didn't have enough time to react in any way at all to avoid the collision? 

    A totally autonomous car doesn't need or have a driver. If it did, it wouldn't be "totally" autonomous. Some companies are already designing autonomous cars without a steering wheel or pedals.
  • Reply 29 of 47
    MplsP Posts: 3,931 member
    Seems pretty premature for the police chief to make a call on fault and liability. We need to see the results of the full investigation first.

    It's very possible that a fully focused driver would have seen and/or expected the pedestrian to come out. It's also very possible they would have been looking elsewhere at the exact moment they needed to be looking at the pedestrian. Who knows what would have happened. The whole bit about the previous felony conviction of the test driver seems totally irrelevant, though. 
  • Reply 30 of 47
    My take is that the Tesla collision was definitely a software (or combination of software/hardware) fault. The car was not able to understand that there was a truck in front. Obviously the human driver watching a DVD was not helping, either.

    In this case the video seems to point that even a human would have been unable to stop the car or avoid the collision, hence the police statement.

    I fully believe that autonomous cars will be (maybe they already are) safer than human drivers. This incident is  - apparently - not proving otherwise.
  • Reply 31 of 47
    I'm going out on a limb here, but if the car's sensors didn't detect the woman entering the roadway (Volvo's default systems do such things already), meaning she was probably behind an obstacle, there's no way the driver would've seen her, either. 

    IIHS lists City Safety as standard equipment starting with the redesign for 2016, which means all Uber Volvo XC90s are equipped with it.

    Volvo City Safety: Starts at 4:35 in the video.

  • Reply 32 of 47
    gatorguy Posts: 24,213 member
    My take is that the Tesla collision was definitely a software (or combination of software/hardware) fault. The car was not able to understand that there was a truck in front. Obviously the human driver watching a DVD was not helping, either.

    In this case the video seems to point that even a human would have been unable to stop the car or avoid the collision, hence the police statement.

    I fully believe that autonomous cars will be (maybe they already are) safer than human drivers. This incident is  - apparently - not proving otherwise.
    You apparently did not read any of the official reports on the accident? IIRC it was shown that the Tesla software properly recognized, minutes earlier, that its sensors were unreliable in those weather circumstances and warned the driver, both audibly (six times prior to the accident) and visually (seven times), to take the steering wheel and command of the vehicle. So what did he do? He foolishly increased his speed manually and kicked back with his hands off the wheel.

    So yeah, the software recognized it would not be able to "see" roadway obstacles and prompted the driver to disengage the auto-systems and take direct control, several times and significantly before the accident occurred. Tesla has since changed its software to disengage all autonomous features entirely, until the car is restarted, if a driver ignores repeated warnings about controlling his/her vehicle.
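    That escalation behavior is easy to picture in code. A minimal sketch of the policy described above, purely illustrative (the three-warning threshold, class name, and method are my own assumptions, not Tesla's actual implementation):

        # Hypothetical sketch of the escalation policy described above -- NOT Tesla's
        # actual code. Assumed behavior: warn the inattentive driver repeatedly, and
        # if the warnings are ignored, lock out autonomous features until a restart.
        class DriverAttentionMonitor:
            def __init__(self, max_warnings=3):          # threshold is an assumption
                self.max_warnings = max_warnings
                self.warnings_issued = 0
                self.locked_out = False                  # cleared only by restarting the car

            def on_hands_off_detected(self):
                if self.locked_out:
                    return "autonomous features unavailable until restart"
                self.warnings_issued += 1
                if self.warnings_issued > self.max_warnings:
                    self.locked_out = True
                    return "autonomous features disengaged until restart"
                return f"warning {self.warnings_issued}: take the wheel"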


    edited March 2018
  • Reply 33 of 47
    gatorguy said:
    My take is that the Tesla collision was definitely a software (or combination of software/hardware) fault. The car was not able to understand that there was a truck in front. Obviously the human driver watching a DVD was not helping, either.

    In this case the video seems to point that even a human would have been unable to stop the car or avoid the collision, hence the police statement.

    I fully believe that autonomous cars will be (maybe they already are) safer than human drivers. This incident is  - apparently - not proving otherwise.
    You apparently did not read any of the official reports on the accident? IIRC it was shown that the Tesla software properly recognized, minutes earlier, that its sensors were unreliable in those weather circumstances and warned the driver, both audibly (six times prior to the accident) and visually (seven times), to take the steering wheel and command of the vehicle. So what did he do? He foolishly increased his speed manually and kicked back with his hands off the wheel.

    So yeah, the software recognized it would not be able to "see" roadway obstacles and prompted the driver to disengage the auto-systems and take direct control, several times and significantly before the accident occurred. Tesla has since changed its software to disengage all autonomous features entirely, until the car is restarted, if a driver ignores repeated warnings about controlling his/her vehicle.


    OK, that's comforting to know. Thanks for the info. And I was right that watching a DVD was not very helpful. :-)

    In any case, in the future the unreliability that the Tesla software signaled will be ironed out. At least I hope so. I really look forward to it.


  • Reply 34 of 47
    anton zuykov Posts: 1,056 member
    cia said:
    The June 2016 Tesla death was not a "test driver", it was a customer using Tesla's Autopilot feature who was NOT paying attention to the road.
    That is an understatement. Not only was he not paying attention, he was watching a movie while relying on Tesla's autopilot, even though Tesla specifically states that the autopilot is not ready for that.

    The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.
    https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter


  • Reply 35 of 47
    artdent Posts: 69 member
    For what it's worth, the city is Tempe, not Temple.

  • Reply 36 of 47
    SpamSandwich Posts: 33,407 member
    cia said:
    The June 2016 Tesla death was not a "test driver", it was a customer using Tesla's Autopilot feature who was NOT paying attention to the road.
    That is an understatement. Not only was he not paying attention, he was watching a movie while relying on Tesla's autopilot, even though Tesla specifically states that the autopilot is not ready for that.

    The truck driver, Frank Baressi, 62, told the Associated Press that the Tesla driver Joshua Brown, 40, was “playing Harry Potter on the TV screen” during the collision and was driving so fast that “he went so fast through my trailer I didn’t see him”.
    https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter


    Spoiler! The one time that watching a trailer would’ve been a smart idea.
    edited March 2018
  • Reply 37 of 47
    tmay Posts: 6,341 member
    Having just watched the video of the Uber car approaching the woman pushing the bicycle, it's obvious to me that the Uber sensors were NOT obstructed by any vegetation, that she was crossing the lanes in an area lit by a streetlight, and that she had completed crossing the inner lane and had almost exited the outer lane when she was struck.

    My expectation for an autonomous vehicle operating at night is that it would have both seen her and responded, at a minimum, with braking, not to mention evasive maneuvers, yet none of that occurred. Throw in the completely oblivious backup driver, obsessed with his phone, and I would call this a complete failure of autonomy. Complete fail, Uber.

    A link to the video:



  • Reply 38 of 47
    gatorguy Posts: 24,213 member
    Thanks @tmay, great find. 
  • Reply 39 of 47
    dasanman69 Posts: 13,002 member
    40 MPH seems awfully fast. It looks like the car is outdriving its headlights.
  • Reply 40 of 47
    AI_lias Posts: 434 member
    Whoa, after seeing the video, seems like a big failure of the sensors. This lady was right in front, and no braking at all, it seems. Autonomous cars have a ways to go before they'll be mainstream.