Pedestrian killed by Uber self-driving car, testing stops in all cities


Comments

  • Reply 81 of 91
    gatorguy Posts: 24,213 member
    spice-boy said:
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.
    Certainly 40,200 deaths is terrible, but how many human-driven cars are on our roads at any one time compared to the number of self-driving cars? My guess is that the ratio of deaths per self-driving car is a lot higher. Technology companies are desperate to give us something nobody is asking for. 
    There have only been two reported "deaths by autonomous" so far, and neither was the fault of the vehicle. In fact one, the Tesla, was only semi-autonomous, with the driver ignoring repeated vehicle warnings to take active control. So no, there have not been comparably more deaths attributed to autonomous cars than to human drivers. Waymo, for instance, has driven over 4 million miles so far without being involved in any deaths or even a serious injury. 
    edited March 2018
  • Reply 82 of 91
    dasanman69 Posts: 13,002 member
    maestro64 said:
    tundraboy said:
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.
    That's beside the point.  All these companies are testing deadly technology on public roadways without the permission of all the other people using those roadways.  They are putting people in danger without those people's permission or even knowledge.  This is the result of the Silicon Valley beta-testing mentality being applied to a product that should not be subject to it.  Here are the things that should be happening, but aren't, with respect to self-driving technology:

    1.  Self-driving tech should be more thoroughly tested on a closed course with realistically simulated vehicle and pedestrian traffic before it is unleashed on public roads.
    2.  But even before closed-course testing is authorized, the feds should research then set up performance and safety standards for closed-course testing.  Standards that should be achieved before vehicles are allowed to use public roads.
    3.  Vehicles that are then tested on public roads should be clearly marked and required to follow a strict protocol as to routes, hours, and environmental conditions.  Progression to more complex routes and environments, and to more hours, should be contingent on achieving properly calibrated safety and performance benchmarks.


    First, humans are not held to this standard; we let humans drive in the wild without demonstrating any level of proficiency under all road conditions, so why hold the machine to a higher standard than we hold ourselves? That is why we have 40,000 deaths each year.

    The only test I want to see a self-driving car pass is winning a NASCAR race, especially one with crashes.

    That's like saying "your son/daughter can only play in Little League baseball if they can play at an MLB level." 
  • Reply 83 of 91
    dasanman69 Posts: 13,002 member
    maestro64 said:
    Folks, not sure anyone else noticed, but this was not a pedestrian, it was a cyclist. From the damage to the car, it looks like the cyclist hit the passenger-side front hood. I think what happened is a little more complicated.
    From what I understand she was walking the bike. 
  • Reply 84 of 91
    asdasd Posts: 5,686 member
    sflocal said:
    How many years have autonomous vehicles been on the road without killing anyone?

    Meanwhile, 3,500 people per day die from non-autonomous vehicles.

    While it is tragic, this is yet another hyped, over-sensationalized article by the media (including AI) that as usual lacks any details as to the events that happened.

    Waiting for the reports to come out would be way too much work.  Just post, and retract later.
    They've killed two people. You need to compare deaths per mile to do this accurately; even a few deaths for autonomous vehicles will skew the picture, given how few miles they have driven so far (a rough per-mile comparison is sketched below). 

    Personally, I don't think that version 1.0 of any autonomous car will be safe. 
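
    To make the per-mile point concrete, here is a minimal sketch of that comparison. The human-driver numbers are the 40,200 deaths cited in this thread plus the roughly 3.2 trillion vehicle-miles Americans drove in 2016; the autonomous-fleet mileage is an assumed figure for illustration only, since companies report test mileage inconsistently.

        # Rough deaths-per-100-million-miles comparison.
        # 40,200 is the 2016 death toll cited in this thread; ~3.2 trillion is the
        # approximate US vehicle-miles travelled that year. The autonomous-fleet
        # mileage below is an assumed, illustrative number, not an official total.

        HUMAN_DEATHS_2016 = 40_200
        HUMAN_MILES_2016 = 3.2e12        # approx. US vehicle-miles travelled, 2016

        AV_DEATHS = 1                    # the Tempe fatality
        AV_TEST_MILES = 10e6             # assumed combined autonomous test mileage

        def deaths_per_100m_miles(deaths: int, miles: float) -> float:
            """Normalize fatalities to deaths per 100 million miles driven."""
            return deaths / miles * 100e6

        human_rate = deaths_per_100m_miles(HUMAN_DEATHS_2016, HUMAN_MILES_2016)
        av_rate = deaths_per_100m_miles(AV_DEATHS, AV_TEST_MILES)

        print(f"Human drivers:              ~{human_rate:.2f} deaths per 100M miles")  # ~1.26
        print(f"Autonomous (assumed miles): ~{av_rate:.0f} deaths per 100M miles")     # ~10

    With so few autonomous miles on the books, a single fatality swings the per-mile rate by an order of magnitude, which is why raw death counts on either side settle very little.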
  • Reply 85 of 91
    gatorguy Posts: 24,213 member
    asdasd said:
    They've killed two people. 
    One. The Tesla driver killed himself by totally ignoring the warnings from the car to take full control, 13 times as a matter of fact. Not only did he NOT take control of operations, he manually increased his speed even more and went back to watching his movie. 
    edited March 2018
  • Reply 86 of 91
    anome Posts: 1,533 member
    Rayz2016 said:
    Ciprol said:
    OK, Tempe police have now come out with their initial assessment and stated the Uber vehicle was not at fault. The accident occurred under sub-optimal lighting conditions, and it was the pedestrian who inappropriately walked onto the roadway, away from established crossings; police described the pedestrian as walking out from the shadows.

    https://www.sfchronicle.com/business/article/Exclusive-Tempe-police-chief-says-early-probe-12765481.php

    Will this be the end of Uber and autonomous vehicle bashing? I suspect not.
    Mmmm. 

    That might explain why the Uber driver didn’t have time to stop the car. 
    adm1 said:
    That clarifies the situation a LOT, also explains why there was a bicycle in all the press images.

    Sounds like an unavoidable accident totally unrelated to autonomous cars.

    Yeah, having watched the footage, there really is no hope of a human driver seeing the pedestrian in time. It was badly lit, they were wearing dark clothing, etc.

    One can't help but feel that the car's LIDAR should have been able to see them, though. I still don't really see that the car is to blame. (Nor do I wish to blame the victim.) If nothing else, this is a teaching moment for self-driving technology; it's a great shame it cost someone their life, however.

    [EDIT] Just saw that Ars Technica has an article saying the video of the accident shows problems for Uber re: liability. It seems to be what I mentioned above - the LIDAR system should have seen the victim long before she got in front of the car.

    I expect there will be lawsuits flying around on this for ages yet.

    edited March 2018
  • Reply 87 of 91
    asdasd Posts: 5,686 member
    gatorguy said:
    One. The Tesla driver killed himself by totally ignoring the warnings from the car to take full control, 13 times as a matter of fact. Not only did he NOT take control of operations, he manually increased his speed even more and went back to watching his movie. 
    Have you got a link to that?

    We are talking about cars that are going to be autonomous some day. That kind of thing, handing back to the driver when in trouble, immediately negates their usefulness. 
  • Reply 88 of 91
    SpamSandwich Posts: 33,407 member
    anome said:
    I expect there will be lawsuits flying around on this for ages yet.

    The rabbit hole goes even deeper:  https://jalopnik.com/toyota-and-nutonomy-also-halt-autonomous-testing-after-1823942387
  • Reply 89 of 91
    gatorguy Posts: 24,213 member
    SpamSandwich said:
    The rabbit hole goes even deeper:  https://jalopnik.com/toyota-and-nutonomy-also-halt-autonomous-testing-after-1823942387
    Huh. So Toyota (among some others) is in bed somehow with Uber? That's the implication from the article. Thanks, interesting read. 

    EDIT: Playing follow the links reveals more fun stuff...
    https://www.bloomberg.com/news/articles/2018-03-19/volvo-toyota-steering-clear-of-questions-on-uber-crash-s-impact

    edited March 2018
  • Reply 90 of 91
    gatorguy Posts: 24,213 member
    asdasd said:
    Have you got a link to that?

    We are talking about cars that are going to be autonomous some day. That kind of thing, handing back to the driver when in trouble, immediately negates their usefulness. 
    You're correct that having to disengage often kind of negates the usefulness. Uber, though, is reportedly one of the worst: they've been struggling to go 13 miles without a live driver having to take control. In comparison, Waymo reports that in its California testing last year its vehicles averaged 5,600 miles without live driver intervention.
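
    For anyone curious, the comparison gatorguy is drawing is usually expressed as miles per disengagement: total autonomous miles divided by the number of times the safety driver had to take over. Here is a minimal sketch using the rough per-takeover figures quoted above (about 13 miles for Uber, about 5,600 for Waymo); the fleet totals in it are illustrative assumptions, not official filings.

        # Miles-per-disengagement comparison. The per-takeover figures come from
        # this thread; the fleet totals below are hypothetical numbers chosen only
        # to reproduce those figures.

        def miles_per_disengagement(total_miles: float, disengagements: int) -> float:
            """Average autonomous miles driven between safety-driver takeovers."""
            return total_miles / disengagements

        uber_rate = miles_per_disengagement(total_miles=13_000, disengagements=1_000)   # ~13
        waymo_rate = miles_per_disengagement(total_miles=352_000, disengagements=63)    # ~5,600

        print(f"Uber:  ~{uber_rate:.0f} miles per disengagement")
        print(f"Waymo: ~{waymo_rate:.0f} miles per disengagement")
        print(f"Waymo goes roughly {waymo_rate / uber_rate:.0f}x farther between takeovers")

    It's a crude metric, since it says nothing about how difficult the routes are, but a gap of that size is why the two programs are hard to treat as equivalent.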
  • Reply 91 of 91
    This is a big mistake. 

    And no, I don't mean the fatal accident... though obviously mistakes were made there too. 

    Autonomous driving will, overall, save many... many... many more lives than the status quo. It will also save fuel. And it will create huge advantages in many different industries, even create new industries. 

    It's terribly sad that someone died in this accident. And terribly sad for the woman who was in the car, keeping watch on the system. She must feel awful. I do not view that person who was killed as some kind of sacrificial lamb to progress. 

    But the facts are clear: this technology is coming, and it promises very large improvements, including exactly the kinds of improvements that will... and even can now... prevent exactly these kinds of accidents from being a lot more common. 

    What's more, part of introducing a technology is, of course, about finding and fixing the flaws. But part of it is also about acclimating the public to how it works. In this accident, a pedestrian was crossing where there was no crosswalk, with no light. Even on a road with all human drivers, perhaps especially so, she courted high risk.

    Banning autonomous cars now would be like banning toasters, because someone in the early days used one in the bathtub.

    It's a reactionary response and not a very sensible one, in my opinion. 

    P.S. Yeah, I also like using Uber... and think they're a generally good innovation too, as well as an inevitable one. 