Uber drops self-driving operation in Arizona, following fatality


Comments

  • Reply 21 of 32
    cgWerks said:
    ronn said:
    Several experts said that dash cam footage is not the same as real-world footage. It appears darker in the car footage than it actually was. Uber F—ed up by reducing the number of sensors, "tuning" the software too low because of too many previous "false" positives, and halving the number of safety drivers in their test cars. Of course, had the safety driver been alert, he could have at least swerved or hit the brakes, lessening the impact of the collision with the woman.
    Yeah, my guess is that Uber quickly got that video (doctored, even?) to the police to help sway the initial report... or maybe there was even some pressure to get that initial 'wasn't Uber's fault' statement out into the mainstream press so quickly. And that's the impression most people have, even though the experts (and people who routinely drive the area) have been weighing in with opposite reports.

    At least it seems the Governor has finally gotten some sense and booted this stuff out of the state.
    Considering the state had absolutely no oversight of Uber, I do wonder how the investigation will go. It'll probably fall into the too-hard basket and we'll hear nothing more of it. But Uber jumped in very quickly and settled the case with the family - no doubt offered several million and washed their hands of it. A pity; I'd have loved to see it go through the courts.
  • Reply 22 of 32
    PTSD
    I have a rather contrary viewpoint:

    The woman with the bicycle had it coming, and she deserved what happened. 

    In the USA, ALL bicycles are equipped with reflectors to increase visibility at night.  By law. 

    She removed them, as is clearly shown in the video. 

    She had no lights. 

    She throws herself randomly into rushing traffic, narrowly missed by the car to the Uber’s left. If she had stepped off the curb five seconds earlier, she would have been struck by that human-driven car, and this wouldn’t be a story. Just evolution in action. 

    She was stupid, thoughtless, and I hate the way that this story has developed.
  • Reply 23 of 32
    MplsP said:
    To be fair, I saw the video and the woman appeared out of nowhere; I doubt a human driver could have reacted in time. Now if the car had an infrared camera, that should have been able to see the woman, but I haven't seen any report from Uber as to what the sensors picked up and whether it was a failure of programming or sensing.

    Yeah, if this is the accident that was in the news a while back, the person basically darted out in front of the car without warning, and in a place they weren't supposed to be doing so, i.e. not in a crosswalk or other pedestrian crossing. No one would find it at all remarkable had a human driver killed this woman in the same scenario (nor would that driver have been considered at fault), but because a self-driving car did it, "the technology is too dangerous."
  • Reply 24 of 32
    lkrupp
    Soli said:
    Uber takes shortcuts and makes ethically questionable decisions that negatively affect people. Shocking¡

    lkrupp said:
    Well it’s starting to look like self-driving cars will go the way of Google Glass.
    No it isn’t.
    From the point of view of a techie they aren't, but in the public's mind they are toast. Just the thought of being in a moving vehicle without a human driver turns people off. Everyone in my social circle hates the very idea, and my social circle (retired) is the very demographic this technology would help. Modern commercial jet airplanes can literally fly themselves, so why do we still have a pilot and co-pilot in the cockpit? Because human beings wouldn't get on one without them. And if you have an automobile that can drive itself BUT still need a driver to pay attention just in case, then what's the point? Sure, I would love to get in my car, tell it where I want to go, and then take a nap while en route or watch TV, but that's simply not going to happen anytime soon. What may happen is the technology being deployed on commercial transportation like a Greyhound bus BUT with a driver in the cockpit, just like an airplane. Sci-fi writers have been predicting self-driving or flying cars for 75 years or more. No such luck.
  • Reply 25 of 32
    lkrupp
    PTSD said:
    I have a rather contrary viewpoint:

    The woman with the bicycle had it coming, and she deserved what happened. 

    In the USA, ALL bicycles are equipped with reflectors to increase visibility at night.  By law. 

    She removed them, as is clearly shown in the video. 

    She had no lights. 

    She throws herself randomly into rushing traffic, narrowly missed by the car to the Uber’s left. If she had stepped off the curb five seconds earlier, she would have been struck by that human-driven car, and this wouldn’t be a story. Just evolution in action. 

    She was stupid, thoughtless, and I hate the way that this story has developed.
    I don’t think she “deserved” it but you’re right in one sense. Not even a fully alert human driver could have averted the collision but the media have hyped this into a frenzy about an AI car killing an innocent old lady. In a previous post I said this technology will go the way of Google Glass and I firmly believe that because of things like this.
  • Reply 26 of 32
    lkrupp said:
    Well it’s starting to look like self-driving cars will go the way of Google Glass. The public will simply not accept them. Something like 50,000 people in the U.S. are killed each year in automobile accidents and no one bats an eyelash. Yet people are indeed convinced that driving a car is safer than flying in an airplane even though ALL the data and statistics prove otherwise. All it took was a couple of fatalities for the recriminations and hand-wringing over autonomous vehicles to start. Add to that the state of the legal system in this country and you know the injury lawyers are salivating over the prospects. Much easier for a jury to blame a machine than a human. Self-driving cars are a solution to a problem no one cares about and no one believes needs solving. Those testing them just don’t know it yet. Apple and others will never EVER convince the general public that these vehicles are safe. Yes, momma, stupid is as stupid does. And you just know that there will be those “Here, hold my beer while I try to fuck up this auto-car here, Bubba” moments.
    There is zero evidence that self-driving cars, or the pursuit of perfecting that technology, are anywhere near the saturation or use of Google Glass. I own and drive a semi-autonomous car that’s saved my life from my terrible driving at least 3x this year, but I still can’t buy a set of AR glasses. You’re a moron.

    Especially since Google was early on Glass, not wrong. Apple is 10 minutes from dropping their own version of that tech, and I might add... 100 people a day in America die by car. Every day. Autonomous driving tech could drive that number well below 50. If self-driving tech killed one American a day for 50 years it would still save lives... to argue about what the American people want or don’t want, fear or don’t fear, is the dumbest articulation of capitalism ever.

    People will die by self-driven cars. It’s a fact. The tabloid-style news hunting and pecking for whatever eyeball-grabbing event happens is what’s really going on. Uber should not be in any business, let alone such a controversial and complex one. And that’s why this ended.

    Every Volvo Uber bought was equipped with emergency braking mitigation - Uber had it disconnected. Uber is at fault for the death. Not the driver, the car, or the iPhone; it was a mistake so clear that even Volvo has demanded Uber be held responsible. Hyundais can stop themselves. Hyundais!

    Uber is dangerous.



  • Reply 27 of 32
    Soli
    lkrupp said:
    Soli said:
    Uber takes shortcuts and makes ethically questionable decisions that negatively affect people. Shocking¡

    lkrupp said:
    Well it’s starting to look like self-driving cars will go the way of Google Glass.
    No it isn’t.
    From the point of view of a techie they aren't, but in the public's mind they are toast. Just the thought of being in a moving vehicle without a human driver turns people off. Everyone in my social circle hates the very idea, and my social circle (retired) is the very demographic this technology would help. Modern commercial jet airplanes can literally fly themselves, so why do we still have a pilot and co-pilot in the cockpit? Because human beings wouldn't get on one without them. And if you have an automobile that can drive itself BUT still need a driver to pay attention just in case, then what's the point? Sure, I would love to get in my car, tell it where I want to go, and then take a nap while en route or watch TV, but that's simply not going to happen anytime soon. What may happen is the technology being deployed on commercial transportation like a Greyhound bus BUT with a driver in the cockpit, just like an airplane. Sci-fi writers have been predicting self-driving or flying cars for 75 years or more. No such luck.
    That's the problem. You're going off ridiculous fiction. Even near-future predictions are wrong, because people never consider how technology alters society, which in turn alters the technology. This notion of countless flying cars in the air, without any consideration of all the safety requirements that would have to be set up, means you're already not framing the problem correctly.

    People thought cruise control was too dangerous… it wasn't. Now we have adaptive cruise control, which uses sensors so that a vehicle moving at a set speed will slow down if an object comes in front of it. People thought that was too dangerous… it wasn't. We also have sensors that help make sure the vehicle is staying in its lane. These are all very minor steps that have been part of the automation and safety of the automobile from its inception, and despite your flying-car example, new advancements are happening every year and they've all moved from being purely mechanical to being computer-controlled.
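    To make that concrete, here's a minimal sketch of the kind of logic adaptive cruise control applies each control cycle. It's purely illustrative; the function name, parameters, and thresholds are invented for the example, not taken from any real system.

    ```python
    # Illustrative only: simplified adaptive-cruise-control decision.
    # Speeds in m/s, distances in m; the thresholds are made-up values.

    def acc_target_speed(set_speed, lead_distance, min_time_gap=2.0,
                         hard_stop_distance=5.0):
        """Pick the speed to aim for this control cycle."""
        if lead_distance is None:            # nothing detected ahead
            return set_speed                 # cruise at the driver's set speed
        if lead_distance <= hard_stop_distance:
            return 0.0                       # dangerously close: command a stop
        # Otherwise keep at least `min_time_gap` seconds of headway.
        max_safe_speed = lead_distance / min_time_gap
        return min(set_speed, max_safe_speed)

    # Example: set speed 30 m/s (~67 mph), vehicle detected 40 m ahead
    print(acc_target_speed(30.0, 40.0))      # -> 20.0, i.e. slow down
    print(acc_target_speed(30.0, None))      # -> 30.0, open road, hold set speed
    ```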

    Hell, I'm sure I can find old, anti-tech people who will say they're safer not wearing a seat belt, who will tell me some (possibly fake, but certainly not researched) story about an airbag that blew while someone was driving and who therefore believe all airbags are too dangerous, who will say it's safer to pump the brakes at their own discretion than to rely on computer-controlled ABS, and who truly believe they shift more efficiently than a modern automatic transmission.
  • Reply 28 of 32
    Don't lump all self-driving car projects together. Google has spent the past decade researching, implementing and very carefully testing their system. Uber popped up out of nowhere and clearly faked theirs. Uber should be trying to find those responsible for allowing a car on the road that could ignore a person crossing it, and for allowing a driver to be staring at her cell phone instead of watching for danger.
  • Reply 29 of 32
    MplsP
    I just read the preliminary NTSB report - the woman was wearing dark clothing, she was walking her bike across an unlit/poorly lit portion of the street outside of a crosswalk, the bike had no side reflectors and she had marijuana and methamphetamine in her system.

    The autonomous Volvo had its factory emergency braking system disabled. The autonomous sensors did detect the woman and determined that braking was needed 1.3 seconds before hitting her, but the emergency braking in the autonomous system had been disabled "to reduce the potential for erratic behavior."

    So the system functioned as intended; the intentions were just idiotic. Not only did they deactivate the emergency braking (arguably one of the most important systems), they took a 'driver' and expected them to sit in an autonomous vehicle doing virtually nothing for 8 hours and yet be alert enough to react in a split second.
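    A rough back-of-the-envelope on what that 1.3-second window could have meant. The ~40 mph initial speed and ~0.7 g braking deceleration below are assumptions for illustration, not figures from the NTSB report:

    ```python
    # Assumed values, for illustration only (not from the NTSB report):
    # initial speed ~40 mph, hard-braking deceleration ~0.7 g, and the
    # 1.3 s between "braking needed" and impact, ignoring actuation delay.

    MPH_TO_MS = 0.44704
    G = 9.81                       # m/s^2

    v0 = 40 * MPH_TO_MS            # ~17.9 m/s, assumed initial speed
    a = 0.7 * G                    # ~6.9 m/s^2, assumed braking deceleration
    t = 1.3                        # seconds available before impact

    v_impact = max(v0 - a * t, 0.0)
    print(f"impact at ~{v_impact / MPH_TO_MS:.0f} mph with braking, "
          f"vs ~{v0 / MPH_TO_MS:.0f} mph without")
    # -> roughly 20 mph instead of 40 mph; since kinetic energy scales with
    #    the square of speed, that's about a quarter of the impact energy.
    ```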

    This actually illustrates a bigger issue with autonomous systems - if I'm a bad driver, I crash and kill myself or a few other people. If Uber/Google/Apple/Ford/??? creates a system and programs it incorrectly, it potentially affects thousands of people. This isn't to say we can't use the systems; the same issues apply to aircraft, train systems, etc. It just means the quality control and testing of the systems need to be thorough. Given the rush and competition to get these systems to market, I'm not terribly confident that there won't be some corners cut somewhere along the way.
  • Reply 30 of 32
    gatorguy
    MplsP said:
    To be fair, I saw the video and the woman appeared out of nowhere; I doubt a human driver could have reacted in time. Now if the car had an infrared camera, that should have been able to see the woman, but I haven't seen any report from Uber as to what the sensors picked up and whether it was a failure of programming or sensing.
    The news reported today indicates the sensors worked fine and "saw" the woman. It was faulty programming.
  • Reply 31 of 32
    ascii
    Human beings have a subconscious mind which handles things we have already learned thoroughly, such as riding a bike or driving the same route to work each day, and handles them without thinking (essentially a human form of automation). If we can automate it in our own brains, surely we can (eventually) automate it in a computer.

    But the difference with humans is that the executive/thinking function is always available to turn on in a split second to deal with novel situations. Computers won't have that fallback. Auto-drive is like something that is better than you at the stuff you do every day, but as soon as something novel happens it is a deer in the headlights.

    What is needed is a clever way for such a system to realise when it has encountered a novel situation and hit the brakes and pull over, instead of trying to muddle through. And just saying "Oh, it's the responsibility of the human to take over in such situations" is unrealistic, because executive function will be required so rarely that people will simply not be able to stay focussed. Until such a novelty detection algorithm can be developed these things should stay on the test track.
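    In pseudocode terms, that fallback might look something like the sketch below. It's purely illustrative; the scores, thresholds, and names are invented, and the hard part (actually computing a trustworthy novelty score) is exactly what's hand-waved here.

    ```python
    # Illustrative "novelty gate": if the scene looks too unlike anything the
    # system was trained and validated on, stop trying to muddle through and
    # execute a minimal-risk maneuver instead. All values are invented.

    def choose_action(novelty_score: float, detection_confidence: float) -> str:
        """novelty_score: 0 = routine scene, 1 = nothing like the training data.
        detection_confidence: how sure perception is about what it sees."""
        if novelty_score > 0.8 or detection_confidence < 0.5:
            return "minimal_risk_maneuver"      # brake smoothly, pull over
        return "continue_autonomous_driving"

    print(choose_action(0.9, 0.95))  # novel scene -> minimal_risk_maneuver
    print(choose_action(0.1, 0.95))  # routine scene -> continue_autonomous_driving
    ```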
  • Reply 32 of 32
    cgWerks
    franklinjackcon said:
    Considering the state had absolutely no oversight of Uber, I do wonder how the investigation will go. It'll probably fall into the too-hard basket and we'll hear nothing more of it. But Uber jumped in very quickly and settled the case with the family - no doubt offered several million and washed their hands of it. A pity; I'd have loved to see it go through the courts.
    Well, here are the preliminary investigation results... the system detected her 6 seconds before impact:
    http://enews.cnet.com/ct/47798078:Wdklq0VNd:m:1:634142495:521785C2F0C1E148FD028907D2CAD286:r:100000000000000000000000000174567

    PTSD said:
    I have a rather contrary viewpoint:
    The woman with the bicycle had it coming, and she deserved what happened. ...  If she had stepped off the curb five seconds earlier, she would have been struck by that human-driven car, and this wouldn’t be a story. Just evolution in action. ...
    Well, have you driven in the real world recently? This is the stuff an AI car will have to deal with, as we all have to each day. People walk unexpectedly into streets from odd places all the time. Sometimes human drivers, indeed, can't avoid hitting them. But, a human driver would have ***EASILY*** avoided hitting this woman.

    I had a near miss this winter when someone (probably on drugs) was leaning in a parked car's window having a conversation, and suddenly turned and walked into traffic right in front of me (he never even looked over as I slid to a stop). Know how I missed him? Body language... I noticed it was an odd situation, so I started slowing, ready to brake as the situation unfolded. I also noticed him starting to move, etc. An AI vehicle wouldn't have stood a chance.

    beowulfschmidt said:
    Yeah, if this is the accident that was in the news a while back, the person basically darted out in front of the car without warning, and in a place they weren't supposed to be doing so, i.e. not in a crosswalk or other pedestrian crossing. No one would find it at all remarkable had a human driver killed this woman in the same scenario (nor would that driver have been considered at fault), but because a self-driving car did it, "the technology is too dangerous."
    A human driver wouldn't have hit her unless they were texting or otherwise not paying attention. The reason you have the opinion you do is because that's how Uber, the police, and the MSM quickly painted the situation.

    lkrupp said:
    ... Modern commercial jet airplanes can literally fly themselves, so why do we still have a pilot and co-pilot in the cockpit? Because human beings wouldn't get on one without them. ... Sure, I would love to get in my car, tell it where I want to go, and then take a nap while en route or watch TV, but that's simply not going to happen anytime soon....
    Besides flying an aircraft being considerably easier... heck no, I wouldn't be getting on an airplane with no pilot!!!
    Be careful my friend, or you'll end up with the next Darwin award.

    bradford_kirby said:
    ... 100 people a day in America die by car. Every day. Autonomous driving tech could drive that number well below 50. If self-driving tech killed one American a day for 50 years it would still save lives...
    I think you need to run the numbers a bit better there. How many human drivers are driving, and how many miles? It's about 1 death per 100 million miles. Have AI cars even racked up enough miles to justify 1 death by human-driven standards (let alone 5 or 6 already)?
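    Roughing out that rate (the ~37,000 deaths and ~3.2 trillion vehicle-miles per year below are approximate US figures assumed for the example, not numbers from this thread):

    ```python
    # Approximate US figures, assumed for illustration:
    human_deaths_per_year = 37_000
    human_miles_per_year = 3.2e12        # vehicle-miles travelled per year

    rate = human_deaths_per_year / (human_miles_per_year / 1e8)
    print(f"human rate: ~{rate:.1f} deaths per 100 million miles")
    # -> ~1.2, i.e. roughly the "1 death per 100 million miles" quoted above

    # Miles an autonomous fleet would need per fatality just to break even:
    print(f"~{1e8 / rate / 1e6:.0f} million AV miles per death to match humans")
    # -> ~86 million miles per fatality
    ```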

    Soli said:
    Hell, I'm sure I can find old, anti-tech people ... who will say it's safer to pump the brakes at their own discretion than to rely on computer-controlled ABS, and who truly believe they shift more efficiently than a modern automatic transmission.
    You had me until you got to this part. Yes, many of the assistive technologies are fantastic, because they help *REAL DRIVERS* drive better. But you still need a real driver. AI won't ever really be driving (in the sense a human does). It's just following a sophisticated routine (that, to some, looks similar), kind of like how Google's new phone-call-maker thingy fools some into thinking it is having a conversation.

    But, yes, you can stop faster without ABS, and shift better than a modern automatic. These technologies are mainly good in panic situations when most humans don't perform as they might have been trained, or more often these days, don't ever have the training in the first place. As I said in previous posts, if we really cared about saving lives, we could do a LOT, LOT better with human drivers to reduce the current deaths to a fraction.

    Don't lump all self-driving car projects together. Google has spent the past decade researching, implementing and very carefully testing their system. Uber popped up out of nowhere and clearly faked theirs. Uber should be trying to find those responsible for allowing a car on the road that could ignore a person crossing it, and for allowing a driver to be staring at her cell phone instead of watching for danger.
    Yes, I'll give you that much: this situation was particularly bad, even for AI. Uber should be run out of business for this. Insanely irresponsible... no, evil. However, the *why* they did it is important. They turned it off because the car drove too erratically with it turned on... the way AI cars do (i.e., first-day driver's-ed student on crack). The general public wouldn't be amazed at that, and probably wouldn't want them out in real traffic yet (understandably).

    ascii said:
    Human beings have a subconscious mind which handles things we have already learned thoroughly, such as riding a bike or driving the same route to work each day, and handles them without thinking (essentially a human form of automation). If we can automate it in our own brains, surely we can (eventually) automate it in a computer.

    But the difference with humans is that the executive/thinking function is always available to turn on in a split second to deal with novel situations. Computers won't have that fallback. Auto-drive is like something that is better than you at the stuff you do every day, but as soon as something novel happens it is a deer in the headlights.

    What is needed is a clever way for such a system to realise when it has encountered a novel situation and hit the brakes and pull over, instead of trying to muddle through. And just saying "Oh, it's the responsibility of the human to take over in such situations" is unrealistic, because executive function will be required so rarely that people will simply not be able to stay focussed. Until such a novelty detection algorithm can be developed these things should stay on the test track.
    Yeah, kind of. I think you're on the right track here, though I'd imagine that could get dangerous too.
    But, yes, it's kind of a 'one of these things is not like the other' situation.