Pedestrian killed by Uber self-driving car, testing stops in all cities

135 Comments

  • Reply 41 of 91
    mavemufc Posts: 326 member
    Well, that’s just not good.
  • Reply 42 of 91
    dewme Posts: 5,376 member
    This event is sad but also inevitable. In every application where people and mechanical/mechatronic automation physically overlap in the same space, accidents have always occurred, and there is always an inherent danger to people. From a safety standpoint, self-driving automobiles are similar to trains, subways, elevators, escalators, and factory automation machinery/robots/AGVs in terms of danger to people. The biggest difference, of course, is that most of these other systems are a bit clearer about delineating the areas where risk to human life arises if unintended machine-human contact ensues. It should come as no surprise to anyone that walking or parking on an active railroad track is dangerous. Yet people still get killed on railroad tracks, and not infrequently. Roadways on which everything from 2-ton subcompacts to 40-ton semis travel are as dangerous for people as train tracks, and are often marked as such, but people seem to have little hesitation about entering roadways with limited or suppressed/distracted situational awareness. At least that's what the nearly 6,000 pedestrians killed by human-driven vehicles last year seem to indicate. So now we have one victim of an automated vehicle to add to the list.

    I don't think all of the danger can be eliminated as long as people and machines share the same physical space. I do think that with improvements in sensor technology, data fusion, and AI, automated vehicles will get substantially safer over time, and pedestrian-related accidents with automated vehicles will drastically decrease. On the other hand, pedestrian-related accidents caused by human-driven vehicles continue to get worse year after year. I don't know how human drivers can be systematically improved so that they cause fewer pedestrian deaths. As long as there is a human in the loop there will always be an inherent danger.
  • Reply 43 of 91
    anome Posts: 1,533 member
    tundraboy said:
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.
    That's beside the point.  All these companies are testing deadly technology on public roadways without the permission of all the other people using those roadways.  They are putting people in danger without those people's permission or even knowledge.

    Actually, that's not strictly true. If they have been given permission by the authorities, then they have the permission of the other people on the road. That's how representative democracy works. You elect people to decide these things. Even where the ultimate decision is made by a non-elected official (such as someone at the DMV) the framework for deciding what's acceptable, and the rules around letting these things on the road are established by legislation which is enacted by the elected officials of your local, state, or federal legislature.

    That said, Uber do have a history of dodging relevant rules and legislation. If they didn't have permission to be doing this, then they will presumably be in a world of trouble.

    "Knowledge" is a bit trickier. They do have an obligation to let people know this is happening. Unfortunately, media coverage, public service announcements, and even massive billboards all over the place can only go so far. Even with all that, there may be people unaware that there are driverless cars in their neighbourhood.

  • Reply 44 of 91
    SpamSandwich Posts: 33,407 member
    As a pedestrian I've almost been hit crossing the street (legally, I should add) 3 times over the past several years... and that's WITHOUT an autonomous vehicle being involved!
    edited March 2018
  • Reply 45 of 91
    jbdragon Posts: 2,311 member
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.
    There are also MILLIONS more cars on the road driven by humans. And a self-driving car shouldn't be distracted in any of the ways a human can be.
  • Reply 46 of 91
    NemWan Posts: 118 member
    Logically, self-driving cars don't have to be perfect; they just have to kill fewer people per mile driven than human drivers.
  • Reply 47 of 91
    SpamSandwich Posts: 33,407 member
    NemWan said:
    Logically, self-driving cars don't have to be perfect; they just have to kill fewer people per mile driven than human drivers.
    And so far, they are killing fewer people per mile.
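    The "per mile driven" framing in these two posts is just a rate normalization, and it's worth seeing why the denominator matters. Here is a minimal sketch; all figures in it are illustrative placeholders (the 40,200-death count comes from the thread, but the mileage totals are assumptions, not verified statistics):

    ```python
    def deaths_per_100m_miles(deaths: float, miles: float) -> float:
        """Fatalities normalized per 100 million vehicle miles."""
        return deaths / miles * 100_000_000

    # Assumed, illustrative figures -- not real statistics:
    # ~3.2 trillion human-driven miles vs. a hypothetical small AV fleet.
    human = deaths_per_100m_miles(deaths=40_200, miles=3.2e12)
    autonomous = deaths_per_100m_miles(deaths=1, miles=1.0e7)

    # With a small mileage denominator, a single fatality can make the
    # per-mile rate look very high -- raw death counts alone tell you little.
    print(f"human: {human:.2f} per 100M miles")
    print(f"autonomous (toy figures): {autonomous:.2f} per 100M miles")
    ```

    The point of the sketch is that comparing safety requires dividing by exposure; with few autonomous miles on record, the per-mile rate is extremely sensitive to a single incident.
    
    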
  • Reply 48 of 91
    maestro64 Posts: 5,043 member
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.

    Humans are fallible; machines are not allowed to make mistakes or have accidents. Most of the time you can forgive a human, but you can never forgive a machine.
  • Reply 49 of 91
    maestro64 Posts: 5,043 member
    First pedestrian killed by a self-driving car. Way to go Uber!
    Uber is just trying to demonstrate how much safer it is to travel in an Uber than on foot.

    Actually, the self-driving program demonstrated its prime directive, which is to save the passengers under its care at all costs. The decision was to kill the person on the street or kill the passengers; you know which it chose.
  • Reply 50 of 91
    maestro64 Posts: 5,043 member
    tundraboy said:
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.
    That's beside the point.  All these companies are testing deadly technology on public roadways without the permission of all the other people using those roadways.  They are putting people in danger without those people's permission or even knowledge.  This is the result of Silicon Valley beta-testing mentality being applied to a product that should not be subject to such mentality.  Here are the things that should be happening but aren't happening with respect to self-driving technology:

    1.  Self-driving tech should be more thoroughly tested in a closed course with realistically simulated vehicle and pedestrian traffic before they are unleashed on public roads.
    2.  But even before closed-course testing is authorized, the feds should research then set up performance and safety standards for closed-course testing.  Standards that should be achieved before vehicles are allowed to use public roads.
    3.  Vehicles that are then tested on public roads should be clearly marked and be required to follow a strict protocol as to routes, hours, environmental conditions, and expansion thereof.  With progression to more complex routes and environments, and more hours contingent on achieving properly calibrated safety and performance benchmarks.


    First, humans are not held to this standard. We let humans drive in the wild without demonstrating any level of proficiency under all road conditions, so why hold the machine to a higher standard than we hold ourselves? That's why we have 40,000 deaths each year.

    There is only one test I want to see a self-driving car pass: winning a NASCAR race. Especially one with crashes.

    edited March 2018
  • Reply 51 of 91
    georgie01 Posts: 436 member
    The thing that concerns me most about this is that there doesn’t seem to be any mention of who’s responsible for her death. Just because a computer was driving doesn’t mean someone isn’t responsible, and in this case those responsible are the people who developed the product. The car didn’t create itself... its errors are due to the errors of its creators.

    If companies developing these cars were actually held responsible for people’s deaths, as they should be, we would undoubtedly end up with considerably safer autonomous cars.
  • Reply 52 of 91
    maestro64 Posts: 5,043 member
    georgie01 said:
    The thing that concerns me most about this is that there doesn’t seem to be any mention of who’s responsible for her death. Just because a computer was driving doesn’t mean someone isn’t responsible, and in this case those responsible are the people who developed the product. The car didn’t create itself... its errors are due to the errors of its creators.

    If companies developing these cars were actually held responsible for people’s deaths, as they should be, we would undoubtedly end up with considerably safer autonomous cars.

    Good question. When a bridge is built, the PE (Professional Engineer) must sign off on the design and is responsible for its construction. If the bridge falls down (think FL last week), someone is going to be held responsible for the deaths. It could be the PE, though more likely the company that put up the bridge. I do not think there is a requirement to have a PE sign off on the design of a self-driving system. In this case, should the programmer be held responsible if the program was at fault and failed to detect the pedestrian?

    Something else to think about: many times when pedestrians are hit, the pedestrian is at fault even though they have the right of way. Pedestrians often cross against the light or do something else wrong. I suspect in this case, even if the pedestrian was 100% in the wrong, the company responsible for the car will be held responsible.
  • Reply 53 of 91
    twistsol1 said:
    Then again, 40,200 people were killed in 2016 by human drivers.
    And a portion of those stats came from drivers killing themselves...
    edited March 2018
  • Reply 54 of 91
    gatorguy Posts: 24,213 member
    georgie01 said:
    The thing that concerns me most about this is that there doesn’t seem to be any mention of who’s responsible for her death. Just because a computer was driving doesn’t mean someone isn’t responsible, and in this case those responsible are the people who developed the product. The car didn’t create itself... its errors are due to the errors of its creators.

    If companies developing these cars were actually held responsible for people’s deaths, as they should be, we would undoubtedly end up with considerably safer autonomous cars.
    It's entirely possible the pedestrian was totally at fault. 
  • Reply 55 of 91
    First pedestrian killed by a self-driving car. Way to go Uber!
    Uber is just trying to demonstrate how much safer it is to travel in an Uber than on foot.
    Uber (to a pedestrian): Want a ride?
    Pedestrian: Nah, I'm fine just walking.
    Uber: It would be a shame if something were to happen to you while you do...
    edited March 2018
  • Reply 56 of 91
    maestro64 Posts: 5,043 member
    Folks, not sure anyone else noticed, but this was not a pedestrian; it was a cyclist. From the damage to the car, it looks like the cyclist hit the passenger-side front hood. I think what happened is a little more complicated.
  • Reply 57 of 91
    stevenoz Posts: 314 member
    Way-no-mo.
  • Reply 58 of 91
    radarthekat Posts: 3,843 moderator
    Here’s my take on what Apple should do, and it doesn’t involve building vehicles or developing autonomous driving technology... 

    The car of the future is already here. It's called a smartphone. Think about it. If you were to clear the slate, look at the modern world, and ask yourself how you would design a transportation system given existing and soon-to-come technologies, like autonomous driving, real-time availability scheduling, route optimization, etc., there's no way you'd conclude there should be a car, or two, in every garage. You'd create a technology/software infrastructure to allow individuals to call up the transportation they need (car, truck, van, etc.) on demand. And it would show up wherever they are, or wherever they are going to be, when it's needed. You'd be able to schedule transportation in advance, like the airport shuttles of yesteryear that you'd book a week ahead. Uber pretty much killed that business, I expect.

    Or schedule recurring transportation, such as taking the kids to soccer practice and back. In this case the system might suggest a shared van service that knows the schedules for local after-school sports practice and constructs pick-up and drop-off routes based on participation; a regular route to gather up the kids and deliver them. Security accommodations will be needed when children are transported without accompanying parents, such as real-time tracking and a constant open line of communication, with both audio and video streaming from the vehicle to the parents' smartphones.

    The specific vehicle that arrives can be determined by the number of passengers, whether you'll be transporting something large or just yourself, etc. The notion of owning, maintaining, parking, and insuring a personal vehicle has, for many people, already begun to feel like 'the old paradigm.'

    To create this infrastructure, you need route optimization software that incorporates the real-time whereabouts of all vehicles in a local fleet. You need scheduling software. You need to track the remaining charge/range of each vehicle in service to know when it can accommodate an additional requested or scheduled route without running out of juice. You need to accommodate stand-by, where the vehicle drops someone off and is asked to wait for an indeterminate time while the person runs an errand in a store or bank, perhaps because they have packages or groceries left behind in the vehicle. In short, you need a very sophisticated set of interacting technologies to support smooth operation of a transportation network that responds almost immediately to a population's constantly fluctuating needs.

    If I were Tim Cook, this is exactly the way I'd envision the future, and this is what I'd set out to create. It's not so much about building vehicles yourself as about getting buy-in from all vehicle manufacturers so that their vehicles can work within the envisioned transportation network. That means people who do own vehicles could lend them to their local autonomous fleet in order to earn money (Musk has already suggested this, and it makes sense for a vehicle maker to accommodate, as it helps him sell more Teslas direct to consumers). It means new rental fleets will simply be staged in large metro areas, with one or more depots the vehicles return to for recharging, maintenance, cleaning, etc. And that means there's a path forward for the rental companies, because they already have staging areas for their existing fleets. The big picture can be accommodated during a transition from the world we have today to a world where almost all transportation is shared and autonomous.

    Extend this to trucking, intercity busing, etc., and the whole thing becomes a future that Apple could play a major role in developing. Without ever producing a single vehicle of their own.

    Also key to this is that everything Apple needs to do to revolutionize transportation does not require Apple to do any work on autonomous driving, nor does Apple need to build a single vehicle model.  Nope, Apple will want to own the end user interaction used to summon and schedule transportation, and it'll want to own the route optimization algorithms and server side scheduling and dispatch.  And take a cut of every ride.  

    There will need to be some tech in each car to pick up the user interaction that began on a rider's smartphone or watch/wearable, once the car arrives to pick up the rider.  The car will need a voice interface to interact with the rider.  The car will need to constantly ping its whereabouts to the dispatch and scheduling servers, along with its charge level, so that the dispatch system can determine its next pick up and determine when it needs to exit the active fleet and return to a nearby depot for recharging or maintenance.  The car will need to contain sensors, like internal cameras, to monitor for left-behind packages, spilled coffee, dangerous items, etc, and report appropriately to riders or to dispatch.  The car will need streaming audio/video capabilities to stream to parents when children are riding without adult accompaniment.  All of this can be designed as a set of interfaces that automakers can implement in order to be compatible with Apple's dispatch and routing servers, and the vehicles might also be required to utilize Apple's mapping infrastructure.  

    Once verified as able to serve a ride request, the car is handed details on the rider's location and destination, and it can then use its own autonomous driving capabilities to serve the request. All of this can integrate both driverless and human-driven vehicles into the same service. So as vehicles licensed for autonomous operation are developed, they can be added to an existing Uber-like fleet of human-driven vehicles, both serving together as a centrally requested and dispatched swarm covering a metropolitan area. Eventually, the human-driven vehicles would all be replaced with autonomous vehicles, and the future will have arrived.
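    The dispatch loop described in this post (match a ride request to a nearby vehicle with enough charge, keeping a reserve to reach a depot) can be sketched in a few lines. This is a toy illustration only; the class, function names, and thresholds are hypothetical, not any real fleet API:

    ```python
    from dataclasses import dataclass
    import math

    @dataclass
    class Vehicle:
        vid: str
        x: float            # current position (miles, toy flat grid)
        y: float
        range_miles: float  # remaining charge expressed as driving range

    def dist(ax, ay, bx, by):
        return math.hypot(ax - bx, ay - by)

    def dispatch(fleet, pickup, dropoff, depot_reserve=5.0):
        """Return the vehicle with the shortest total trip (deadhead to the
        pickup plus pickup-to-dropoff leg) that can still keep
        `depot_reserve` miles in hand to reach a depot afterward."""
        px, py = pickup
        dx, dy = dropoff
        best = None
        for v in fleet:
            trip = dist(v.x, v.y, px, py) + dist(px, py, dx, dy)
            if v.range_miles >= trip + depot_reserve:
                if best is None or trip < best[0]:
                    best = (trip, v)
        return best[1] if best else None  # None: no vehicle can serve it

    fleet = [Vehicle("a", 0, 0, 12.0), Vehicle("b", 1, 1, 50.0)]
    car = dispatch(fleet, pickup=(2, 2), dropoff=(10, 2))
    ```

    In this example vehicle "a" is closer but would dip below its depot reserve, so "b" is chosen; a real system would add traffic-aware routing, scheduling windows, and multi-rider pooling on top of this skeleton.
    
    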

  • Reply 59 of 91
    jbdragon Posts: 2,311 member
    maestro64 said:
    First pedestrian killed by a self-driving car. Way to go Uber!
    Uber is just trying to demonstrate how much safer it is to travel in an Uber than on foot.

    Actually, the self-driving program demonstrated its prime directive, which is to save the passengers under its care at all costs. The decision was to kill the person on the street or kill the passengers; you know which it chose.
    Sounds like you're saying it's KARR and not KITT.
  • Reply 60 of 91
    georgie01 said:
    The thing that concerns me most about this is that there doesn’t seem to be any mention of who’s responsible for her death. Just because a computer was driving doesn’t mean someone isn’t responsible, and in this case those responsible are the people who developed the product. The car didn’t create itself... its errors are due to the errors of its creators.

    If companies developing these cars were actually held responsible for people’s deaths, as they should be, we would undoubtedly end up with considerably safer autonomous cars.
    I guess we're about to find out the answer to this. They'll need to start by deciding whether the pedestrian was at fault. If she stepped out in front of the car in a 45 mph zone, in low light, and not at a crosswalk, I suspect Uber won't be held to account. If she stepped out with some distance between her and the car and the Uber didn't even slow down, then there will be a line of lawyers offering their services to the family.