AI 'drivers' will gain same legal status as human drivers in autonomous vehicles, NHTSA rules


Comments

  • Reply 21 of 74
    maestro64 Posts: 5,043 member
    volcan said:
    In most situations self-driving cars would be far safer, primarily because they will not break the law. Human drivers can be pretty dangerous: speeding, illegal turns, texting and cellphone usage while driving, drunk driving, tailgating, etc., etc. Personally, I obey all traffic laws. Obnoxious people driving illegally and irresponsibly are really annoying. I would be willing to give up my enjoyment of driving a fine automobile just to get rid of all the reckless drivers out there, so bring on the self-driving cars.

    Think about these self-driving cars interacting with these unpredictable humans: who do you think will win? I'll give you an example: in most DUI accidents, who lives and who dies most of the time? Computers today still do not have the ability to deal with the unpredictable. Humans have the ability to respond and react to things they have not dealt with in the past. I have personally avoided bad situations when I realized others around me were not driving the way they should; it is subtle things, but it adds up, and self-driving cars lack this ability.
  • Reply 22 of 74
    jony0 Posts: 378 member

    bsimpsen said:
    So when Google’s system is responsible for, say, five or more crashes, total, will the AI have its license revoked?

    Or will it just continue to be allowed to endanger people and remain in vehicles?

    I’ll bet the latter.
    In 2010, more than 30,000 died  and 2.24 million were injured in 5.4 million auto accidents in the US.

    (https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year)

    So, if the threshold for licensing is "better than a human", your estimate is off by six decimal places.

    A reasonable metric for licensing should be accidents per driven mile. Google's cars already beat humans by that metric. Granted, Google has limited its autonomous cars to relatively benign environments, and I suspect the licensing will restrict operation to proven operating envelopes. But, the rate of improvement of the technology will expand the envelope rapidly over the next decade. I think the outcome of the experiment will eventually be humans losing their right to drive, hopefully after losing their desire to.
    Agreed. I fully expect the obvious outcome that AI will be much more reliable than humans in the near future. Although it will require quite sophisticated technology and we may not be quite there yet, the bar is not really that high, considering how much unlawful conduct, DUI included, it has to beat. Self-driving trucks are already legal on highways in at least 2 states, albeit with a human ready to take over, and I suspect and hope that cars will first be deployed the same way.
    However, rest assured that the very first accident caused by an AI will have the authorities issuing an immediate moratorium on all AI pending a long investigation, while humans will be left free to continue their perennial and unavoidable slaughter on our roads, statistics be damned.
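    Just to put numbers on the "six decimal places" point, here is a quick back-of-the-envelope check. It is purely illustrative: the only inputs are the 2010 figures quoted above and the hypothetical 5-crash revocation threshold from the comment bsimpsen was answering.

    ```python
    import math

    # 2010 US figures quoted above
    accidents = 5_400_000   # auto accidents
    deaths = 30_000         # fatalities
    injuries = 2_240_000    # injuries

    # Hypothetical "revoke the AI's license" threshold from the comment above
    threshold = 5

    ratio = accidents / threshold
    print(f"human fleet crashes vs. 5-crash threshold: {ratio:,.0f}x "
          f"(~{math.log10(ratio):.0f} orders of magnitude)")
    print(f"deaths per accident:   {deaths / accidents:.4f}")
    print(f"injuries per accident: {injuries / accidents:.2f}")
    ```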
  • Reply 23 of 74
    maestro64 Posts: 5,043 member
    bsimpsen said:
    So when Google’s system is responsible for, say, five or more crashes, total, will the AI have its license revoked?

    Or will it just continue to be allowed to endanger people and remain in vehicles?

    I’ll bet the latter.
    In 2010, more than 30,000 died  and 2.24 million were injured in 5.4 million auto accidents in the US.

    (https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year)

    So, if the threshold for licensing is "better than a human", your estimate is off by six decimal places.

    A reasonable metric for licensing should be accidents per driven mile. Google's cars already beat humans by that metric. Granted, Google has limited its autonomous cars to relatively benign environments, and I suspect the licensing will restrict operation to proven operating envelopes. But, the rate of improvement of the technology will expand the envelope rapidly over the next decade. I think the outcome of the experiment will eventually be humans losing their right to drive, hopefully after losing their desire to.

    Let's be correct here: Google said none of its cars "caused" an accident, and it held that position for a long time, until someone challenged it. Google finally admitted the cars were in fact in accidents; they were just the victims, meaning someone else hit them and the poor person who was required to ride along was helpless to avoid getting hit.
    edited February 2016
  • Reply 24 of 74
    maestro64 said:
    If you were a non-vehicle operating passenger in an autonomous vehicle that you did not own, it'd be no different from you taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.


    Interesting point I did not think about. Today, insurance insures the car and the driver; if you are not driving, you are not required to have insurance. Also, GM and other car companies have jumped on the DRM bandwagon, claiming that you do not own the software which operates the car and therefore are not allowed to modify it; the software and control systems in the car are only licensed to you for use, not ownership. If you do not own the control systems, then legally you should not be required to insure them. Since you are not driving the car, you are not liable for its actions, and since you can bet Apple and Google will not let you own the control system, you should not have to insure the car for liability for hitting someone else. Now, if the car is damaged through no fault of yours or the control system's, you would still want insurance for that situation, especially if you still owe money on the car.

    Since the NHTSA is saying the car is a legal person, just like a corporation is considered a legal person, this means you can sue your car, or whoever controls the car.

    Going back to my question, which no one seems willing to answer: will the car kill you to save Bambi, and if so, who pays?

    It COULD be that each car will essentially be a separate LLC (Limited Liability Company) owned by the manufacturer (Google, for example, or Apple). Therefore, the LLC would be the responsible party for legal matters, and should the LLC be attacked as liable for damages or death, that is the entity that would be responsible in court and beholden to its insurance company.

    If few people actually OWN the cars in the future (and this is a likely scenario for younger people who have little interest in owning cars) and instead "hire" them a la Uber, then the states will no longer have driver insurance, licensing or annual fees to look forward to. Any associated fees would come from the LLCs at far lower rates. The states will (thankfully) have one fewer source of tax revenue to extract from their citizens; however, they will likely apply other new taxes on people to make up some of the difference... possibly a "road usage fee" which would accrue based on the number of miles traveled.

    I can also see the possibility of an increase in electric scooters or "moped"-like transportation, which does not require a license to own or operate.
    edited February 2016
  • Reply 25 of 74
    maestro64 Posts: 5,043 member
    jony0 said:
    maestro64 said:
    If you were a non-vehicle operating passenger in an autonomous vehicle that you did not own, it'd be no different from you taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.

    Going back to my question which no one seems to be willing to answer, will the car kill you to save bambi, if so who pays

    I trust that the AI manufacturers and the NHTSA will go over a lot of case scenarios and testing before certifying every single AI car design. To answer your question in particular, I would imagine that the obstacle-avoidance algorithms would have to, and will be able to, detect whether the obstacle is human or otherwise by assessing size, speed and general physical attributes; check whether the other lane has oncoming traffic or is free to go around the obstacle; immediately apply every braking scheme available; and swerve only if it is safe to do so. As for Bambi specifically, I live in deer country and am always mindful of my dad's early advice: slam on the brakes immediately, brace yourself for impact and stay in your lane unless you are absolutely sure. There have been too many stories here of people going into the ditch or into oncoming traffic and killing themselves to avoid wildlife. I would certainly hope this would also be the chosen algorithm.


    Yeah, you would hope, unless it is a person who jumps out versus an animal. Does it then kill you rather than the stupid person who jumped out in front of your car? The problem is there are so many scenarios that they cannot cover them all. Say there is a person crossing the road illegally, oncoming traffic the other way, and an innocent pedestrian standing to your right: who gets killed? Keep in mind, we humans have the ability to forgive someone for an unavoidable accident, and they do happen, but we humans do not extend that forgiveness to objects, and even less so to the companies behind those objects. This is the paradigm we are dealing with, and the designers of these kinds of systems fail to understand it; they just think it is neat to let a car drive itself. Also remember that these systems were developed by the military, and the military is in the business of killing people, so they did not worry about collateral damage; they just did not want US soldiers being killed.

    BTW, I was not looking for someone like you to answer; I was looking for Google and the other companies involved in this technology to tell me who dies in these cases. Unless Google can tell me who dies, I am not interested in putting my faith in them to get it right in all cases. And guess what, I bet I would change my mind on what is acceptable as time goes on.
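    To make concrete what we are arguing about, here is a rough sketch of the kind of priority logic jony0 describes above. To be clear, this is purely hypothetical; it is not anything Google or anyone else has published, and every rule in it is exactly the sort of decision somebody would have to write down and answer for, which is my whole point.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        kind: str          # "human", "animal", "debris" - classified from size, speed, shape
        in_our_lane: bool

    def choose_maneuver(obstacle: Obstacle, adjacent_lane_clear: bool) -> str:
        """Hypothetical avoidance policy along the lines described above:
        always brake hard, swerve only when the adjacent lane is verifiably clear,
        and otherwise stay in lane. Who gets hit falls straight out of this
        hand-written ordering."""
        if not obstacle.in_our_lane:
            return "brake_and_hold_lane"   # precautionary braking only
        if adjacent_lane_clear:
            return "brake_and_swerve"      # avoid impact without endangering anyone else
        if obstacle.kind == "animal":
            return "brake_and_hold_lane"   # the Bambi rule: brake, brace, stay in lane
        # A person in our lane and nowhere safe to swerve: braking in lane is all
        # that is left, because swerving just trades one victim for another.
        return "brake_and_hold_lane"

    print(choose_maneuver(Obstacle("animal", True), adjacent_lane_clear=False))
    ```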

    edited February 2016
  • Reply 26 of 74
    maestro64 said:
    volcan said:
    In most situations self-driving cars would be far safer, primarily because they will not break the law. Human drivers can be pretty dangerous: speeding, illegal turns, texting and cellphone usage while driving, drunk driving, tailgating, etc., etc. Personally, I obey all traffic laws. Obnoxious people driving illegally and irresponsibly are really annoying. I would be willing to give up my enjoyment of driving a fine automobile just to get rid of all the reckless drivers out there, so bring on the self-driving cars.

    Think about these self-driving cars interacting with these unpredictable humans: who do you think will win? I'll give you an example: in most DUI accidents, who lives and who dies most of the time? Computers today still do not have the ability to deal with the unpredictable. Humans have the ability to respond and react to things they have not dealt with in the past. I have personally avoided bad situations when I realized others around me were not driving the way they should; it is subtle things, but it adds up, and self-driving cars lack this ability.
    You underestimate the AI. Computers will react far faster than a person is capable of, and the software running them has more driving experience than any single person does.
  • Reply 27 of 74
    ACN Posts: 1 member
    Betcha a dollar that kpluck speeds, tailgates, and insists he's the best driver on the road. Sounds like he's one of that group of entitled people who feel the rules don't apply to them because they're "too smart." Dunning-Kruger effect.
    volcan said:
    kpluck said:

    I fully support you not driving. Based on what you said, you haven't a clue what really makes a driver dangerous.

    -kpluck
    Please do explain what REALLY makes a driver dangerous.



  • Reply 28 of 74

    maestro64 said:
    bsimpsen said:
    In 2010, more than 30,000 died  and 2.24 million were injured in 5.4 million auto accidents in the US.

    (https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year)

    So, if the threshold for licensing is "better than a human", your estimate is off by six decimal places.

    A reasonable metric for licensing should be accidents per driven mile. Google's cars already beat humans by that metric. Granted, Google has limited its autonomous cars to relatively benign environments, and I suspect the licensing will restrict operation to proven operating envelopes. But, the rate of improvement of the technology will expand the envelope rapidly over the next decade. I think the outcome of the experiment will eventually be humans losing their right to drive, hopefully after losing their desire to.

    Let's be correct here: Google said none of its cars "caused" an accident, and it held that position for a long time, until someone challenged it. Google finally admitted the cars were in fact in accidents; they were just the victims, meaning someone else hit them and the poor person who was required to ride along was helpless to avoid getting hit.
    The Google cars are better than a person at avoiding being hit. If they still get hit occasionally, it is only because there is a person in the other car, and that person is the one liable for the accident. If the other car had been self-driving as well, they wouldn't have been hit. People aren't physically capable of actively monitoring the full 360-degree environment they are in and correctly responding to an emergency while accounting for everything in every direction the way computers can.
  • Reply 29 of 74
    maestro64 Posts: 5,043 member
    linkman said:
    Right now if a human is driving in the USA and breaks a law, there is criminal liability. An individual is held responsible. Extend this concept to AI driving and either one of the riders is now responsible, or someone who owns/operates/designed/engineered the software, etc., can go to jail. I think that someone should definitely be held responsible for poorly implemented vehicles -- I don't want to get caught behind 3-mile traffic jams every day because some stupid AI-operated cars are being too cautious and driving slower than the minimum allowed speed.

    This basically happened a few months ago in California: http://www.theguardian.com/technology/2015/nov/13/google-self-driving-car-pulled-over-driving-too-slowly
    This is what I loved about living in CA: it was the only state I have driven in that will ticket you for not driving fast enough. When I lived there you could be 10 or 15 over and the police would not give you a second look as long as traffic was flowing. I used to see people get pulled over for lane changing, tailgating or going too slow; they felt those were far more dangerous than driving over the speed limit. Will these cars know when faster is acceptable and slower is dangerous?
  • Reply 30 of 74
    linkman said:
    Right now if a human is driving in the USA and breaks a law, there is criminal liability. An individual is held responsible. Extend this concept to AI driving and either one of the riders is now responsible, or someone who owns/operates/designed/engineered the software, etc., can go to jail. I think that someone should definitely be held responsible for poorly implemented vehicles -- I don't want to get caught behind 3-mile traffic jams every day because some stupid AI-operated cars are being too cautious and driving slower than the minimum allowed speed.

    This basically happened a few months ago in California: http://www.theguardian.com/technology/2015/nov/13/google-self-driving-car-pulled-over-driving-too-slowly
    Presumably, once autonomous systems are road-proven there will be no more need for speed limits, because the onboard AI will have enough information to account for 99.99% of the potential for accidents at any given moment, and that will be part of the calculus to determine a safe driving speed. No one person will be able to cause an accident because individual choice on a "public" road will have been eliminated. This may come to pass in 20-30 years.
    edited February 2016
  • Reply 31 of 74
    maestro64 Posts: 5,043 member
    bsimpsen said:
    No thanks. I'd rather drive my own car and enjoy the experience. People invent newer and newer technology which always appears to phase out the need for anything human. Who needs people at all eventually when you can have a computer do it for you? Such a developer mindset.
    If humans are more dangerous behind the wheel than computers, the insurance companies will know it, and adjust their rates accordingly. The cost for you to remain behind the wheel will rise until you give up. It's got nothing to do with a developer's mindset. It's pure economics.
    That statement is probably closer to the truth than not. It will be like Obamacare: they will make it more costly for you not to do what they want. This is how our government controls people; they do not need guns, they just make it too expensive for you to do what you want. Even when the government is wrong, it costs you far more than it costs them to prove you were right.

  • Reply 32 of 74
    linkman said:
    Right now if a human is driving in the USA and breaks a law, there is criminal liability. An individual is held responsible. Extend this concept to AI driving and either one of the riders is now responsible, or someone who owns/operates/designed/engineered the software, etc., can go to jail. I think that someone should definitely be held responsible for poorly implemented vehicles -- I don't want to get caught behind 3-mile traffic jams every day because some stupid AI-operated cars are being too cautious and driving slower than the minimum allowed speed.

    This basically happened a few months ago in California: http://www.theguardian.com/technology/2015/nov/13/google-self-driving-car-pulled-over-driving-too-slowly

    Many traffic jams are caused by people's poor reactions when they over-brake. There are roads I know that will have major traffic jams even though there is no reason for any car to ever stop, other than countless drivers over-braking. If these same roads had self-driving cars instead of cars driven by people, the traffic jams would go away. Other traffic jams are caused by accidents that self-driving cars would avoid.

    Also, self-driving cars would let the person who would have been driving do something else with their time: watch a show, do some work, play a game, talk to their family, even sleep. That has the potential to free up a massive amount of time and change the way people travel.

    edited February 2016
  • Reply 33 of 74
    maestro64 said:
    bsimpsen said:
    If humans are more dangerous behind the wheel than computers, the insurance companies will know it, and adjust their rates accordingly. The cost for you to remain behind the wheel will rise until you give up. It's got nothing to do with a developer's mindset. It's pure economics.
    That statement is probably closer to the truth than not. It will be like Obamacare: they will make it more costly for you not to do what they want. This is how our government controls people; they do not need guns, they just make it too expensive for you to do what you want. Even when the government is wrong, it costs you far more than it costs them to prove you were right.


    He wasn't referring to the government; he was referring to insurance companies. They already discount cars and drivers that are proven to be safer, and self-driving cars will be the safest of all. They don't fall asleep, they don't get distracted, they don't get drunk, they don't cut people off, they don't miss their blind spots, they don't tailgate, they don't panic, and they don't get impatient and take risks.
  • Reply 34 of 74
    bsimpsen said:
    In 2010, more than 30,000 died  and 2.24 million were injured in 5.4 million auto accidents in the US.

    So, if the threshold for licensing is "better than a human", your estimate is off by six decimal places.
    Except a human doesn’t have to kill 30,000 people or crash 5.4 million times to have his license revoked.

    If the AI is being treated as a driver, then it is being treated as a driver. Meaning legally. Meaning subject to the same stipulations as regular drivers. Meaning that when it, COLLECTIVELY–because each individual instance of the software is not magically its own “person”, because we’re already walking down the road to mentally defective psychopathic genocidal bullshit–has the same number of accidents as any given individual driver, it will have its license to drive revoked, and thus be inaccessible on any other remaining vehicles.

    Otherwise it isn’t being treated as a driver and would not have the same legal status.

    bsimpsen said:
    The cost for you to remain behind the wheel will rise until you give up.
    So I’ll drive without it. They can’t stop me. It’s not like it wouldn’t be safe, what with “everyone else” going to the “perfectly safe” automated system, right?
    edited February 2016
  • Reply 35 of 74
    alandail said:
    maestro64 said:
    That statement is probably closer to the truth than not. It will be like Obamacare: they will make it more costly for you not to do what they want. This is how our government controls people; they do not need guns, they just make it too expensive for you to do what you want. Even when the government is wrong, it costs you far more than it costs them to prove you were right.


    He wasn't referring to the government; he was referring to insurance companies. They already discount cars and drivers that are proven to be safer, and self-driving cars will be the safest of all. They don't fall asleep, they don't get distracted, they don't get drunk, they don't cut people off, they don't miss their blind spots, they don't tailgate, they don't panic, and they don't get impatient and take risks.
    He was; however, the insurance industry is HIGHLY regulated, which means that at many, if not most, levels government IS determining what insurance companies do. But the insurance companies also have very large lobbying arms in government; plus, people who work in government get hired by the industry, and people who work in the industry get hired into lawmaking positions. These things are a hallmark of corporatism, sometimes interpreted as being equal to fascism.
    edited February 2016
  • Reply 36 of 74
    thrang Posts: 1,011 member
    Who gets sued if the driverless car causes an accident or otherwise injures someone? The corporation that made it? The third party that bought it?
  • Reply 37 of 74
    jony0 Posts: 378 member
    maestro64 said:
    volcan said:
    In most situations self-driving cars would be far safer, primarily because they will not break the law. Human drivers can be pretty dangerous: speeding, illegal turns, texting and cellphone usage while driving, drunk driving, tailgating, etc., etc. Personally, I obey all traffic laws. Obnoxious people driving illegally and irresponsibly are really annoying. I would be willing to give up my enjoyment of driving a fine automobile just to get rid of all the reckless drivers out there, so bring on the self-driving cars.

    Think about these self-driving cars interacting with these unpredictable humans: who do you think will win? I'll give you an example: in most DUI accidents, who lives and who dies most of the time? Computers today still do not have the ability to deal with the unpredictable. Humans have the ability to respond and react to things they have not dealt with in the past. I have personally avoided bad situations when I realized others around me were not driving the way they should; it is subtle things, but it adds up, and self-driving cars lack this ability.

    I believe it has been well established that the main reason drunks' survival rate is higher is that they don't react at the onset of the impact and remain quite loose-limbed, thus ironically minimizing the fractures that their panic-stricken, tensed-up victims will sustain, including fatal ones where bone breaks into soft tissue.

    As for humans' predilection for unpredictability, that is the very liability these systems are ultimately meant to eliminate. In the meantime the algorithms will have to include many contingencies, of course; however, the very advantage of computers is being able to analyze a tremendous amount of data and quickly simulate different actions and possible outcomes without any stress. This is how and why they can now beat humans handily at chess: they have had time to see an incredible number of possibilities, assign weighted values to the outcomes, choose the best scenario, and then quickly and calmly execute, far outclassing any human response. I agree that while they may not always pick up some of the subtle behaviours that you or I might see in advance, they will undoubtedly act upon the threat, when it happens, a lot faster than you or I will, without panicking and without trying maneuvers that the car simply cannot accomplish in time, if at all. Also keep in mind that the trucks that are now allowed on highways in 2 states can communicate among each other, at least the Mercedes ones, so that when the lead truck puts on the brakes, the trailing trucks will react within milliseconds. I would expect that this will become a universal standard for all AI and should eliminate most of the horrific multi-vehicle pile-ups, another human frailty.
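    As a rough illustration of why that millisecond-level reaction matters, here is a small back-of-the-envelope comparison of the distance a vehicle covers before braking even begins. The numbers are illustrative assumptions on my part, not published Mercedes specs.

    ```python
    # Distance travelled before the brakes are applied, human vs. vehicle-to-vehicle link.
    # All figures below are assumptions for illustration only.
    speed_kmh = 100                 # assumed highway speed
    speed_ms = speed_kmh / 3.6      # ~27.8 m/s

    human_reaction_s = 1.5          # commonly cited driver perception-reaction time
    v2v_reaction_s = 0.05           # assumed radio + controller latency, tens of milliseconds

    for label, t in [("human driver", human_reaction_s), ("V2V-linked truck", v2v_reaction_s)]:
        print(f"{label:>17}: {speed_ms * t:5.1f} m before the brakes are even applied")
    # human driver:  ~41.7 m   V2V-linked truck:  ~1.4 m
    ```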

    I've always kept in mind the series of Public Service Announcements decades ago that depicted a few bonehead maneuvers and accompanying possible counters, with a voiceover and slogan preaching: "Watch out for the other guy." I expect that will also be one of the main mantras guiding the AI teams.

  • Reply 38 of 74
    thrang said:
    Who gets sued if the driverless car causes an accident or otherwise injures someone? The corporation that made it? The third party that bought it?
    As I wrote, it may be that each car will be a separate LLC, which would limit liability to the individual car (which sounds strange, but could be entirely workable since the law will now recognize the vehicle as a legal "person").
    edited February 2016
  • Reply 39 of 74
    jfc1138 Posts: 3,090 member
    JinTech said:
    Does this mean that a drivers license is not required to operate these vehicles?
    It has zero to do with that. This was about federal design guidelines/standards for safety: position of the brake pedals in relation to where "the driver" would be seated, position of the rear-view mirror in relation to where "the driver" would be seated, etc., all those places in the design standards where "the driver" is referenced for position or accessibility purposes. In the case where "the driver" is a set of chips, these guidelines/standards had to be modified or officially interpreted in light of the new type of "driver".

    Other articles on this were far clearer about the purpose.
    "Google wanted to know, for example, if its autonomous vehicles had to abide by a rule that requires vehicles to have an “occupant seat for the driver”.

    The government agency responded that, because it interprets the term “driver” as the self-driving system, “the ‘driver’ in this provision would not need an occupant seat”."..."Although the NHTSA said it agreed the software was the driver in a Google car, it also said it had no test to evaluate whether the software was a good one. It also said the company would have to work around federal rules requiring cars to have basic safety features – like brake pedals. This could be done by having rules changed or by Google petitioning for an exemption, the agency said.

    “Those standards were drafted at a time when it was reasonable to assume that all motor vehicles would have a steering wheel, accelerator pedal, and brake pedal, almost always located at the front left seating position, and that all vehicles would be operated by a human driver,” the government said."

    http://www.theguardian.com/technology/2016/feb/09/google-computers-self-driving-cars-human

    edited February 2016
  • Reply 40 of 74
    volcan Posts: 1,799 member
    thrang said:
    Who gets sued if the driverless car causes an accident or otherwise injures someone? The corporation that made it? The third party that bought it?
    In the beginning probably no one will be sued, except perhaps the insurance company. In order to further the perfection of self-driving systems, the manufacturers will need to have indemnity, because otherwise the first time there is an accident the injury attorneys will sue the manufacturer to the point that it will no longer be able to continue making self-driving cars, and the technology will never be developed.