AI 'drivers' will gain same legal status as human drivers in autonomous vehicles, NHTSA rules

Posted in Future Apple Hardware
The artificial intelligence in a self-driving vehicle can be treated as the "driver" from a legal standpoint, the U.S. National Highway Traffic Safety Administration said in a decision that could set an important precedent for Apple's own self-driving car.

The position was announced in a letter to Google, published to the NHTSA's website, Reuters reported on Wednesday. The missive was written in response to a design submitted by Google on Nov. 12, which specified a vehicle that has "no need for a human driver."

The chief counsel for the NHTSA informed Google that the agency "will interpret 'driver' in the context of Google's described motor vehicle design as referring to the (self-driving system), and not to any of the vehicle occupants." Current vehicle regulations are based on the assumption of human control, which has created problems in establishing a legal framework for self-driving technology.

In January, U.S. Transportation Secretary Anthony Foxx announced that the federal government would be bending its interpretation of the rules in some cases to accelerate self-driving car development. "Best practices" guidelines should be established in the first half of the year, and the NHTSA will exempt up to 2,500 vehicles from safety standards for the sake of testing.

Although officially still under wraps, Apple is widely believed to be working on an electric car for launch in 2019 or 2020. While the first model may potentially lack self-driving systems, the company is at least thought to be working on the concept for subsequent vehicles.

Comments

  • Reply 1 of 74
    That's the Federal government for you. They pass laws, then decide those laws are inconvenient, yet don't rescind them.
  • Reply 2 of 74
    JinTech Posts: 1,022 member
    Does this mean that a driver's license is not required to operate these vehicles?
  • Reply 3 of 74
    jfc1138 Posts: 3,090 member
    My understanding is that this referred to design standards where "the driver" is specified for things like rearview mirror height, angle, and position, accessibility of the brake pedals, and that sort of thing. A CPU's physical position doesn't link to those sorts of statutory design requirements.
    edited February 2016
  • Reply 4 of 74
    JinTech said:
    Does this mean that a driver's license is not required to operate these vehicles?
    If you were a non-vehicle-operating passenger in an autonomous vehicle that you did not own, it'd be no different from taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.
    edited February 2016
  • Reply 5 of 74
    zimmie Posts: 651 member
    JinTech said:
    Does this mean that a driver's license is not required to operate these vehicles?
    If you were a non-vehicle-operating passenger in an autonomous vehicle that you did not own, it'd be no different from taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.
    More interesting to me is the question of whether you would need insurance listing you as the driver, or the car. It's entirely plausible that a self-driving car could talk to insurance companies and refuse to engage its self-driving mode unless it had a valid insurance check-in for the day, or the month, or whatever; a rough sketch of such a gate appears below. That deals with the issue of manufacturer liability rather nicely, from the manufacturer's point of view. The problem is that various areas don't require insurance. Private property is the one most people think about, but countries where cars are still rare often don't have much of a legal framework for their use outside of cities. It also doesn't address the potential for a bond in lieu of insurance in regions where that is allowed.

    There's also the question of how such an insurance check-in would work. I can't imagine any two insurance companies agreeing on a protocol. They're worse than banks, and we've seen the nonsense involved in dealing with banks during the Apple Pay rollout.
    edited February 2016
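    A minimal sketch of the insurance check-in gate described above, assuming a hypothetical insurer endpoint (fetch_coverage_token and the 24-hour grant are invented for the example; a real car would speak whatever protocol the insurers eventually agree on):

    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    @dataclass
    class CoverageToken:
        policy_id: str
        valid_until: datetime

    def fetch_coverage_token(policy_id: str) -> Optional[CoverageToken]:
        # Stand-in for a call to the insurer's API; here we simply
        # simulate a successful 24-hour grant of coverage.
        return CoverageToken(policy_id, datetime.now() + timedelta(hours=24))

    def may_engage_autonomy(token: Optional[CoverageToken]) -> bool:
        # The car refuses to enter self-driving mode without a live token.
        return token is not None and datetime.now() < token.valid_until

    token = fetch_coverage_token("POL-12345")
    if may_engage_autonomy(token):
        print("Self-driving mode enabled until", token.valid_until)
    else:
        print("Self-driving mode locked out: no valid insurance check-in")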
  • Reply 6 of 74
    melgross Posts: 33,510 member
    JinTech said:
    Does this mean that a driver's license is not required to operate these vehicles?
    If you were a non-vehicle-operating passenger in an autonomous vehicle that you did not own, it'd be no different from taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.
    This is very confusing. I'd need to read the whole thing to get a better idea of what they mean here. But if I own the car, and I'm not responsible for an accident, then who is? We can't take the car to court. We can't fine it. Can we take the license away? What license, and how would that work, as every car from that manufacturer would presumably have the same software and hardware combo? So would all those cars need to get off the road? This is a real can of worms. Does the manufacturer have to pay for the accident? What about insurance? Normally we incur penalties if we have an accident. How would that work here?

    It seems to me that a lot needs to be worked out.
    edited February 2016
  • Reply 7 of 74
    maestro64 Posts: 5,043 member
    JinTech said:
    Does this mean that a driver's license is not required to operate these vehicles?
    If you were a non-vehicle-operating passenger in an autonomous vehicle that you did not own, it'd be no different from taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.


    Interesting point I did not think about. Insurance today covers the car and the driver; if you are not driving, you are not required to have insurance. Also, GM and other car companies have jumped on the DRM bandwagon, claiming that you do not own the software which operates the car and so are not allowed to modify it; the software and control systems in the car are only licensed to you for use, not owned by you. If you do not own the control systems, then legally you should not be required to insure them. Since you are not driving the car, you are not liable for its actions, and since you can bet Apple and Google will not let you own the control system, you should not have to insure the car for liability for hitting someone else. Now, if the car is damaged at no fault of yours or the control system's, you would still want to insure against that situation, especially if you still owe money on the car.

    Since the NHTSA is saying the car is a legal person, just like a corporation is considered a legal person, this means you can sue your car, or the one who controls the car.

    Going back to my question, which no one seems to be willing to answer: will the car kill you to save Bambi, and if so, who pays?

    edited February 2016
  • Reply 8 of 74
    volcan Posts: 1,799 member
    In most situations self-driving cars would be far safer, primarily because they will not break the law. Human drivers can be pretty dangerous: speeding, illegal turns, texting and cellphone use while driving, drunk driving, tailgating, etc. Personally, I obey all traffic laws. Obnoxious people driving illegally and irresponsibly are really annoying. I would be willing to give up my enjoyment of driving a fine automobile just to get rid of all the reckless drivers out there, so bring on the self-driving cars.
  • Reply 9 of 74
    lkrupp Posts: 10,557 member
    I can see this technology being used by the livery industry to eliminate the cost of human drivers. It might also work with delivery services: the pizza car drives up to your house, you swipe your credit card or NFC device, a door opens, and you take out your order. Amazon delivery might work the same way, IF there's somebody around to accept it.
  • Reply 10 of 74
    So when Google’s system is responsible for, say, five or more crashes, total, will the AI have its license revoked?

    Or will it just continue to be allowed to endanger people and remain in vehicles?

    I’ll bet the latter.
  • Reply 11 of 74
    kpluck Posts: 500 member
    volcan said:
    In most situations self-driving cars would be far safer, primarily because they will not break the law. Human drivers can be pretty dangerous: speeding, illegal turns, texting and cellphone use while driving, drunk driving, tailgating, etc. Personally, I obey all traffic laws. Obnoxious people driving illegally and irresponsibly are really annoying. I would be willing to give up my enjoyment of driving a fine automobile just to get rid of all the reckless drivers out there, so bring on the self-driving cars.
    I fully support you not driving. Based on what you said, you haven't a clue what really makes a driver dangerous.

    -kpluck
  • Reply 12 of 74
    So if I'm in an accident, the manufacturer is automatically guilty.

    This is the equivalent of Boeing being responsible for a crash caused by autopilot error rather than by the pilot in command.
  • Reply 13 of 74
    Ha ha, I want to see whether an autonomous auto can pass a California or British driver's license test!

    And I'm also dying to know which companies will offer insurance on them, and at what cost.
  • Reply 14 of 74
    So when Google’s system is responsible for, say, five or more crashes, total, will the AI have its license revoked?

    Or will it just continue to be allowed to endanger people and remain in vehicles?

    I’ll bet the latter.
    In 2010, more than 30,000 people died and 2.24 million were injured in 5.4 million auto accidents in the US.

    (https://en.wikipedia.org/wiki/List_of_motor_vehicle_deaths_in_U.S._by_year)

    So, if the threshold for licensing is "better than a human", your estimate is off by six orders of magnitude.

    A reasonable metric for licensing would be accidents per driven mile, and Google's cars already beat humans by that metric; a rough back-of-the-envelope comparison is sketched below. Granted, Google has limited its autonomous cars to relatively benign environments, and I suspect licensing will restrict operation to proven operating envelopes. But the rate at which the technology is improving will expand that envelope rapidly over the next decade. I think the outcome of the experiment will eventually be humans losing their right to drive, hopefully after losing their desire to.
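    As a rough back-of-the-envelope check of that metric, using the 2010 accident count cited above plus an approximate US vehicle-miles-traveled total of about 3 trillion miles (the mileage figure is an outside approximation, not from the comment):

    # 2010 US figures: 5.4 million accidents (cited above); total vehicle
    # miles traveled assumed to be roughly 3 trillion (approximation).
    accidents_2010 = 5_400_000
    miles_2010 = 3_000_000_000_000

    per_million_miles = accidents_2010 / (miles_2010 / 1_000_000)
    print(f"Human rate: ~{per_million_miles:.1f} accidents per million miles")

    # A "five crashes total" bar versus one year of human crashes:
    print(f"Ratio: ~{accidents_2010 / 5:,.0f}x")  # ~1,080,000x: six orders of magnitude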
  • Reply 15 of 74
    No thanks. I'd rather drive my own car and enjoy the experience. People invent newer and newer technology that always seems to phase out the need for anything human. Who needs people at all, eventually, when you can have a computer do it for you? Such a developer mindset.
  • Reply 16 of 74
    No thanks. I'd rather drive my own car and enjoy the experience. People invent newer and newer technology that always seems to phase out the need for anything human. Who needs people at all, eventually, when you can have a computer do it for you? Such a developer mindset.
    If humans are more dangerous behind the wheel than computers, the insurance companies will know it and adjust their rates accordingly. The cost of staying behind the wheel will rise until you give up. It has nothing to do with a developer's mindset; it's pure economics.
  • Reply 17 of 74
    linkman Posts: 1,035 member
    Right now, if a human is driving in the USA and breaks a law, there is criminal liability: an individual is held responsible. Extend this concept to AI driving, and either one of the riders is now responsible, or someone who owns, operates, designed, or engineered the software can go to jail. I think someone should definitely be held responsible for poorly implemented vehicles -- I don't want to get caught behind 3-mile traffic jams every day because some overly cautious AI-operated cars are driving slower than the minimum allowed speed.

    This basically happened a few months ago in California: http://www.theguardian.com/technology/2015/nov/13/google-self-driving-car-pulled-over-driving-too-slowly
  • Reply 18 of 74
    jony0 Posts: 378 member
    maestro64 said:
    If you were a non-vehicle-operating passenger in an autonomous vehicle that you did not own, it'd be no different from taking a cab.

    If you owned the vehicle, presumably you'd still need insurance.

    Going back to my question, which no one seems to be willing to answer: will the car kill you to save Bambi, and if so, who pays?

    I trust that the AI manufacturers and the NHTSA will run through a lot of scenarios and testing before certifying every single AI car design. To answer your question in particular, I would imagine the obstacle-avoidance algorithms would have to, and will be able to, detect whether an obstacle is human or otherwise by assessing its size, speed, and general physical attributes; check whether the other lane carries oncoming traffic and is free for going around the obstacle; immediately apply all available braking; and swerve only if it is safe to do so (a rough sketch of that decision logic follows below).

    As for Bambi specifically, I live in deer country and am always mindful of my dad's early advice: slam on the brakes immediately, brace yourself for impact, and stay in your lane unless you are absolutely sure. There have been too many stories here of people going into the ditch or into oncoming traffic and killing themselves to avoid wildlife. I would certainly hope this would also be the chosen algorithm.
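    Purely as an illustration of the decision logic described above, and not any manufacturer's actual algorithm, a minimal sketch might look like this (the Obstacle class and avoidance_action function are invented for the example):

    from dataclasses import dataclass

    @dataclass
    class Obstacle:
        kind: str  # "human" or "animal", as classified from size, speed, etc.

    def avoidance_action(obstacle: Obstacle, oncoming_lane_clear: bool) -> str:
        """Pick a maneuver: brake always, swerve only when it is safe."""
        actions = ["apply full braking"]
        if oncoming_lane_clear:
            # Go around the obstacle only when the other lane is verified free.
            actions.append("steer around obstacle")
        elif obstacle.kind == "animal":
            # The stay-in-your-lane rule for wildlife: braking and bracing
            # beats swerving into a ditch or into oncoming traffic.
            actions.append("hold lane and brace for impact")
        else:
            actions.append("hold lane and continue maximum braking")
        return ", then ".join(actions)

    print(avoidance_action(Obstacle("animal"), oncoming_lane_clear=False))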
  • Reply 19 of 74
    volcan Posts: 1,799 member
    kpluck said:

    I fully support you not driving. Based on what you said, you haven't a clue what really makes a driver dangerous.

    -kpluck
    Please do explain what REALLY makes a driver dangerous.
  • Reply 20 of 74
    tenly Posts: 710 member
    melgross said:
    What license, and how would that work, as every car from that manufacturer would presumably have the same software and hardware combo? So would all those cars need to get off the road?
    I think that saying every car from that manufacturer is the same is an over-simplification, roughly as inexact as saying that all humans are the same because we are all equipped with the same sensors, i.e. hands, eyes, ears, etc. With the cars, the brains might be identical, but they might not be if different firmware versions are available. As for the other sensors, they could be different models, or have different firmware, tolerances, and wear or states of disrepair (maybe some mud partially obstructing an optical sensor, or a piece of metal debris affecting another sensor). In any case, sensor placement would differ between a manufacturer's different models, and even within a specific year and model I don't think they could all be considered "exactly" the same.

    Of course, having said all that, I don't disagree at all with your conclusion that "a lot needs to be worked out!"