Apple's self-driving cars disengage from autonomous mode about once per mile, highest rate...


Comments

  • Reply 41 of 51
    MacPro said:
    roake said:
    Very conservative parameters would also result in frequent disengagements.
Exactly, I was just going to post 'Or the safest system so far,' but you said it another way. It could just be that the other systems take way too many risks and trust the AI too much at this stage of the game. Apple's AI is still learning, and I'd guess it will end up the safest (and not share your private info ... haha).
Even the opposite could be true; we just don't have the data. One of the criticisms of Waymo is that its cars are too cautious, e.g. turning left across traffic takes them forever, but it appears that when they do finally make that turn, they are ready to do it safely. It's easy to imagine an Uber car doing it quicker but asking the driver to take over halfway through the turn. Not sure which style Apple has.
  • Reply 42 of 51
    lkrupp said:
    Why was Tesla's data omitted?
Because Tesla's Autopilot is NOT an autonomous system.
    And they have zero cars registered with this California program that requires reporting.  https://thelastdriverlicenseholder.com/2019/02/12/disengagement-reports-2018-preliminary-results/
  • Reply 43 of 51
gatorguy Posts: 24,176 member
    mike1 said:
    First question I would ask is what are the criteria for disengagement? I have no idea what they are, but if Apple's criteria were less forgiving, then that could be a sign of them trying to reach a higher level.
A disengagement, according to the California DMV, is a situation in which an autonomous vehicle gets confused and has to hand control of the car over to a safety driver, or in which the safety driver overrules the car. Simple. A disengagement for reporting purposes is the same for everyone. Apple doesn't have "special rules" and isn't reporting differently than other licensed and regulated vehicle testers.

When the vehicle's autonomous driving program is turned off (disengaged), either by the program itself or by the driver, the action has to be documented in detail: weather and traffic conditions, the action the car was taking at the time, roadbed/surface condition, the street and type of neighborhood, how long it took the driver to react, etc. Every approved company must follow the same mandatory reporting requirements. They can't just say, whoops, it woulda been OK after all, so we just won't report that we disengaged the autonomy program on this one.

Refer to this vehicle code, starting around page 11.
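To make that concrete, here's a minimal sketch of the kind of record every disengagement has to generate, based on the fields listed above. Purely illustrative: the field names are mine, not the DMV's official schema.

```python
from dataclasses import dataclass

@dataclass
class DisengagementReport:
    """Hypothetical record mirroring the fields the DMV requires for
    every disengagement (names are illustrative, not official)."""
    initiated_by: str            # "system" or "safety_driver"
    weather: str                 # e.g. "clear", "rain"
    traffic_conditions: str      # e.g. "light", "heavy"
    vehicle_action: str          # what the car was doing at the time
    road_surface: str            # roadbed/surface condition
    location_type: str           # street and type of neighborhood
    driver_reaction_secs: float  # how long the driver took to react

# Every disengagement, however initiated, produces one record:
report = DisengagementReport(
    initiated_by="safety_driver",
    weather="clear",
    traffic_conditions="moderate",
    vehicle_action="unprotected left turn",
    road_surface="dry asphalt",
    location_type="suburban arterial",
    driver_reaction_secs=0.8,
)
print(report)
```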
    edited February 2019
  • Reply 44 of 51
volcan Posts: 1,799 member
    gatorguy said:
A disengagement is a situation in which an autonomous vehicle gets confused and has to hand control of the car over to a safety driver, or in which the safety driver overrules the car. Simple. A disengagement for reporting purposes is the same for everyone. Apple doesn't have "special rules" and isn't reporting differently than other licensed and regulated vehicle testers.
Exactly! If the other licensed testing companies were discovered not to be reporting accurately, they would risk losing their license from the DMV. But I think you can always report more disengagements, by including those that were the result of a safety-driver overrule; you absolutely can't report fewer autonomous disengagements without breaking the rules.
  • Reply 45 of 51
    Why was Tesla's data omitted?
Apparently they do have a license, but they haven't done any testing of L4 cars in CA in the last couple of years.
  • Reply 46 of 51
gatorguy Posts: 24,176 member
    volcan said:
Exactly! If the other licensed testing companies were discovered not to be reporting accurately, they would risk losing their license from the DMV. But I think you can always report more disengagements, by including those that were the result of a safety-driver overrule; you absolutely can't report fewer autonomous disengagements without breaking the rules.
It doesn't matter whether the program is exited by the vehicle itself or by the safety driver. EVERY time the autonomous system is disengaged, for any reason, it is mandatory that there be a report on the circumstances and an explanation for it.
  • Reply 47 of 51
volcan Posts: 1,799 member
    gatorguy said:
It doesn't matter whether the program is exited by the vehicle itself or by the safety driver. EVERY time the autonomous system is disengaged, for any reason, it is mandatory that there be a report on the circumstances and an explanation for it.
I agree; my point was to refute the claims of some posters here that other companies have more relaxed standards. The autonomous car doesn't always know it is misbehaving, especially in non-critical conditions, and that misbehavior may be overlooked by the safety driver. While that may result in fewer disengagements being reported, Apple's higher count could be a function of its stricter criteria; but under no circumstances can automatic disengagements be under-reported.
  • Reply 48 of 51
    You nailed it! AIs are NOT created equal. Some are much better than others. When I own an autonomous vehicle, I want Waymo to drive it, not Apple and certainly not Uber.
  • Reply 49 of 51
    gatorguy said:
It doesn't matter whether the program is exited by the vehicle itself or by the safety driver. EVERY time the autonomous system is disengaged, for any reason, it is mandatory that there be a report on the circumstances and an explanation for it.
Nope. That's not what the regulation says. It doesn't say "every time it's disengaged for any reason." It specifically says when disengaged because of a "failure" or because safety "required" the human driver to take control "immediately." There is a huge grey area that I expect other companies are taking advantage of, whereas Apple is reporting (as clarified by the new article here on AI) that it logs every time the human takes control, for any reason, and its system currently is very conservative about handing off control. I'm sure Apple's systems aren't the most mature and fault-tolerant, but I doubt they are hundreds or thousands of times worse than every other company's offering (as the data would suggest if you're correct).
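To see how much that grey area can matter, here's a hypothetical sketch: the same takeover log counted under a narrow reading of the regulation versus a conservative report-everything policy. The event categories and names are made up for illustration, not taken from any company's actual logs.

```python
# One drive's takeover events (hypothetical). "safety_critical" marks
# takeovers where safety required the driver to take control immediately.
takeovers = [
    {"cause": "perception_failure", "safety_critical": True},
    {"cause": "driver_discretion",  "safety_critical": False},
    {"cause": "planner_timeout",    "safety_critical": False},
    {"cause": "system_fault",       "safety_critical": True},
]

# Narrow reading: report only a "failure" of the system, or a takeover
# that safety "required" immediately.
narrow = [t for t in takeovers
          if "failure" in t["cause"] or t["safety_critical"]]

# Conservative policy: report every takeover, whatever the reason.
conservative = takeovers

print(len(narrow), "reported under the narrow reading")           # 2
print(len(conservative), "reported under the conservative policy")  # 4
```

Same drive, same events, double the reported count, which is the point: the raw disengagement totals aren't comparable unless everyone applies the same criteria.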
    edited February 2019
  • Reply 50 of 51
    volcan said:
I agree; my point was to refute the claims of some posters here that other companies have more relaxed standards. The autonomous car doesn't always know it is misbehaving, especially in non-critical conditions, and that misbehavior may be overlooked by the safety driver. While that may result in fewer disengagements being reported, Apple's higher count could be a function of its stricter criteria; but under no circumstances can automatic disengagements be under-reported.
Reporting is a manual process of filling out a PDF. Of course there are "circumstances" where automatic disengagements are under-reported. California isn't parsing the raw data generated by the autonomous systems. Compliance is a very discretionary process, as with all compliance processes.
  • Reply 51 of 51
volcan Posts: 1,799 member
Reporting is a manual process of filling out a PDF. Of course there are "circumstances" where automatic disengagements are under-reported. California isn't parsing the raw data generated by the autonomous systems. Compliance is a very discretionary process, as with all compliance processes.
I don't know anything about compliance in autonomous car testing, but being employed in the medical manufacturing industry, where we hold lots of compliance licenses, I can say they are all audited regularly. If your documentation throws up a red flag, they will dig deeper.