Apple's self-driving cars disengage from autonomous mode about once per mile, highest rate...


Comments

  • Reply 21 of 51
    “From December 2018 through November 2019?”  Have we gotten so technically advanced that we can predict future statistics or did I sleep through another year?  I hate when that happens. 
  • Reply 22 of 51
This is a story that lacks a substantial amount of background information. The DMV only asks companies to self-report the number of disengagements; there is no information on how that data is audited or verified by third parties, etc.
    edited February 2019
  • Reply 23 of 51
metrix Posts: 256 (member)
    If I didn't want other companies poaching my engineers this is exactly what I would report, our team is terrible. 
  • Reply 24 of 51
Latko Posts: 398 (member)
    It's tenuous to draw conclusions on best/worst performance from disengagement figures - since there is no standard for when to disengage and companies like Apple (and some of the other brands with high disengagements) are going to be much more protective of their brand.

That said, I wouldn't expect Apple to be leading the pack, since other competitors have been working in this space for longer.

But when reading disengagement figures, keep in mind the number and seriousness of incidents. Apple has had just two, both not at fault (including one where the AI wasn't at play). Now contrast this with companies such as Uber, where the AI didn't detect a pedestrian, resulting in a fatality.


The whole reason for Apple being in this arena (at the cost/investment of the Apple customer) is that (coming from an IT perspective) they’d improve on the car industry, so car companies would be queueing to partner with them. Well, that’s just a hilarious idea for several reasons, performance being just one of them.
    edited February 2019
  • Reply 25 of 51
hentaiboy Posts: 1,252 (member)
    “On the other end of the spectrum was Google's Waymo, which managed an impressive 0.09 disengagements per 1,000 miles, or one disengagement every 11,154 miles.“

    That’s a seriously impressive stat if accurate.
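For context on that stat, here is a minimal sketch (using only the figures quoted in the article, nothing else) of how "disengagements per 1,000 miles" converts to "miles per disengagement":

```python
# Convert between the two ways disengagement rates are quoted.
# The 0.09 and 11,154 figures are the ones cited in the article.

def miles_per_disengagement(rate_per_1000_miles: float) -> float:
    """Average miles driven between disengagements."""
    return 1000.0 / rate_per_1000_miles

def rate_per_1000_miles(miles_between: float) -> float:
    """Disengagement rate per 1,000 miles driven."""
    return 1000.0 / miles_between

# Waymo's quoted 0.09 per 1,000 miles implies roughly one
# disengagement every ~11,111 miles; the article's 11,154 figure
# suggests the unrounded rate was closer to 0.0897.
waymo_miles = miles_per_disengagement(0.09)
implied_rate = rate_per_1000_miles(11_154)
```

The small gap between ~11,111 and 11,154 is just rounding in the quoted 0.09 figure.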
edited February 2019
  • Reply 26 of 51
larryjw Posts: 1,036 (member)
    This is not a meaningful statistic.  It’s dependent on what phase of training the system is in and the training theory under which the data is being compiled. 
  • Reply 27 of 51
    Back to the future (November 2019?).
  • Reply 28 of 51
    jungmark said:
    Since we’re still a few years away from full automation, it’s much ado about nothing. 
“full (or fool) automation” is indeed a few years away, and always a few years away ...
  • Reply 29 of 51
gatorguy Posts: 24,622 (member)
    MplsP said:
Waiting for the slew of posts calling this story troll-bait, uninformed, etc...

    I have no idea how accurate this is, but given the fact that Apple is later to the self-driving car game than other players, it would not be surprising if it's true. They're just further back on the development curve.
    It's completely accurate, coming from mandatory reports filed with the DMV. 
  • Reply 30 of 51
gatorguy Posts: 24,622 (member)
    rcfa said:
    There are two questions to be asked:

    b) are there uniform standards for disengagement, or is Apple simply more careful during testing to avoid negative press and/or keep competition in the dark on their progress?
    Yes there are uniform standards set by the State of California, and the reporting in accordance with them is highly detailed. 
    https://www.dmv.ca.gov/portal/wcm/connect/d48f347b-8815-458e-9df2-5ded9f208e9e/adopted_txt.pdf?MOD=AJPERES
    https://www.dmv.ca.gov/portal/wcm/connect/0b712ccd-1e7e-41eb-84fc-f16af0ce6265/ol311r.pdf?MOD=AJPERES&CVID=
edited February 2019
  • Reply 31 of 51
    Apple official response: "You are all DRIVING it WRONG". Muh-hahahahaha. :joy: 
  • Reply 32 of 51
mike1 Posts: 3,433 (member)
    First question I would ask is what are the criteria for disengagement? I have no idea what they are, but if Apple's criteria were less forgiving, then that could be a sign of them trying to reach a higher level.
  • Reply 33 of 51
radarthekat Posts: 3,901 (moderator)
    k2kw said:
    jdw said:
    Apple can't even get Siri to work properly, so this news comes as no surprise at all.  It's very disheartening.
    When your Apple Car Crashes Siri will display a list of body shops ... 2000 miles away.
    I don’t love all your posts, but I do love this one.  Very humorous.  
  • Reply 34 of 51
I love how the media idiotically conflates fewer disengagements with greater safety. A low rate may just mean the car is racking up miles and not learning anything. For the fastest development of AI you actually need new events it can learn from, not repetitions of things it already knows how to do.
  • Reply 35 of 51
MacPro Posts: 19,822 (member)
    roake said:
    Very conservative parameters would also result in frequent disengagements.
    Exactly, I was just going to post 'Or the safest system so far' but you said it another way.  It just could be the other systems take way too many risks and trust the AI too much at this stage of the game.  Apple's AI is still learning and I'd guess will end up the safest (and not share your private info ... haha).
edited February 2019
  • Reply 36 of 51
lkrupp Posts: 10,557 (member)
    If I had made my decisions about investing in AAPL based on the constant, droning narrative of failure I read on AI and other Apple centric tech blogs I wouldn’t have invested at all. Thankfully I didn’t take any of that into consideration and now have increased the value of my portfolio because of the AAPL factor in it. It seems the real world is more like me. I wonder if any of the AI staff have AAPL in their portfolios and if so, why?
  • Reply 37 of 51
lkrupp Posts: 10,557 (member)
    Why was Tesla's data omitted?
Because Tesla’s Autopilot is NOT an autonomous system. 
  • Reply 38 of 51
    gatorguy said:
    rcfa said:
    There are two questions to be asked:

    b) are there uniform standards for disengagement, or is Apple simply more careful during testing to avoid negative press and/or keep competition in the dark on their progress?
    Yes there are uniform standards set by the State of California, and the reporting in accordance with them is highly detailed. 
    https://www.dmv.ca.gov/portal/wcm/connect/d48f347b-8815-458e-9df2-5ded9f208e9e/adopted_txt.pdf?MOD=AJPERES
    https://www.dmv.ca.gov/portal/wcm/connect/0b712ccd-1e7e-41eb-84fc-f16af0ce6265/ol311r.pdf?MOD=AJPERES&CVID=
    Thanks for sharing that @gatorguy.

I was going to copy and paste the text of the relevant section (227.46), but I'm PDF challenged on my Windows PC.  I'll try later on my Mac.

  • Reply 39 of 51
gatorguy Posts: 24,622 (member)
    MacPro said:
    roake said:
    Very conservative parameters would also result in frequent disengagements.
    Exactly, I was just going to post 'Or the safest system so far' but you said it another way.  It just could be the other systems take way too many risks and trust the AI too much at this stage of the game.  Apple's AI is still learning and I'd guess will end up the safest (and not share your private info ... haha).
Nope. The criteria are set by the State of California, and they all report under the same set of rules. 
  • Reply 40 of 51
    gatorguy said:
    MplsP said:
Waiting for the slew of posts calling this story troll-bait, uninformed, etc...

    I have no idea how accurate this is, but given the fact that Apple is later to the self-driving car game than other players, it would not be surprising if it's true. They're just further back on the development curve.
    It's completely accurate, coming from mandatory reports filed with the DMV. 
    I suspect that Apple is interpreting the regulation very, very differently from everyone else.

    Look at the summary of the raw data in the last table on this page:  https://thelastdriverlicenseholder.com/2019/02/12/disengagement-reports-2018-preliminary-results/

    Apple reported 69,510 disengagements.  The other companies, combined, reported 4,040.

Given that the company has to provide a written description of each disengagement, my theory is that most companies use a restrictive definition of "when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle." (emphasis mine)  I theorize that Apple set up an automated reporting process to submit a report every time the driver took control for any reason--essentially "spamming" the system with tens of thousands of reports.  In other words, I expect the reports filed by Apple compared to those filed by, for example, GM Cruise (86 reports with 3 times the cars and 5 times the miles as Apple) are, forgive me, apples to oranges.

    Frankly, I expect Apple is the one reporting "wrong" by being too loose with its definition of "failure" and when "safe operations requires immediate human intervention."  But I suppose it's theoretically possible that Apple is both the most fastidious about reporting and the absolute worst in terms of "failures" and safety.  I just doubt it.
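The apples-to-oranges point is easy to see numerically. A rough sketch, using the report counts above but with illustrative mileage assumptions (the real mileages are in the DMV filings and are not quoted here):

```python
# Disengagements per 1,000 miles from raw report counts.
# The report counts (69,510 and 86) come from the comment above; the
# mileage figures are ASSUMPTIONS for illustration only, chosen to be
# consistent with "about once per mile" for Apple and with Cruise
# driving roughly 5x Apple's miles.

def rate_per_1000_miles(disengagements: int, miles: float) -> float:
    return disengagements / miles * 1000.0

apple_miles = 70_000            # assumed
cruise_miles = 5 * apple_miles  # assumed, per "5 times the miles"

apple_rate = rate_per_1000_miles(69_510, apple_miles)   # ~993 per 1,000 mi
cruise_rate = rate_per_1000_miles(86, cruise_miles)     # ~0.25 per 1,000 mi
```

A gap of more than three orders of magnitude between two permit holders reporting under the same regulation is what makes a definitional difference the more plausible explanation than a capability difference.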


edited February 2019