Apple lobbies California DMV to keep self-driving car plans out of public eye

Posted in Future Apple Hardware
A letter from Apple to the California Department of Motor Vehicles seeks alterations to test protocols and reporting requirements for automated vehicle testing, and declares that the company is seeking to increase public acceptance of self-driving cars.

The letter from Apple, first spotted by Reuters, was written by Director of Product Integrity Steve Kenner. It leads by seeking clarification of what the DMV calls "disengagement reporting."

"Apple believes that public acceptance is essential to the advancement of automated vehicles. Access to transparent and intuitive data on the safety of the vehicles being tested will be central to gaining public acceptance," writes Kenner. "However, the current and proposed disengagement reporting requirements do not achieve this result."

Apple is requesting that disengagement reporting be tightened up to exclude planned tests, the end of testing, operational constraints where the safety driver disengages the system, and other discretionary decisions by the safety driver not made to prevent an immediate accident. Apple also requests that it not be required to engage in "speculation about future events that have not occurred" when describing the type of incident that would have happened without the disengagement.

According to the letter, a disengagement should only be defined as an unexpected event or failure that requires the safety driver to take control of the vehicle in order to prevent a crash or traffic violation.

Apple also requests clarification of language surrounding a safety driver's role during testing and development, plus the removal of duplicate language, appearing in multiple sections of the test authorization, that excludes commercial vehicles from testing.

All of Apple's requests seem oriented toward reporting less about the autonomous system's interactions, both to California and to the population at large. This doesn't appear to represent any ulterior motive on Apple's part, beyond not giving the competition a leg up on the company's plans, or information on its technology, through public filings required by law, such as this letter.



On April 21, a report revealed snippets of Apple's California Department of Motor Vehicles self-driving car application, offering insight into the company's autonomous vehicle project. Apple's full application was revealed a bit later that day, and incorporates a copy of the testing process it used to certify the six drivers who will pilot three modified 2015 Lexus RX450h SUVs, which have since been spotted on the road.

Included in the informational packet are instruction sets, training goals and diagrams for each of the following tests: low speed driving, high speed driving, tight U-turns, sudden steering input, sudden acceleration, sudden braking and lane change. Three drivers have worked at Apple's Special Projects Group for two years as hardware and software engineers, according to their LinkedIn profiles.

"Pilots" are expected to pass seven rudimentary tests prior to taking the testbed out for data-gathering drives. Tests listed include low speed and high speed driving, as well as drive system intervention including tight U-turns, sudden steering input, sudden acceleration and sudden braking. Drivers also need to take action in the case of a faulty software lane change request, called a "conflicting turn signal and action."

Apple expects its test system to be capable of maintaining in-lane speeds of at least 65 miles per hour, changing lanes automatically, braking when required, and performing other basic functions.

Apple has long been rumored to be working on autonomous vehicle technology under the "Project Titan" aegis. The company reportedly abandoned efforts to create a full car from bumper to bumper in late 2016 when former project leader Steve Zadesky left Apple and handed the reins over to senior VP of Hardware Engineering Dan Riccio.

Project Titan was later transferred to longtime executive Bob Mansfield, who subsequently culled hundreds of employees and refocused the program on self-driving software and supporting hardware.

Comments

  • Reply 1 of 14
    Weird. The headline and the story seem to be in conflict in at least their tone. :/
  • Reply 2 of 14
    seafox Posts: 86
    Weird. The headline and the story seem to be in conflict in at least their tone. :/
    Welcome to journalism in the 21st Century.
  • Reply 3 of 14
    macseeker Posts: 439
    I wish AppleInsider would use another delivery service for document sharing instead of Scribd. I don't want to sign up for the Scribd service.
  • Reply 4 of 14
    SpamSandwich Posts: 31,190
    I doubt Apple will be given any special treatment in this regard.
  • Reply 5 of 14
    gatorguy Posts: 20,742
    Weird. The headline and the story seem to be in conflict in at least their tone. :/
    I agree. It doesn't help public transparency to hide software failures that require a human driver to take control, or to prevent that driver from surmising what would have happened if he/she had not intervened. I think it has less to do with not revealing stuff to competitors and more to do with not scaring the public with "disengagement reports" that can't be proven to lead to damage to life or property. 

    FWIW I think the rules as they stand do more for transparency, and help the public understand both the reliability and the safety of the autonomous systems currently being tested. If the public were to think companies were hiding failures, it would make them less trusting in those systems, not more.

    Leave the rules as they are now IMHO, at least until the public has become comfortable with believing self-driving tech is safe and dependable.  That's gonna take awhile. 
  • Reply 6 of 14
    Mike Wuerthele Posts: 4,634, administrator
    macseeker said:
    I wish AppleInsider would use another delivery service for document sharing instead of Scribd. I don't want to sign up for the Scribd service.
    If you view the story from the homepage, the document should be displayed without registration.
  • Reply 7 of 14
    zoetmb Posts: 2,453
    If you read the specifics, they seem pretty reasonable (and I'm someone who does not favor driverless vehicles and who believes they will fail).  Apple's requests are as follows:

    • A disengagement should be defined as an unexpected event or failure that requires the safety driver to take control of the vehicle in order to prevent a crash or traffic violation.
    • A disengagement should not be reported for the following:
      ◦ Operational constraints where either the safety driver has been trained to disengage the system, or when the system detects the constraint and disengages automatically. For example, a system that requires the safety driver to navigate through a construction zone.
      ◦ System errors or failures. For example, a software bug or sensor dropout that does not affect the safe operation of the system.
      ◦ Discretionary decisions made by the safety driver. For example, when the safety driver perceives a vehicle is approaching too quickly and opts to disengage the system.
      ◦ Any tests that are planned to result in a disengagement.
      ◦ The end of a test or experiment.
    • Additionally, the proposed requirement in §227.50(b)(3)(B)(vi) to describe the type of incident that would have happened without the disengagement should be removed. It requires speculation about future events that have not occurred.

    And they're not requesting special treatment.  This would change reporting for every company testing such vehicles.   This all seems intended to avoid scaring the public (or politicians) with press like, "the testers had to disengage the auto driving system 536 times to avoid having an accident," and then some idiot newscaster making some wiseass remark about how his Mac or iPhone failed, so how could this possibly work.

    I've always thought that as soon as there are a few serious accidents with such vehicles, even if they happen at a lower rate of incidence than with conventional cars, people will freak out, politicians will get involved because it's an easy issue for idiots to understand and for politicians to express outrage and there will be communities that will either ban these vehicles or ban the use of the automated features, just as some locales ban the use of radar detectors.  And then it will be all over.  Also, I've never been convinced that macho drivers (of any gender) would want to use these systems, especially if the systems force one to drive under the speed limit. 

    On the other hand, if they force vehicles to keep a safe distance from each other and they have effective auto-braking systems at short distances, they could really reduce the number of accidents.   Almost every accident I've ever seen was because of tailgating, which the police rarely give tickets for.    This really annoys me because driving 5-10 miles over the speed limit on a highway is actually pretty safe, but tailgating isn't.
  • Reply 8 of 14
    I'm really dismayed at the lack of understanding of Apple's letter, and the future of autonomous driving.

    By 2025, legislatures across the country will pass legislation severely limiting sales of fossil fuel vehicles, with a new-sale phase-out target of 2030. Fossil fuel passenger vehicles will be outlawed (excepting historical and special-use vehicles) by 2035.

    Replacement vehicles will be electric and autonomous.  Autonomous driving is already demonstrating, at this early stage, that it is significantly safer than human drivers.

    The macho driver is going to have to find another outlet for his/her aggression.

    Futurists are forecasting that passenger vehicle ownership is going to die.  Auto manufacturers will offer transportation services, not ownership.

    Being legally blind (for the last 5 years), I can attest to the fact that using Uber (my primary mode of transportation) is less expensive than owning a car and much more convenient.

    Electric and autonomous are inevitable, so the doubters had better get used to the idea.
  • Reply 9 of 14
    macseeker said:
    I wish AppleInsider would use another delivery service for document sharing instead of Scribd. I don't want to sign up for the Scribd service.
    If you view the story from the homepage, the document should be displayed without registration.
    How about downloading the document without registration?  Not possible.
  • Reply 10 of 14
    zoetmb said:
    If you read the specifics, they seem pretty reasonable (and I'm someone who does not favor driverless vehicles and who believes they will fail).  Apple's requests are as follows:

    • A disengagement should be defined as an unexpected event or failure that requires the safety driver to take control of the vehicle in order to prevent a crash or traffic violation.
    • A disengagement should not be reported for the following:
      ◦ Operational constraints where either the safety driver has been trained to disengage the system, or when the system detects the constraint and disengages automatically. For example, a system that requires the safety driver to navigate through a construction zone.
      ◦ System errors or failures. For example, a software bug or sensor dropout that does not affect the safe operation of the system.
      ◦ Discretionary decisions made by the safety driver. For example, when the safety driver perceives a vehicle is approaching too quickly and opts to disengage the system.
      ◦ Any tests that are planned to result in a disengagement.
      ◦ The end of a test or experiment.
    • Additionally, the proposed requirement in §227.50(b)(3)(B)(vi) to describe the type of incident that would have happened without the disengagement should be removed. It requires speculation about future events that have not occurred.

    And they're not requesting special treatment.  This would change reporting for every company testing such vehicles.   This all seems intended to avoid scaring the public (or politicians) with press like, "the testers had to disengage the auto driving system 536 times to avoid having an accident," and then some idiot newscaster making some wiseass remark about how his Mac or iPhone failed, so how could this possibly work.

    I've always thought that as soon as there are a few serious accidents with such vehicles, even if they happen at a lower rate of incidence than with conventional cars, people will freak out, politicians will get involved because it's an easy issue for idiots to understand and for politicians to express outrage and there will be communities that will either ban these vehicles or ban the use of the automated features, just as some locales ban the use of radar detectors.  And then it will be all over.  Also, I've never been convinced that macho drivers (of any gender) would want to use these systems, especially if the systems force one to drive under the speed limit. 

    On the other hand, if they force vehicles to keep a safe distance from each other and they have effective auto-braking systems at short distances, they could really reduce the number of accidents.   Almost every accident I've ever seen was because of tailgating, which the police rarely give tickets for.    This really annoys me because driving 5-10 miles over the speed limit on a highway is actually pretty safe, but tailgating isn't.
    However, today 47 states have individually passed laws requiring the collection of phone data after a car accident to determine if the operator was text messaging. It's a worthwhile request that software logs and decision trees leading up to any accident or damage of property also be available to law enforcement.

    So politicians will *certainly* get involved. However, as a country, we've gone through these types of open reviews before, when we adopted the combustion engine and changed from relying on horses for transportation. There is nothing proprietary about developing human-assist software for use on publicly-funded, taxpayer-supported roads.

    When switching from horse and buggy to combustion engine technology, the federal government took a role in developing standards for vehicles. However, the individual states issued licenses based on locally agreed-upon standards and available funding (California often runs a tax deficit), and ultimately the local and city governments policed the actions a vehicle operator takes. There is no one-size-fits-all ideology behind developing software to be used on a national scale; otherwise, as a country, a driver's license would be issued by the federal government. It's wrong to tell government what it's supposed to do.

    As for these multiple competing standards, I see it differently. Driver Assist needs to be a NEW TYPE of license. It's as simple as a CDL versus a standard driver's license. A CDL operator's responsibilities are DIFFERENT from those of someone who drives a sedan, and responsibilities change with the advent of this technology-assistance software.

    Today the FAA has higher standards, and perhaps that's the best example to mirror. Someone who is licensed to fly a Boeing 767 CAN NOT also be licensed to fly an Airbus until they complete appropriate training and have satisfied licensure for a "Type Rating" that even dictates what model aircraft a pilot can operate. Perhaps, with multiple companies creating multiple standards, California should adopt a system similar to the FAA's.

    Because of these issues, and the land grab over patents, the logs, performance data and decision trees the software takes should be open to review. It's absolutely important that the data is disclosed when an insurance claim is filed after an accident.

    More importantly, because the very definition of a vehicle operator is changing, this information should be available to the operator/owner as well. Local governments need to issue an appropriate test so an operator can be licensed for technology-assisted vehicles, possibly based on version.
    edited April 2017
  • Reply 11 of 14
    Soli Posts: 8,970
    1) What's up with that headline? Am I crazy or does it read the exact opposite of what Apple is proposing in the article?

    2) This is great and this is why I respect Apple
  • Reply 12 of 14
    gatorguy Posts: 20,742
    Soli said:
    1) What's up with that headline? Am I crazy or does it read the exact opposite of what Apple is proposing in the article?

    2) This is great and this is why I respect Apple
    Soli, I think Apple would prefer NOT to be required to publicly report certain failures as all companies testing autonomous vehicles on public highways are required to. 

    Example: A software failure of some type causes the human driver to take control of the vehicle but no accident or injury results.
       - Currently companies are required to report it as a "disengagement" and it becomes public record.
       - Under Apple's proposal they do not have to report it. No public disclosure.

    Example: A human driver makes the decision he needs to take control to avoid a possible "incident"
       - Currently they are required to report it as a "disengagement" and the driver has to make a report on what he believes would have happened had he not taken control. It becomes public record.
       - Under Apple's proposal it would not be reported as such. Further, Apple says any driver report on why he took control would be conjecture and thus inappropriate. No driver report on the incident would be filed for review, no public disclosure.

    Those are just two obvious changes that IMHO will lead to less transparency, quite the opposite of what Apple suggests as the reason for the changes they're requesting. Hiding what might be pertinent safety and/or reliability issues from consumers and potential riders is not the way to build trust in the system in my view. 

    No doubt every company testing self-driving cars would LOVE to avoid reporting some of their failures. It would make things look so much rosier (and perhaps they are). And if Apple gets its way, neither Google nor GM nor Audi nor BMW nor Didi will complain about it, because they won't have to report those things either. But it's not a benefit to the public, who I believe really should know both the good and bad, all of it, before committing themselves to trust in self-driving technologies.
    edited April 2017
  • Reply 13 of 14
    Soli Posts: 8,970
    gatorguy said:
    Soli said:
    1) What's up with that headline? Am I crazy or does it read the exact opposite of what Apple is proposing in the article?

    2) This is great and this is why I respect Apple
    Soli, I think Apple would prefer NOT to be required to publicly report certain failures as all companies testing autonomous vehicles on public highways are required to. 

    Example: A software failure of some type causes the human driver to take control of the vehicle but no accident or injury results.
       - Currently companies are required to report it as a "disengagement" and it becomes public record.
       - Under Apple's proposal they do not have to report it. No public disclosure.

    Example: A human driver makes the decision he needs to take control to avoid a possible "incident"
       - Currently they are required to report it as a "disengagement" and the driver has to make a report on what he believes would have happened had he not taken control. It becomes public record.
       - Under Apple's proposal it would not be reported as such. Further, Apple says any driver report on why he took control would be conjecture and thus inappropriate. No driver report on the incident would be filed for review, no public disclosure.

    Those are just two obvious changes that IMHO will lead to less transparency, quite the opposite of what Apple suggests as the reason for the changes they're requesting. Hiding what might be pertinent safety and/or reliability issues from consumers and potential riders is not the way to build trust in the system in my view. 

    No doubt every company testing self-driving cars would LOVE to avoid reporting some of their failures. It would make things look so much rosier (and perhaps they are). And if Apple gets its way, neither Google nor GM nor Audi nor BMW nor Didi will complain about it, because they won't have to report those things either. But it's not a benefit to the public, who I believe really should know both the good and bad, all of it, before committing themselves to trust in self-driving technologies.
    After reading your post I re-read the article. I was focusing on Apple's quoted statement, "Apple believes that public acceptance is essential to the advancement of automated vehicles. Access to transparent and intuitive data on the safety of the vehicles being tested will be central to gaining public acceptance," and the comment about Apple wanting it "tightened up," which I interpreted to mean more structure with transparency.

    I now recant the 2nd comment in my previous post. I'm all for more transparency, and I don't think that "disengagement" reporting will have any long-term effects on this technology. The biggest hurdle is likely going to be from luddites who believe that they're better active drivers than an autonomous vehicle with dozens of active sensors and redundant systems. These people will eventually die, to paraphrase Jobs, and the next generation will understand the safety and efficiency aspects of autonomous vehicles with less irrational trepidation.

    While I don't think Apple's request is wholly unrealistic, I do think that less transparency will likely lead to more suspicion. I'd personally prefer to have all the data about software failures available to everyone, not just when there's an accident. What does the FAA do when there's a SW issue with a commercial jetliner that is still competently flown by its pilots? Do they keep it a secret?
  • Reply 14 of 14

    I have to say I disagree with Apple on this one.  

    Reporting "disengagements", irrespective of the reason, is good.  Analysis of the disengagement, whatever its cause and however innocuous, will lead to better software and fewer disengagements for any reason.  The "pilot's" speculation about possible consequences of failing to disengage is also good, as not only will it lead to better software, it might also lead to insights into the psychology around autonomous vehicles and result in software changes that make people more comfortable with the technology.

    For example (and this is just an example; I have no idea whether or not anybody has incorporated something like this into any current software), if the pilot disengages because he sees another driver acting "weirdly", but not strangely enough for the software to pick it up, that might be a very subjective or intuitive judgment on the part of the driver, but it offers a chance to investigate improving the software to better detect such things.

    And while I trust Apple not to ignore those innocuous disengagements even if they aren't reported, I definitely do not trust most of the other entities working on this technology to do the same.

    I'm also aware that some will report these disengagements as failures, even when they are nothing of the sort.  Hiding them is not the answer to that problem, in my opinion.
