Proposed California rules would require self-driving Apple car to have human backup during testing

Posted in General Discussion
Draft rules proposed by the California Department of Motor Vehicles would mandate that all self-driving vehicles -- including Apple's -- have a human driver at the helm at all times, among other stipulations.

The rules are still subject to a public comment period before any approval, Reuters noted on Wednesday. Companies would also have to pass certifications, including third-party testing, and submit regular reports to the DMV for three years, while also obtaining approval before collecting any data beyond what safety systems require.

Although imposing demands on carmakers, the rules are also meant to set out a firm direction for testing self-driving technology. The DMV's data will be used to guide future regulations.

At the moment, 11 companies have permits to test the technology on California roads, and all of them are already required to have human backups.

Any regulations will almost certainly affect Apple, which is believed to be working on a self-driving car for release in 2019 or 2020. Although Apple is likely to test in secrecy for as long as possible, any practical product will have to be put through government evaluation and real-world driving conditions, exposing it to the public early on.

Comments

  • Reply 1 of 29
    melgross Posts: 33,510 member
    What does the author mean by saying "including Apple"? Of course, including Apple. Why would Apple be special in any way?
  • Reply 2 of 29
    eightzero Posts: 3,063 member
    melgross said:
    What does the author mean by saying "including Apple"? Of course, including Apple. Why would Apple be special in any way?
    The journalist fails to make the obvious point explicit: if the rule is enacted, the records of meeting this requirement would be public. Thus, Apple would be confirming it is somehow working on making a self-driving car.

    But...I can then see Apple deciding not to drive it in CA for testing, so as not to need to comply.
  • Reply 3 of 29
    sflocal Posts: 6,093 member
    They can simply choose to do it outside of California (like New York?) where for now, they're not regulating that part, and if the kinks are worked out and deemed safe, I see no reason why California would need to keep that law.

    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
  • Reply 4 of 29
    jdw Posts: 1,334 member
    Which of course is safe, but still makes driverless cars a dream.  We need to eliminate silly driver's licenses.  These rules would retain them.  Furthermore, while companies wouldn't like it, liability in case of accidents with driverless cars with no driver at the helm would fall on the car company, rather than anyone inside the car.  The elimination of driver liability (since the car would be the driver) would be big and have a big impact on insurance that we "riders in cars" would or wouldn't have to pay.  THAT is the real future of driverless cars.
  • Reply 5 of 29
    entropys Posts: 4,163 member
    Of course it will need a human to take over. Otherwise driverless cars will become the new overlords via Skynet.
    Seriously though, it reminds me of those early laws that required someone to walk in front of a horseless carriage to warn the populace.  It would also make a potential taxi service/Uber disrupter using driverless cars not very disruptive. 
    edited December 2015
  • Reply 6 of 29
    melgross said:
    What does the author mean by saying "including Apple"? Of course, including Apple. Why would Apple be special in any way?

    Because otherwise, there's no article to write.....
  • Reply 7 of 29
    mac_128 Posts: 3,454 member
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
  • Reply 8 of 29
    melgross Posts: 33,510 member
    eightzero said:
    melgross said:
    What does the author mean by saying "including Apple"? Of course, including Apple. Why would Apple be special in any way?
    The journalist fails to make the obvious point explicit: if the rule is enacted, the records of meeting this requirement would be public. Thus, Apple would be confirming it is somehow working on making a self-driving car.

    But...I can then see Apple deciding not to drive it in CA for testing, so as not to need to comply.
    We can be sure that every other state that allows autonomous cars to be tested will follow this rule in their own regulations. It makes sense, after all. Does anyone really want these things on the road at this early point in time without a driver as backup? I sure don't!
  • Reply 9 of 29
    melgross Posts: 33,510 member
    mac_128 said:
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    The reality is that no car has ever been hacked. That is, the only cars that have been hacked have been owned by the researchers doing the hacking. No other car has ever been hacked. The researchers have had to get into the electronics physically to hack it. In one of the two cases, they needed a laptop connected to the diagnostic port when the car was in use in order to control it. This is another situation where people are being frightened by something that hasn't happened in reality, and is very unlikely to happen.
    sflocal said:
    They can simply choose to do it outside of California (like New York?) where for now, they're not regulating that part, and if the kinks are worked out and deemed safe, I see no reason why California would need to keep that law.

    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    There have been a lot of problems with these cars. Google has software engineers riding in every car. When odd situations come up, and I'll name a couple, they can get out of trouble right away, and even do some on the fly reprogramming.

    A problem was a guy standing on a corner wearing a t-shirt saying STOP. The various sensors in the car couldn't distinguish whether this was a real stop sign or not, and refused to go.
     
    Another problem was with another guy on a bicycle at a corner. As many bikers do, he was moving back and forth a bit on his bike waiting for the light to change. The car had the right of way, but just moved forward an inch when the bike moved backwards, and stopped when it moved forwards. The car went nowhere.

    There have been many problems like these two. They didn't result in accidents, but they did result in the cars going nowhere, and holding up traffic.

    edited December 2015
  • Reply 10 of 29
    sflocal said:
    They can simply choose to do it outside of California (like New York?) where for now, they're not regulating that part, and if the kinks are worked out and deemed safe, I see no reason why California would need to keep that law.

    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    I'm betting it's because a looming future without driver licenses being issued by the state results in hundreds of millions in government revenue disappearing. California is awash in regulations which favor unions and government employees, and every punitive tax or law enacted in California serves another special interest (almost never the taxpayers). It's like the Soviet Union with sunshine and beaches here.
    edited December 2015
  • Reply 11 of 29
    mac_128 said:
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    Yes, blindly put your trust into Apple. 
  • Reply 12 of 29
    raz0r
    melgross said:
    mac_128 said:
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    The reality is that no car has ever been hacked. That is, the only cars that have been hacked have been owned by the researchers doing the hacking. No other car has ever been hacked. The researchers have had to get into the electronics physically to hack it. In one of the two cases, they needed a laptop connected to the diagnostic port when the car was in use in order to control it. This is another situation where people are being frightened by something that hasn't happened in reality, and is very unlikely to happen.
    sflocal said:
    They can simply choose to do it outside of California (like New York?) where for now, they're not regulating that part, and if the kinks are worked out and deemed safe, I see no reason why California would need to keep that law.

    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    There have been a lot of problems with these cars. Google has software engineers riding in every car. When odd situations come up, and I'll name a couple, they can get out of trouble right away, and even do some on the fly reprogramming.

    A problem was a guy standing on a corner wearing a t-shirt saying STOP. The various sensors in the car couldn't distinguish whether this was a real stop sign or not, and refused to go.
     
    Another problem was with another guy on a bicycle at a corner. As many bikers do, he was moving back and forth a bit on his bike waiting for the light to change. The car had the right of way, but just moved forward an inch when the bike moved backwards, and stopped when it moved forwards. The car went nowhere.

    There have been many problems like these two. They didn't result in accidents, but they did result in the cars going nowhere, and holding up traffic.

    So, if your neighbor pisses you off, you can stand at the corner of his house with a "Detour" shirt and have all incoming cars park in his back yard. The future of pranking looks bright indeed :D
  • Reply 13 of 29
    pmz Posts: 3,433 member
    It truly amazes me how out of touch the blogosphere is with reality when it comes to this topic.

    idiotic statements like this one:

    jdw said:
     We need to eliminate silly driver's licenses.  

    ...are just an example of how people that follow articles like this are completely on another planet from the rest of the world, where driverless cars are total fantasy, not happening in this lifetime, and would not be welcomed by 90% of people that drive.

    It is worth working on from a technical perspective, but for the real world it is DOA for at least this next generation.


  • Reply 14 of 29
    beltsbear
    melgross said:
    mac_128 said:
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    The reality is that no car has ever been hacked. That is, the only cars that have been hacked have been owned by the researchers doing the hacking. No other car has ever been hacked. The researchers have had to get into the electronics physically to hack it. In one of the two cases, they needed a laptop connected to the diagnostic port when the car was in use in order to control it. This is another situation where people are being frightened by something that hasn't happened in reality, and is very unlikely to happen.

    Of course cars have been hacked.  This is way beyond research as well.  In the article below Jeeps were hacked with NO PHYSICAL access via the airwaves.  Doors have been unlocked and this has been used in actual cases of theft. 

    http://www.bloomberg.com/news/articles/2015-07-31/hacked-jeep-cherokee-exposes-weak-underbelly-of-high-tech-cars

  • Reply 15 of 29
    melgross Posts: 33,510 member
    beltsbear said:
    melgross said:
    mac_128 said:
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    The reality is that no car has ever been hacked. That is, the only cars that have been hacked have been owned by the researchers doing the hacking. No other car has ever been hacked. The researchers have had to get into the electronics physically to hack it. In one of the two cases, they needed a laptop connected to the diagnostic port when the car was in use in order to control it. This is another situation where people are being frightened by something that hasn't happened in reality, and is very unlikely to happen.

    Of course cars have been hacked.  This is way beyond research as well.  In the article below Jeeps were hacked with NO PHYSICAL access via the airwaves.  Doors have been unlocked and this has been used in actual cases of theft. 

    http://www.bloomberg.com/news/articles/2015-07-31/hacked-jeep-cherokee-exposes-weak-underbelly-of-high-tech-cars

    Door locking mechanisms have been hacked for a couple of decades, and that has nothing to do with the type of actual hacking we're being told about today.
  • Reply 16 of 29
    raz0r said:
    melgross said:
    mac_128 said:
    sflocal said:
    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    The reality is that no car has ever been hacked. That is, the only cars that have been hacked have been owned by the researchers doing the hacking. No other car has ever been hacked. The researchers have had to get into the electronics physically to hack it. In one of the two cases, they needed a laptop connected to the diagnostic port when the car was in use in order to control it. This is another situation where people are being frightened by something that hasn't happened in reality, and is very unlikely to happen.
    sflocal said:
    They can simply choose to do it outside of California (like New York?) where for now, they're not regulating that part, and if the kinks are worked out and deemed safe, I see no reason why California would need to keep that law.

    I think it's inevitable that driverless cars will be the norm.  I think California is just thinking that these cars will run amok and start running over pedestrians.  Surely, the self-drive folks have implemented safeguards to prevent such a thing?
    There have been a lot of problems with these cars. Google has software engineers riding in every car. When odd situations come up, and I'll name a couple, they can get out of trouble right away, and even do some on the fly reprogramming.

    A problem was a guy standing on a corner wearing a t-shirt saying STOP. The various sensors in the car couldn't distinguish whether this was a real stop sign or not, and refused to go.
     
    Another problem was with another guy on a bicycle at a corner. As many bikers do, he was moving back and forth a bit on his bike waiting for the light to change. The car had the right of way, but just moved forward an inch when the bike moved backwards, and stopped when it moved forwards. The car went nowhere.

    There have been many problems like these two. They didn't result in accidents, but they did result in the cars going nowhere, and holding up traffic.

    So, if your neighbor pisses you off, you can stand at the corner of his house with a "Detour" shirt and have all incoming cars park in his back yard. The future of pranking looks bright indeed :D
    I like your thinking. Reminds me of a Wile E. Coyote vs. Road Runner Looney Tunes cartoon.

    Or as long as pedestrians always wear clothes with STOP sign logos on the front and back we should all be safe.
  • Reply 17 of 29
    eightzero Posts: 3,063 member
    I enjoyed a picture someone had of a Google car stopped on a 2-lane road with a double yellow. The car in front of the Google car was parked illegally, blocking the lane. Now what? A human driver breaks the law by going over the double yellow. A Google car can't be so programmed. It would sit there forever.

    My prediction: we will see self driving trucks on specific, established long haul routes to special depots on interstate highways first. Union condemnation in 3...2...1...
    edited December 2015
  • Reply 18 of 29
    mac_128 Posts: 3,454 member
    melgross said:
    beltsbear said:
    Of course cars have been hacked.  This is way beyond research as well.  In the article below Jeeps were hacked with NO PHYSICAL access via the airwaves.  Doors have been unlocked and this has been used in actual cases of theft. 

    http://www.bloomberg.com/news/articles/2015-07-31/hacked-jeep-cherokee-exposes-weak-underbelly-of-high-tech-cars

    Door locking mechanisms have been hacked for a couple of decades, and that has nothing to do with the type of actual hacking we're being told about today.
    I don't know; this Car & Driver blog seems to disagree. 

    http://blog.caranddriver.com/can-your-car-really-be-hacked-six-points-to-know/

    In particular, the Miller/Valasek Jeep Cherokee wireless hack this past summer seems to prove a hack of this nature, which takes over the entire vehicle, is indeed possible. 

    EDIT: just read the linked Bloomberg article ... why are you focusing on the door locking mechanisms? They took over and shut the entire car down wirelessly ... and whether or not it has been successfully accomplished before now, I'm not sure anyone can trust that such a thing won't be possible in the future. The only way to prevent it is to put a physical barrier between the CAN bus and the infotainment system. And autonomous vehicle manufacturers don't seem to be interested in doing that, since piloting the car without a driver requires input from information gleaned from the web. So it's a concern that must be addressed. 
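    To make the "barrier" idea concrete, here's a minimal, purely illustrative sketch of gateway-style filtering between the two buses. The message IDs and payloads are invented for the example, not taken from any real vehicle:

    ```python
    # Illustrative only: a gateway that forwards frames from the infotainment
    # side to the drive (CAN) bus only if their message ID is on a whitelist.
    # IDs and payloads below are made up for the sketch.

    ALLOWED_IDS = {0x244, 0x3E9}  # e.g. harmless read-only broadcasts

    def filter_to_drive_bus(frames):
        """Drop any infotainment-originated frame whose ID isn't whitelisted."""
        return [(can_id, data) for can_id, data in frames if can_id in ALLOWED_IDS]

    incoming = [
        (0x244, b"\x00\x42"),  # whitelisted: passed through
        (0x1F0, b"\xFF\xFF"),  # unknown ID (e.g. an injected command): dropped
        (0x3E9, b"\x10"),      # whitelisted: passed through
    ]
    print(filter_to_drive_bus(incoming))
    ```

    A real gateway would be far more involved (stateful checks, rate limiting, signed firmware), but the point stands: with a filter like this, the drive bus never sees arbitrary traffic from the connected side.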
  • Reply 19 of 29
    Prof_Peabody
    melgross said:
    mac_128 said:
    Yes. But it's not the manufacturers we have to worry about. It's the hackers. And the reality is in tests where regular cars have been hacked with drivers behind the wheel, the driver is usually powerless to do anything. That's the real concern with autonomous vehicles. And to that end, I'd put my trust in Apple, which is why you don't put backdoors into a system that has to be secure.
    The reality is that no car has ever been hacked. That is, the only cars that have been hacked have been owned by the researchers doing the hacking. No other car has ever been hacked. The researchers have had to get into the electronics physically to hack it. In one of the two cases, they needed a laptop connected to the diagnostic port when the car was in use in order to control it. This is another situation where people are being frightened by something that hasn't happened in reality, and is very unlikely to happen.
    There have been a lot of problems with these cars. Google has software engineers riding in every car. When odd situations come up, and I'll name a couple, they can get out of trouble right away, and even do some on the fly reprogramming.

    A problem was a guy standing on a corner wearing a t-shirt saying STOP. The various sensors in the car couldn't distinguish whether this was a real stop sign or not, and refused to go.
     
    Another problem was with another guy on a bicycle at a corner. As many bikers do, he was moving back and forth a bit on his bike waiting for the light to change. The car had the right of way, but just moved forward an inch when the bike moved backwards, and stopped when it moved forwards. The car went nowhere.

    There have been many problems like these two. They didn't result in accidents, but they did result in the cars going nowhere, and holding up traffic.

    They are very rarely published (because we all want to enjoy the illusion that self-driving cars are "just around the corner"), but there are many, many spectacular fails involved with testing self-driving cars.  Despite all the ink spent on self-driving cars, almost no one bothers to mention that they are a long way from being ready for prime time, and that the whole venture may even be impossible from the start.  

    Currently, self-driving cars simply don't work.  Period.  That is, at least, if by "working" we mean that they can drive themselves in the same way and with the same level of safety and accuracy that the average half-drunk high-school student can.  A simple downpour (heavy rain) completely stymies most self-driving cars.  As does darkness, fog, etc. The sensors simply can't keep up, and the cars can't tell where they are in such conditions, or where the road is.  A car that only works on a sunny day on a clear highway in California might be exciting to California residents, but not to anyone else.  

    Not only that, but the more one looks into self-driving cars, the more obvious it becomes that it's essentially a dead end technology, because the problems that need to be overcome are almost completely intractable.  

    The easiest (and possibly the only) solution to the problems self-driving cars face, is simply embedding a strip of metal in the middle of the road for the car to "follow," so that's likely what will eventually be done.  It's cheap and it solves all the problems.  But even though we might still refer to them as such, they won't be "self-driving" cars after that will they?  
  • Reply 20 of 29
    Prof_Peabody said:
    Despite all the ink spent on self-driving cars almost no one bothers to mention that they are a long way from being ready for prime time and that the whole venture is likely even impossible from the start.  
    “WHAT DO YOU MEAN; THERE’S GOING TO BE ONE RELEASED IN 2017 AND TESLA HAS ONE NOW YOU’RE LYING!” is the only response I ever get when I try to tell people this.