Google to reportedly partner with Ford on self-driving vehicle project


Comments

  • Reply 21 of 26
    maestro64 Posts: 5,043 member

    I've said this before about self-driving cars: they really will not work unless all cars are self-driving, and even then there are still risks. Humans are allowed to make mistakes, but computers and companies are not.

    I saw this somewhere else so I will not take credit, but it sums up what I have been saying about these cars. I know Google claims its cars have never been in an accident, but it qualifies that statement by saying others have run into the car. Which brings up the question: will the Google car kill you to save someone else? Say someone crossing the road illegally, or, better yet, it attempts to avoid an accident, goes off the road, and hits a person on the side of the road. How about when a deer jumps into the middle of the road: will it hit the deer, as your insurance company says you should, or will it try to avoid the deer, crash you into a tree, and kill you? Who wins, the animal or you? I guess if the programmer is a PETA member, you may be dead.

    When the car kills you, who do you sue: the deer, Google who programmed the car, Ford who made the car, or maybe your local Ford dealer?

  • Reply 22 of 26
    I think it is a smart move for Google. Partnering with an established automaker will have huge benefits in getting the technology on the road. Shame on Ford for milking oil profits for so many years and sacrificing innovation. They should have had a fully electric car 10 years ago and should already be rolling out an autonomous version.
    They've had a fully-electric Focus available for a few years now. It's no Tesla, but it's something few other manufacturers offer. 
  • Reply 23 of 26
    mac_128 Posts: 3,454 member
    maestro64 said:

    I've said this before about self-driving cars: they really will not work unless all cars are self-driving, and even then there are still risks. Humans are allowed to make mistakes, but computers and companies are not.

    I saw this somewhere else so I will not take credit, but it sums up what I have been saying about these cars. I know Google claims its cars have never been in an accident, but it qualifies that statement by saying others have run into the car. Which brings up the question: will the Google car kill you to save someone else? Say someone crossing the road illegally, or, better yet, it attempts to avoid an accident, goes off the road, and hits a person on the side of the road. How about when a deer jumps into the middle of the road: will it hit the deer, as your insurance company says you should, or will it try to avoid the deer, crash you into a tree, and kill you? Who wins, the animal or you? I guess if the programmer is a PETA member, you may be dead.

    When the car kills you, who do you sue: the deer, Google who programmed the car, Ford who made the car, or maybe your local Ford dealer?

    That's going to be the biggest stumbling block for autonomous vehicles -- assessing liability. More lives will be saved because the car can react faster than people can, but in some cases there's going to be collateral damage, within the scope of what the software allows the car to do, that might not have occurred had a human been in control. All of those scenarios are going to have to be encoded into the software, and no designer will ever be able to second-guess every conceivable situation before it happens.

    Say you're driving down the road at 80 mph and a girder falls off the truck in front of you. There's no possible way to avoid it without changing lanes, and even then, unless the car knows precisely how the girder is going to land, how does it maneuver to protect the occupants and the other vehicles? And what if there's a car in every lane? These are real scenarios that will have to be addressed. And forget passengers -- what does the car do when it's driving without any passengers, say to pick up its owner after recharging during a concert?

    My favorite conundrum: the front tire hits a pothole and blows out. If the car goes left, a nun dies; if it goes right, a Boy Scout dies; if it goes straight, the occupant dies; if it stops, both the occupant and the following vehicle's occupants die. So what does the car do? At the end of the day, the software is telling the car how to react, so clearly the software company is at the top of the liability list. And what happens if the software steers the vehicle such that it is damaged and physically unable to comply with the software's instructions? Who's at fault then?

    I really don't see these cars appearing on the road until the insurance companies, state and federal governments, law enforcement, and car manufacturers all come up with an acceptable plan.
    edited December 2015
  • Reply 24 of 26
    stevie Posts: 956 member
    This is nothing.  The Apple car will come out and it will be 5 years ahead of everybody else.
  • Reply 25 of 26
    gatorguy Posts: 24,772 member
    stevie said:
    This is nothing.  The Apple car will come out and it will be 5 years ahead of everybody else.
    Ah, there ya go. Apple is skating to where the puck will be. (You left that part out)
  • Reply 26 of 26
    Okay, somebody remind me: why is it that I want a self-driving car? I understand the desirability of certain high-tech features - active braking for collision avoidance, adaptive cruise control, blind spot monitoring - but I thought I enjoyed driving a car. There just seems to be so much potential for problems. I mean, no software is ever released with bugs or security issues, right? And no one has ever been misdirected by their navigation? With my luck, my car will get stuck in a software loop and I'll be stuck in traffic trying to reboot it.