NHTSA closes investigation into fatal Tesla Autopilot accident, says no defect to blame


Comments

  • Reply 21 of 28
    glynh said:
    So the fact that Autopilot was in control and driving at 74 mph in a 65 mph zone, i.e. speeding, is a normal function of the software? I understand the driver had apparently set cruise control to 74 mph two minutes before the collision, so I guess the software cannot override the driver even when he is breaking the law?

    I also remember reading this at the time of the accident, quote; "Against a bright spring sky, the car’s sensors system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said"

    Now that seems like a defect of sorts if it doesn't know the difference between a hulking great articulated trailer and the sky!

    And the current NHTSA report states, quote "The report also notes that Automatic Emergency Braking (AEB) systems, like those installed in the Tesla at the time of the crash, are not designed to avoid “left turn across path” collisions. Instead, the system is designed primarily to avoid rear-end collisions."

    Well, that's OK then...


    Regarding speeding: I think Tesla has stayed out of the business of enforcing laws, and leaves that up to the driver.

    As others have said, Tesla has made it clear that the driver has to stay involved while driving - at this point in time the system doesn't handle all situations.  It works remarkably well, but if the driver doesn't stay alert, then they have failed, not the car.  Tesla's approach of fleet learning has given far more gains than the approach Google has taken.  You can quickly tell which roads have been well traveled by Teslas and which ones haven't - and, as Tesla has said, you need to be alert when traveling on the less traveled roads.
    edited January 2017
  • Reply 22 of 28

    "there was no evidence a fatal Tesla accident in May was caused by an autonomous driving system defect demanding a recall."

    What a world we live in when autonomous driving systems are demanding their own recalls.  Come on Car, get your mind back on driving!  Make your demands on your own time!

    New Teslas should automatically cut any entertainment system, any phone calls etc. when the need arises :p
  • Reply 23 of 28
    birko Posts: 60 member
    2old4fun said:
    flootist said:
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
    Yes, the system was unable to deal with stupidity. If you are sinking in quicksand and I throw you a rope that you do not grasp, then is it my fault you die?
    I used to supervise a large team of people doing repetitive computer work. There were a lot of different steps and a lot of little things that they needed to be aware of - a lot like driving. 

    At our request, a new system was built to automate a lot of the tasks. Wonderful system that did all we asked. Problem was, QC revealed almost a 200% increase in the error rate over the old system. It turned out we had made it too easy, and workers were finding it harder to maintain focus. We took out some of the automation and the error rate decreased.

    The human mind seems to need a minimum amount of stimulation to maintain focus. This may be an inherent limit of our nervous system and something that needs to be taken into account by automated car designers. One day automated cars will be safer than human drivers, but getting there needs to be done carefully.
  • Reply 24 of 28
    macmikey3 Posts: 2 unconfirmed, member
    The article refers to Apple as being a leader in the field. Apple has neither made nor sold a single car, and the only telematics technology it has released is CarPlay. Not sure how that makes them anything but a wannabe.
    They meant no one from Tesla, a leader in the field: "No Tesla executive is in the group, despite it being a leader in the field." You can stop being an Apple hater now. ;-)
  • Reply 25 of 28
    evilution Posts: 1,399 member
    If you are in the driver's seat, you are driving the car. Watching Harry Potter on a portable DVD player is not driving a car.
    Autopilot is not autonomous driving; you still need to pay attention and back up the car should it not spot something.
  • Reply 26 of 28
    MplsP Posts: 3,911 member
    Saw this article that gives some more information on the 'auto pilot' system and the crash.
    Among the details:
    • The driver set the cruise control to 74 mph
    • The system was an early generation system that wasn't designed to detect a vehicle turning left in front of you (I think the term 'auto pilot' is being generous for such a limited system.)
    • The system used both visual and radar sensors and required agreement between the two to initiate automatic braking.
    • Both the system and Tesla as a company state that it requires the 'continual and full attention' of the driver
    • There is no confirmation that the driver was watching Harry Potter, but the truck driver involved in the accident stated he was
    In general, it would appear that the driver was speeding, ignoring the directions for the system, and using it to do things for which it wasn't designed. I think the biggest issue is that a limited system breeds complacency.

    http://www.theverge.com/2017/1/19/14326604/tesla-autopilot-crash-driver-seven-seconds-inattentive-nhtsa
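    The agreement requirement described above - braking only fires when the camera and radar both detect an obstacle - can be sketched as a simple voting gate. This is a hypothetical illustration, not Tesla's actual implementation (the real system fuses continuous sensor tracks, not booleans), but it shows why a target missed by either sensor never triggers braking:

    ```python
    # Hypothetical sketch of a two-sensor agreement gate for automatic
    # emergency braking (AEB). Illustrative only; not Tesla's code.

    def aeb_should_brake(camera_detects_obstacle: bool,
                         radar_detects_obstacle: bool) -> bool:
        """Brake only when BOTH sensors agree an obstacle is present.

        Requiring agreement reduces false positives (e.g. radar returns
        from overhead signs), but it also means a target missed by either
        sensor never triggers braking.
        """
        return camera_detects_obstacle and radar_detects_obstacle

    # The crossing-trailer scenario: even if the radar registered a
    # return, a camera failure to classify the white trailer against a
    # bright sky means the gate stays closed and no braking occurs.
    print(aeb_should_brake(camera_detects_obstacle=False,
                           radar_detects_obstacle=True))
    ```

    In other words, the design trades false alarms for missed detections, which is consistent with the NHTSA note that this generation of AEB was aimed at rear-end collisions rather than crossing traffic.
    
    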
  • Reply 27 of 28
    asdasd Posts: 5,686 member
    ktappe said:
    flootist said:
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
    The system is not being marketed as perfect. Drivers are clearly told that there are situations where the system will find itself overwhelmed, and at those times the system will request that the driver take over. This guy ignored those warnings and ignored the system's request to take over. And somehow you are still blaming Tesla instead of the inattentive driver?
    How will this work for fully driverless cars, though?

    A cynic would suggest that this stuff is a beta test of self-driving; unfortunately, crashes in this code can be fatal.
  • Reply 28 of 28
    asdasd Posts: 5,686 member
    To my mind, any assistance that takes over but hands back control when it is in trouble is useless and counterproductive. In these cases the driver should always have their hands on the wheel and be in control.