NHTSA closes investigation into fatal Tesla Autopilot accident, says no defect to blame

Posted in Future Apple Hardware
Eliminating a potential obstacle to the development of self-driving technology such as Apple's "Project Titan," the U.S. National Highway Traffic Safety Administration said on Thursday that there was no evidence a fatal Tesla accident in May was caused by an autonomous driving system defect demanding a recall.

The victim, Joshua Brown, was killed while using the semi-autonomous Autopilot system in his Model S, Reuters noted. In July, the National Transportation Safety Board found that Brown's car was traveling 74 miles per hour in a 65 mph zone when it smashed into a semi-truck in Florida. At the time, the weather was dry, with good visibility.

When fully engaged, Autopilot can keep a car driving down a highway with minimal interaction from its owner, keeping its distance from other vehicles and even changing lanes. However, some owners have been caught on camera abusing the technology, barely avoiding accidents as a result of carelessness.

This led to Tesla imposing new restrictions on the technology, including temporarily disabling it if people don't respond to warnings to assume control. More recent models have been upgraded with better sensors, processors, and software, and should in theory be capable of full self-driving once Tesla finishes testing and development.

Had the NHTSA ordered a recall, it could have derailed not just Tesla's self-driving efforts but those of other companies. As a result of a recall, safety measures would likely have had to be re-evaluated, and the NHTSA would probably have introduced new regulations.

Apple's own car efforts, known as Project Titan, are currently in limbo. While the company was once allegedly working on its own design, the Titan team has reportedly narrowed its focus to a self-driving platform until late 2017, at which point it will decide whether to resume its own design work or partner with a third-party automaker.

The project has also had to cope with prominent departures amid a hiring war between Apple and Tesla. Recently the head of Apple's Swift programming language team, Chris Lattner, joined Tesla to lead Autopilot development.

One ace up Apple's sleeve may be VP Lisa Jackson's seat on the U.S. Department of Transportation's automated vehicle committee, giving the company more government influence. No Tesla executive is in the group, despite Tesla being a leader in the field.

Comments

  • Reply 1 of 28
smalm Posts: 605 member
So a Tesla killing its driver works as designed?

  • Reply 2 of 28
    I applaud NHTSA's decision. (I don't own, nor do I plan to own, a Tesla. Nor its stock.)
  • Reply 3 of 28
slurpy Posts: 4,679 member
    smalm said:
So a Tesla killing its driver works as designed?

    Uh, apparently the driver had 7 full seconds to take action to prevent the accident, and he was distracted that whole time, likely watching a movie. I wouldn't blame the car for this one. 
  • Reply 4 of 28
YvLy Posts: 15 member
"... there was no evidence a fatal Tesla accident in May was caused by an autonomous driving system defect ... " Huh? Come again?
  • Reply 5 of 28
    "Had the NHTSA ordered a recall, that could have derailed not just Tesla's self-driving efforts but those of other companies. As a result of a recall safety measures would likely have to be re-evaluated, and the NHTSA would probably introduce new regulations."

    So, re-evaluating safety measures and potentially enacting new regulations intended to reduce fatalities is a negative thing? We as a society will not be able to pat ourselves on the back for our greatness unless self-driving cars are introduced without pesky delays ... plus it was just one dude, after all! </s>
  • Reply 6 of 28
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
  • Reply 7 of 28
    flootist said:
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
Yes, the system was unable to deal with stupidity. If you are sinking in quicksand and I throw you a rope that you do not grasp, is it my fault you die?
  • Reply 8 of 28
ktappe Posts: 686 member
    flootist said:
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
The system is not marketed as perfect. Drivers are clearly told that there are situations where the system will be overwhelmed, and at those times the system will request that the driver take over. This guy ignored those warnings and ignored the system's request to take over. And somehow you are still blaming Tesla instead of the inattentive driver?
  • Reply 9 of 28

    "there was no evidence a fatal Tesla accident in May was caused by an autonomous driving system defect demanding a recall."

    What a world we live in when autonomous driving systems are demanding their own recalls.  Come on Car, get your mind back on driving!  Make your demands on your own time!

  • Reply 10 of 28
The article refers to Apple as being a leader in the field. Apple has not made or sold a single car, and the only telematics technology it has released is CarPlay. Not sure how that makes them anything but a wannabe.
  • Reply 11 of 28
    flootist said:
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
Tesla clearly marketed Autopilot as driver assistance and essentially a beta. Not sure how someone not using something as directed implies liability.
  • Reply 12 of 28
    flootist said:
    Whether it was a 'defect' or not, clearly the system wasn't capable of dealing with what was presented to it. Must be nice, being able to bribe one's way out of manslaughter.
Tesla clearly marketed Autopilot as driver assistance and essentially a beta. Not sure how someone not using something as directed implies liability.
    Well, this is the US - Apple's getting sued for texting drivers causing accidents. The only thing you need for a lawsuit is a lawyer with a pulse. And the pulse might be optional. Either way, this is exactly the point - Tesla did not market this as full autopilot and even advised drivers not to use it as such, IIRC. 

On a different note, there was a guy in the Netherlands who was saved from an accident when his Tesla braked automatically after sensing a collision before he could see it. On balance, I'd be willing to bet that autonomous driving systems would save more lives than they would lose. That being the case, should we be willing to accept some mistakes for the overall benefit?

    http://www.cnbc.com/2016/12/28/tesla-autopilot-appears-to-predict-accident-in-front-of-it.html
edited January 19
  • Reply 13 of 28
The article refers to Apple as being a leader in the field. Apple has not made or sold a single car, and the only telematics technology it has released is CarPlay. Not sure how that makes them anything but a wannabe.
    No, it refers to Tesla as being a leader in the field.  Not Apple.
  • Reply 14 of 28
glynh Posts: 49 member
So the fact that the Tesla Autopilot was in control and driving at 74 mph in a 65 mph zone, i.e. speeding, is a normal function of the software? I understand the driver had apparently set cruise control to 74 mph two minutes before the collision, so I guess the software cannot override the driver even when he is breaking the law?

    I also remember reading this at the time of the accident, quote; "Against a bright spring sky, the car’s sensors system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said"

    Now that seems like a defect of sorts if it doesn't know the difference between a hulking great articulated trailer and the sky!

    And the current NHTSA report states, quote "The report also notes that Automatic Emergency Braking (AEB) systems, like those installed in the Tesla at the time of the crash, are not designed to avoid “left turn across path” collisions. Instead, the system is designed primarily to avoid rear-end collisions."

Well, that's OK then...

edited January 19
  • Reply 15 of 28
    glynh said:
    The fact that the Tesla Autopilot was in control & driving at 74mph in a 65mph zone i.e. speeding is a normal function of the software then? I understand the driver had apparently set cruise control to 74mph two minutes before the collision so I guess the software cannot override the driver even when he is breaking the law?

    I also remember reading this at the time of the accident, quote; "Against a bright spring sky, the car’s sensors system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said"

    Now that seems like a defect of sorts if it doesn't know the difference between a hulking great articulated trailer and the sky!

    And the current NHTSA report states, quote "The report also notes that Automatic Emergency Braking (AEB) systems, like those installed in the Tesla at the time of the crash, are not designed to avoid “left turn across path” collisions. Instead, the system is designed primarily to avoid rear-end collisions."

Well, that's OK then...

If I set cruise control at 150 mph, can I blame the car manufacturer when I lose control? Or when I drive manually at 74 mph in a 65 mph zone? Tesla's Autopilot doesn't use GPS and doesn't take speed limits into account, which can change without notice; that's the driver's job. It's an assistant, not an autonomous car.
  • Reply 16 of 28
mazda 3s Posts: 1,477 member
    slurpy said:
    smalm said:
So a Tesla killing its driver works as designed?

    Uh, apparently the driver had 7 full seconds to take action to prevent the accident, and he was distracted that whole time, likely watching a movie. I wouldn't blame the car for this one. 
IIRC, he was watching Harry Potter. Not saying that he should have died, but he should have been paying attention. My 2017 Subaru Outback Limited has EyeSight with adaptive cruise control. It is flippin' amazing, but I still keep my eyes on the road.
  • Reply 17 of 28
stompy Posts: 288 member
This incident reminded me of the riveting story of Air France 447. Cars will be better drivers sometime soon, but until then there are going to be accidents; this won't be the last time the driver is at fault for failing to take over for an "autopilot".
  • Reply 18 of 28
maestro64 Posts: 3,268 member

Not sure if anyone watched the movie Sully. This sounds a lot like what happened there.

The government was looking for any reason to blame the human for not doing the right things. In most accident investigations, they tend to fall back on human error. It would not have been good for the industry if the investigation had found fault with the car or its systems. Not saying the driver wasn't an idiot here, but I think the car should not have allowed the driver to kill himself, especially if the purpose of the system was self-driving.

I've said this before: before I would own a car that drives itself, I would want to know the car's prime directive. Does it protect my life at all costs, or does it do something stupid like let you drive into a truck?

  • Reply 19 of 28
NHTSA also went further and reported that accidents (at least those severe enough to trigger an airbag) are reduced by nearly 40% for Teslas equipped with Autopilot, from 1.3 crashes per million miles to 0.8 per million. Not "left turn across path" stuff quite yet, but injury-level accidents. This will only get better over time as new subsystems (better-interpreted radar, sonar, vision, etc.) kick in.

    https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/
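For what it's worth, the "nearly 40%" figure follows directly from the two crash rates in the report; a quick sanity check:

```python
# Sanity-check the "nearly 40%" reduction cited from the NHTSA report:
# airbag-deployment crash rates per million miles, before and after
# Autopilot (values taken from the report as quoted above).
rate_before = 1.3
rate_after = 0.8

reduction = (rate_before - rate_after) / rate_before
print(f"Crash rate reduction: {reduction:.1%}")  # prints 38.5%
```

Which rounds up to the "nearly 40%" headline number.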
  • Reply 20 of 28
    smalm said:
So a Tesla killing its driver works as designed?

    Huge fail. Get someone experienced to write your material.