NHTSA closes investigation into fatal Tesla Autopilot accident, says no defect to blame
Eliminating a potential obstacle to the development of self-driving technology such as Apple's "Project Titan," the U.S. National Highway Traffic Safety Administration said on Thursday that there was no evidence a fatal Tesla accident in May was caused by an autonomous driving system defect demanding a recall.

The victim, Joshua Brown, was killed while using the semi-autonomous Autopilot system in his Model S, Reuters noted. In July, the National Transportation Safety Board found that Brown's car was traveling 74 miles per hour in a 65 zone when it smashed into a semi-truck in Florida. At the time weather was dry, with good visibility.
When fully engaged, Autopilot can keep a car driving down a highway with minimal interaction from its owner, keeping its distance from other vehicles and even changing lanes. However, some owners have been caught on camera abusing the technology, barely avoiding accidents as a result of carelessness.
This led to Tesla imposing new restrictions on the technology, including temporarily disabling it if people don't respond to warnings to assume control. More recent models have been upgraded with better sensors, processors, and software, and should in theory be capable of full self-driving once Tesla finishes testing and development.
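To illustrate the kind of escalation described above, here is a minimal sketch of a driver-attention monitor that warns and then temporarily disables assistance if the driver doesn't respond. This is purely illustrative and not Tesla's actual software; the thresholds, class, and method names are assumptions invented for the example.

```python
# Illustrative sketch only -- not Tesla's implementation.
# Thresholds and names are invented for the example.
from dataclasses import dataclass

WARNING_AFTER_S = 15.0      # assumed: seconds hands-off before a warning
DISENGAGE_AFTER_S = 30.0    # assumed: seconds of no response before lockout

@dataclass
class AttentionMonitor:
    hands_off_time: float = 0.0
    locked_out: bool = False

    def update(self, dt: float, hands_on_wheel: bool) -> str:
        """Return the action the assist system should take this tick."""
        if self.locked_out:
            return "assist_disabled_until_restart"
        if hands_on_wheel:
            self.hands_off_time = 0.0
            return "assist_active"
        self.hands_off_time += dt
        if self.hands_off_time >= DISENGAGE_AFTER_S:
            self.locked_out = True
            return "assist_disabled_until_restart"
        if self.hands_off_time >= WARNING_AFTER_S:
            return "warn_driver"
        return "assist_active"
```

Calling update(1.0, False) repeatedly walks the state from active assistance through warnings to a lockout that persists until restart, roughly mirroring the restriction described above.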
Had the NHTSA ordered a recall, it could have derailed not just Tesla's self-driving efforts but those of other companies: safety measures across the industry would likely have had to be re-evaluated, and the NHTSA would probably have introduced new regulations.
Apple's own car efforts, known as Project Titan, are currently in limbo. While the company was once allegedly working on its own design, the Titan team has reportedly narrowed its focus to a self-driving platform until late 2017, at which point it will decide whether to resume its own design work or partner with a third-party automaker.
The project has also had to cope with prominent departures, including a hiring war between Apple and Tesla. Recently the head of Apple's Swift programming team, Chris Lattner, joined Tesla to lead Autopilot development.
One ace up Apple's sleeve may be VP Lisa Jackson's seat on the U.S. Department of Transportation's automated vehicle committee, giving the company more government influence. No Tesla executive is in the group, despite Tesla being a leader in the field.

Comments
So, re-evaluating safety measures and potentially enacting new regulations intended to reduce fatalities is a negative thing? We as a society will not be able to pat ourselves on the back for our greatness unless self-driving cars are introduced without pesky delays ... plus it was just one dude, after all! </s>
"there was no evidence a fatal Tesla accident in May was caused by an autonomous driving system defect demanding a recall."
What a world we live in when autonomous driving systems are demanding their own recalls. Come on Car, get your mind back on driving! Make your demands on your own time!
On a different note, there was a guy in The Netherlands who was saved from an accident when his Tesla braked automatically for him when it sensed an accident before he could see it. On balance, I'd be willing to bet that autonomous driving systems would save more lives than they would lose. That being the case, should we be willing to accept some mistakes for the overall benefit?
http://www.cnbc.com/2016/12/28/tesla-autopilot-appears-to-predict-accident-in-front-of-it.html
I also remember reading this at the time of the accident, quote: "Against a bright spring sky, the car's sensors system failed to distinguish a large white 18-wheel truck and trailer crossing the highway, Tesla said"
Now that seems like a defect of sorts if it doesn't know the difference between a hulking great articulated trailer and the sky!
And the current NHTSA report states, quote "The report also notes that Automatic Emergency Braking (AEB) systems, like those installed in the Tesla at the time of the crash, are not designed to avoid “left turn across path” collisions. Instead, the system is designed primarily to avoid rear-end collisions."
Well that's OK then...
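For context on what "designed primarily to avoid rear-end collisions" tends to mean in practice, a common simplified approach is to brake when the time-to-collision (TTC) to a vehicle ahead drops below a threshold. The sketch below is a generic illustration under that assumption, not Tesla's actual logic; the threshold value and function name are invented.

```python
# Generic illustration of a TTC-based rear-end AEB check.
# Not Tesla's actual logic; the 1.5 s threshold is an assumption.
def should_emergency_brake(range_m: float, closing_speed_mps: float,
                           ttc_threshold_s: float = 1.5) -> bool:
    """Brake if the target ahead would be reached within the TTC threshold."""
    if closing_speed_mps <= 0.0:      # not closing on the target
        return False
    time_to_collision = range_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s

# Example: 30 m behind a target while closing at 25 m/s -> TTC = 1.2 s -> brake.
print(should_emergency_brake(30.0, 25.0))  # True
```

The key limitation is upstream of a check like this: an object the perception system never flags as an in-path braking target never reaches it at all, which is consistent with the report's note that AEB of that era was built around rear-end scenarios.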
Not sure if anyone watched the movie Sully. This sounds a lot like what happened there.
The government was looking for any reason to blame the human for not doing the right things. In most accident investigations they tend to fall toward human error all the time. It would not have been good for the industry if the investigation found fault with the car or the systems. Not saying the driver was not an idiot here, but I think the car should not have allowed the driver to kill himself, especially if the purpose of the system was self-driving.
I said this before: before I would own a car which drives itself, I would want to know the car's prime directive. Does it protect my life at all costs, or do something stupid like let you drive into a truck?
Teslas equipped with Autopilot went from 1.3 crashes per million miles to 0.8 per million. Not "left turn across path" stuff quite yet,
but injury-level accidents. This will only get better over time as new subsystems (better-interpreted radar, sonar, vision, etc.) kick in.
https://electrek.co/2017/01/19/tesla-crash-rate-autopilot-nhtsa/
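The relative improvement implied by those two figures is easy to check; a quick calculation, assuming the 1.3 and 0.8 crashes per million miles cited above:

```python
# Relative reduction implied by the crash rates quoted in the comment above.
before = 1.3   # crashes per million miles, pre-Autopilot
after = 0.8    # crashes per million miles, with Autopilot
reduction = (before - after) / before
print(f"{reduction:.1%}")  # ~38.5%
```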