NTSB lays partial blame on Apple for fatal Tesla crash involving employee


Comments

  • Reply 41 of 57
DAalseth Posts: 2,783 member
    FWIW. I don't really blame Tesla either. 
A driver or pilot is responsible for knowing the capabilities of their vehicle and operating within those limits. If someone skids off a snowy road in a car that has summer tires, you don't blame Ford, you blame the driver for not having the proper tires on his car. If someone takes a 2WD Corvette and gets stuck in the mud, it isn't Chevy's fault; it was the driver's responsibility to know that the car wasn't designed for those conditions. And if a driver doesn't bother to RTFM, read the warning placards, and know the limits of the car's "Autopilot," then the onus is on him.

    My dad used to talk about an accident that happened when he lived in California, in the late '50s or early '60s. A guy got a full-sized van with cruise control. He got out on the highway, turned on cruise, and then went into the back of the van to get something. The van of course ran off the road and flipped multiple times; the driver was killed and the van destroyed. Apparently the guy thought that cruise control was a fully autonomous autopilot. Accident investigators did not blame the van or the guy's employer; they blamed the driver, who was an idiot.

    The guy driving this car was an idiot and all the blame sits squarely on him.
  • Reply 42 of 57
MplsP Posts: 3,925 member

    Tesla car with feature improperly marketed as “autopilot”.....  let’s blame Apple. 
    Actually, the "autopilot" branding is consistent with its usage and capabilities in aircraft, which is where the term came from-- it is NEVER intended as autonomous pilotage, in aircraft or cars.  It is an unfortunate thing that people inaccurately equate "autopilot" with "autonomy."

    Pilots NEVER stop flying the plane while autopilot is engaged- they are constantly monitoring systems, forward progress, and traffic outside.  Automobiles demand the same level of attention while on autopilot.

    Every time you turn Autopilot on in a Tesla it tells you to keep your eyes on the road and that you remain responsible for the vehicle's actions.  It is unfortunate that this person decided to do something stupid instead.

I don’t know if that was the case at the time of this accident, but Teslas currently require periodic driver input to ensure that drivers are paying attention. Also, I haven’t read the manual for a Tesla, but as All-Purpose Guru posted, they tell you to pay attention. I suspect this is more an issue of people’s beliefs about Teslas’ capabilities than of what the company actually states.
  • Reply 43 of 57
tmay Posts: 6,328 member
    https://www.thedrive.com/news/32369/you-dont-own-a-self-driving-car

    "You cannot buy a self-driving car today," he said. "You don’t own a self-driving car, so don’t pretend you do... This means that when driving in the supposed ‘self-driving’ mode, you can’t read a book, you can’t watch a movie or TV show, you can’t text and you can’t play video games."

    [...] “It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars,” he said.

  • Reply 44 of 57
Mork Posts: 22 member

    Pilots NEVER stop flying the plane while autopilot is engaged- they are constantly monitoring systems, forward progress, and traffic outside.  Automobiles demand the same level of attention while on autopilot.

    Every time you turn Autopilot on in a Tesla it tells you to keep your eyes on the road and that you remain responsible for the vehicle's actions.  It is unfortunate that this person decided to do something stupid instead.

Correct.

    As an airline pilot I have flown numerous aircraft. The B747-400 I flew has three autopilots, four electrical systems, and four hydraulic systems, all with multiple backups, and we are trained to be vigilant and prepared to take over from the automatic systems instantly if required. Remember Air France 447: a simple erroneous input from a system caused chaos, and the relatively inexperienced crew were not trained to recover from it, as the manufacturer thought such events were not possible.

Tesla states the ‘Autopilot’ must be monitored; blaming either Tesla or, ridiculously, Apple for this tragic accident is wrong.

    I have been based in a few different countries, some of which have quite draconian laws. If, say, Apple is taken to task for not having a policy regarding PED use, especially outside work hours, then we are headed down a dangerous path of extreme over-regulation, just like the ridiculous message on a coffee cup stating the contents may be hot. Maybe some people will be required to wear a sign reading ‘Lying when mouth moving.’ Not a bad idea.

People need to take responsibility for their own actions, not play the blame game or demand government action.

My condolences to this person’s family; I’m sure they are looking for answers. Never fully trust automatics. Ask Sarah Connor.


  • Reply 45 of 57
mknelson Posts: 1,125 member
    That’s one heck of a reach by the NTSB. “Apple failed to prevent a driver from using bad judgement.”
    You're using quotes for something that isn't a quote.

    The NTSB actually said that Apple doesn't have a policy against employees using a phone while driving. That's different from enforcement.
    tmay
  • Reply 46 of 57
tmay Posts: 6,328 member
    MplsP said:

    Tesla car with feature improperly marketed as “autopilot”.....  let’s blame Apple. 
    Actually, the "autopilot" branding is consistent with its usage and capabilities in aircraft, which is where the term came from-- it is NEVER intended as autonomous pilotage, in aircraft or cars.  It is an unfortunate thing that people inaccurately equate "autopilot" with "autonomy."

    Pilots NEVER stop flying the plane while autopilot is engaged- they are constantly monitoring systems, forward progress, and traffic outside.  Automobiles demand the same level of attention while on autopilot.

    Every time you turn Autopilot on in a Tesla it tells you to keep your eyes on the road and that you remain responsible for the vehicle's actions.  It is unfortunate that this person decided to do something stupid instead.

I don’t know if that was the case at the time of this accident, but Teslas currently require periodic driver input to ensure that drivers are paying attention. Also, I haven’t read the manual for a Tesla, but as All-Purpose Guru posted, they tell you to pay attention. I suspect this is more an issue of people’s beliefs about Teslas’ capabilities than of what the company actually states.
I suspect it actually has more to do with Elon's statements, especially on Twitter, with Tesla sales associates, and with Tesla fan sites than it does with Tesla manuals, literature, or even marketing material. The fact that Tesla vehicles have very weak tracking of driver attention (no eye tracking, for example), and that there are devices on the market to help defeat even the small wheel movements that are the basis of attention tracking, is what is going to put Tesla in legal jeopardy.
    edited February 2020
  • Reply 47 of 57
peteo Posts: 402 member
    • tmay said:
• The fact that Tesla vehicles have very weak tracking of driver's attention (no eye tracking as an example), and that there are devices on the market to help defeat even the small wheel movements that are the basis of attention tracking, is what is going to put Tesla in legal jeopardy.
I thought you had to keep your hands on the wheel or the car would disengage the Autopilot.
  • Reply 48 of 57
tmay Posts: 6,328 member
    peteo said:
    • tmay said:
• The fact that Tesla vehicles have very weak tracking of driver's attention (no eye tracking as an example), and that there are devices on the market to help defeat even the small wheel movements that are the basis of attention tracking, is what is going to put Tesla in legal jeopardy.
I thought you had to keep your hands on the wheel or the car would disengage the Autopilot.
That's how it is supposed to function, but I've seen enough videos of Tesla drivers hands-off, or asleep, to believe that Tesla's steering-wheel torque input either isn't effective or is being defeated by third-party devices that are available to randomly torque the steering wheel. The fact that the driver in this crash was hands-off is a likely indicator, but the NTSB actually has the data showing the times of hands-off operation.

    The NTSB said that while data couldn't determine if Huang was holding his phone just before the accident happened, “the Tesla car log data showed that there was no driver-applied steering wheel torque, indicating the hands were likely off the steering wheel during this time period.”

    https://www.mercurynews.com/2020/02/25/tesla-driver-was-playing-video-game-at-time-of-fatal-crash-ntsb/

    This is why Tesla will likely be in legal jeopardy for some accidents where the driver was found to be inattentive; Tesla's Autopilot attention system is technically weak, where many other similar systems employ more expensive eye tracking.
  • Reply 49 of 57
Amazing. If you are using an autopilot on an aircraft, there is always someone monitoring the systems. Why would a car be any different?

Apple could push out a profile to enforce "Do Not Disturb While Driving," but I haven't checked whether that's an option.
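    For what "pushing out a profile" would even look like: configuration profiles are just XML property lists, so here is a minimal sketch built with Python's standard plistlib. The `com.example.dnd.driving` payload type and its `Enabled` key are hypothetical; to my knowledge Apple's public MDM documentation exposes no managed setting for Do Not Disturb While Driving, so this illustrates only the profile plumbing, not a real restriction.

    ```python
    # Sketch of a .mobileconfig-style configuration profile as an XML plist.
    # The inner PayloadType and "Enabled" key are hypothetical placeholders;
    # only the outer "Configuration" envelope follows the documented format.
    import plistlib

    profile = {
        "PayloadContent": [
            {
                "PayloadType": "com.example.dnd.driving",  # hypothetical payload
                "PayloadIdentifier": "com.example.dnd.driving.payload",
                "PayloadUUID": "11111111-2222-3333-4444-555555555555",
                "PayloadVersion": 1,
                "Enabled": True,  # hypothetical setting
            }
        ],
        "PayloadDisplayName": "DND While Driving (illustrative)",
        "PayloadIdentifier": "com.example.dnd.driving.profile",
        "PayloadType": "Configuration",
        "PayloadUUID": "66666666-7777-8888-9999-000000000000",
        "PayloadVersion": 1,
    }

    # Serialize to the XML plist bytes an MDM server would deliver.
    blob = plistlib.dumps(profile)
    print(blob.decode())
    ```

    Whether iOS would honor such a payload is exactly the open question in the comment above; the sketch only shows that the delivery mechanism itself is mundane.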
  • Reply 50 of 57
    mknelson said:
    That’s one heck of a reach by the NTSB. “Apple failed to prevent a driver from using bad judgement.”
    You're using quotes for something that isn't a quote.

    The NTSB actually said that Apple doesn't have a policy against employees using a phone while driving. That's different from enforcement.
I wasn’t providing a real quote; I was summarizing the attitude of their comments. And the second line of your comment is completely outside the bounds of my own comment.
  • Reply 51 of 57
Rayz2016 Posts: 6,957 member
    gatorguy said:
    DAalseth said:
iPhones have a driving mode where they limit incoming messages and usage. Anything more than that, such as defeating the mode and playing a game while he’s supposed to be driving the damn car, is the driver's fault. Maybe they want iPhones to shut themselves OFF when in a moving vehicle?
    Last sentence:
    Apple's has incorporated such capabilities into its iOS mobile operating system with a "Do Not Disturb While Driving" option, but the feature is disabled by default.
    Mmm. Mine wasn’t.  The phone came with the feature turned on. 

    Does anyone know (not you obviously) if this is something they set depending on the country where the phone is sold?

    In any event, any publication that wants to get noticed simply dings Apple in an article. This is more of the same. 

    Sorry the chap died, but he was supposed to be looking where he was going. 
edited February 2020
  • Reply 52 of 57
The commenters don't understand at all what the NTSB is referring to. It has nothing to do with the iPhone and nothing to do with technology.

    Every company has to have Occupational Health and Safety (OHS) procedures. If you work in an office-based environment, one of the biggest risks for the employee is actually the commute to and from work. From an OHS standpoint, responsibility for an employee starts when they leave home and ends when they arrive back home. This is not a legal responsibility but an OHS best practice. Companies with a strong safety culture, for example big multinationals like ExxonMobil, recognise that, and their internal health and safety training of employees includes the commute to and from work. My previous company had a ban on calls while driving, even hands-free, mandated minimum safety features in company-owned cars (which was a problem in India), and required all employees who drove on average more than 20 hours per month for business purposes to take safe-driving courses every two years.

What the NTSB comments on is that the commute to and from work is not included in the OHS risk assessment for employees in the Apple OHS system. That is very valid feedback, and I am 100% sure it is being implemented.

    The fact that a lawyer will most likely exploit this for a lawsuit is a legal problem and not an NTSB problem.

  • Reply 53 of 57
    @Angmoh:
I am sympathetic to your argument; however, as you acknowledge, it's not a legal requirement for Apple to have an OHS policy mandating a particular set of activities. Apple has clearly decided that the existing laws cover the knowledge that employees need to possess when it comes to things such as drug (ab)use, driving motor vehicles, and a wide range of other activities that encompass the human condition.

    Apple has a reputation for going all in on something when it thinks it has a better way, and for not wasting the effort when it has determined that the status quo is acceptable.

The NTSB was grandstanding in this instance, and deserves to face public ridicule for using this particular (inappropriate) forum to advance an idea, whether or not the idea, in isolation, has merit.
  • Reply 54 of 57
    Sumwalt also laid blame on Apple, saying in a statement, "The driver in this crash was employed by Apple — a tech leader. But when it comes to recognizing the need for a company PED policy, Apple is lagging because they don't have such a policy."

What do performance-enhancing drugs have to do with distracted driving?

    And show me any company in the world that has a policy about how employees drive when they aren't on the clock.  I bet those NTSB employees are governed by no such policy.
    It is mentioned in Chairman Sumwalt's comments (as linked in the article) that the NTSB has a Personal Electronic Devices (PED) policy, one that came into force ten years ago.
  • Reply 55 of 57
dysamoria Posts: 3,430 member
    tzeshan said:
    How do you enforce without interfering personal freedom?
    Personal freedom? Irrelevant. This is the operation of a potentially lethal machine. That lethality is related to far more than the operator.

    This kind of autonomous driving feature failure is exactly the kind of stuff I expected when these started coming out. Software is not intelligent and never will be within any of our lifetimes. This is an overreach of insufficient technological capability. 
  • Reply 56 of 57
Gaby Posts: 190 member
I’m sorry, but this is ridiculous. What happened to personal responsibility? It is neither Apple’s job nor any other company’s to act as nanny to any citizen, nor should it be. The more people are enabled in this way, the less responsible they become for themselves. It’s a really slippery slope. And until such time as Tesla or the authorities explicitly designate self-driving features as Level 5 and fit for purpose, they shouldn’t be held accountable for individuals’ negligence at the wheel. Frankly, even then I think anyone with common sense should be attentive and ready to intervene should the need arise.
    edited February 2020
  • Reply 57 of 57
22july2013 Posts: 3,571 member
    The NTSB (and all government agencies) will get even more powerful under a Democrat administration which Apple and its staff seem bent on installing.