Tesla wants Apple's help to beat Autopilot death lawsuit
Tesla wants Apple to testify in an upcoming wrongful death lawsuit over its Autopilot feature, to help prove that the Apple engineer behind the wheel was playing a game instead of paying attention.
Tesla faces a trial on April 8 over a fatal 2018 crash involving its Autopilot feature, and whether that software is at fault for the accident. However, lawyers representing the family of the deceased driver, Apple engineer Wei "Walter" Huang, have accused Apple of working with Tesla to help its case.
A pretrial motion filed earlier this week says Apple was "engaging in a secret discovery work-around to help support Tesla in its defense of the pending case," reports The Verge.
In its initial decision on the crash, the U.S. National Transportation Safety Board determined that the 2018 accident in Mountain View, California was caused by Autopilot on a Tesla Model X failing to recognize an obstacle, resulting in a 71 mile-per-hour crash into a highway barrier. The initial impact was followed by collisions with two other cars, as well as a fire from a battery breach.
At the time, the NTSB determined that an iPhone used by Huang had a strategy game as its frontmost app at the time of the crash. However, log data wasn't enough to determine whether he was actively playing the game during the crash itself.
The logs did indicate there was a "pattern of active game play" that coincided with morning commute hours, while data transmissions before the impact were "consistent with online game activity." The family says that Huang had the game running passively.
Ultimately, the NTSB reported that the data was "not specific enough to ascertain whether the Tesla driver was holding the phone at the time of the crash."
A new declaration
The family is unhappy about a decision by Tesla to submit a declaration from Apple engineer James Harding, stating that Apple had determined Huang was actively playing a game at the time of the crash.
Furthermore, Tesla and Apple are accused of "trying to circumvent the discovery process" by using Harding's word as testimony instead of a deposition. The declaration was also submitted five months after the end of the discovery period, which meant that the family's lawyers couldn't question Harding before the trial itself.
The lawyers have subpoenaed Apple for more information about the declaration. Apple responded in March that the lawyers were seeking "a substantial volume of Apple's privileged material."
In its application to quash the subpoena, Apple said it is not a party to the case, and hadn't received any notice of entry of order for the dispute. "While Apple is ready to work in good faith with the Parties and to fulfill its obligations as a non-party witness, it is very unclear on its present obligations and seeks guidance from the Court," its lawyers wrote.
The declaration could be crucial to Tesla's defense strategy, as the car maker believes that Autopilot is safe to use, and that it only becomes a danger when drivers are not paying attention to the road.
Comments
999 times out of 1000 i’ll probably side with whoever is accusing tesla of something bad, but this is that 1/1000 for me. dude literally knew better than to trust autopilot there and got yeeted into a wall at 71mph for it.
On the one hand, fuck Tesla and Musk's lies about Autopilot's capabilities, but on the other, autopilot doesn't mean the driver doesn't have to be aware, present, etc. Airplanes have had autopilot for over a century and yet we're still not flying airplanes without pilots in them. Well, except for the drone category and those are controlled by a remote operator. I'm assuming that if a commercial airliner crashed and the NTSB discovered that it was because the pilot was playing games you wouldn't be defending the pilot with a comment like "define 'auto'?"
I love all the automation I have in my car, which allows for more relaxed and safer driving, but I understand that I'm still required to have situational awareness with those automated systems in place inside my automobile. Anything less is just asking to be a Darwin Award nominee.
Haa Haa Haa.......
To everyone criticizing Tesla, I suggest you take a minute to read the manual (and the on screen warnings) for autopilot. They are very clear that the system requires supervision. The system also has various safeguards that attempt to ensure attentiveness, although they’ve been strengthened in recent years so they likely weren’t as robust at the time of the accident in question.
The FAA requires far more training and requires pilots to actually exhibit intelligence, common sense and judgment prior to being licensed. As opposed to most ‘licensed’ drivers on the road.
Of course, you'll say that you didn't mean those things and/or that they don't count (somehow), but they absolutely do. They are autonomous systems — denoting or performed by a device capable of operating without direct human control — that have prevented a great deal of damage and minimized the loss of life, even if it means that the younger generations will never understand how and why a brake can lock up and cause your vehicle to skid without functional control as a result.
You're naming driver-assistance features, not driving substitution. Of course we're not going to sleep, but we do play games instead, text, make FaceTime calls, etc. Autopilot was always supposed to need no intervention ,) It's soooooooo boring to supervise all the time while having none of the joy of *driving* ,)
Do you think those autonomous systems that I mentioned are bad for drivers, or do you see the benefit of them for road safety? If you think that even one of them is a good thing, then your stated premise is woefully inaccurate.