Apple says there's no single 'silver bullet' behind crash detection
Following issues like rollercoasters triggering crash detection on the new iPhone 14, Apple executives have been revealing more about how it works.

There have been real crashes as well as the erroneous rollercoaster reports, and independent testing has shown that crash detection won't always work. Now two of Apple's executives have told TechCrunch how crash detection works, and why there can be these failures or false positives.
"It's mostly the G Force detection [of the new gyroscope and accelerometer," said Kaiann Drance, Apple's vice president of Worldwide iPhone Product Marketing. "It's able to detect G Force up to 256 Gs. That was one of the key differences for the new accelerometers that the new watches and phones have."
Ron Huang, Apple's vice president of Sensing & Connectivity, said: "It started off with our fundamental understanding of what is experienced during a crash."
"In these crashes, you see impact forces over 100[Gs]," continued Huang. "We started around 256. Any time you try to increase that range, there are trade-offs, in terms of precision at the higher range and the power costs. It took the team a lot of work to build the sensors in this way."
But beyond the improved gyroscope and accelerometer, Apple says that there are many other sensors that are involved. It's the combination of information from the sensors that ultimately triggers the crash detection feature.
While Apple did not detail all of the different sensors, Huang stressed that it has to be a combination effort -- and that which sensors are combined will change depending on the situation.
"There's no silver bullet, in terms of activating crash detection," he explains. "It's hard to say how many of these things have to trigger, because it's not a straight equation."
"Depending how fast the traveling speed was earlier, determines what signals we have to see later on, as well," continued Huang. "Your speed change, combined with the impact force, combined with the pressure change, combined with the sound level, it's all a pretty dynamic algorithm."
Huang and Drance said that, for instance, barometric pressure changes in a crash, but if a car's windows are open, the change is too small to be a significant factor. Equally, there are many ways of detecting that you're driving, such as how the Wi-Fi routers the device sees will change "very rapidly -- faster than if you're walking or biking."
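As a rough way to picture the "pretty dynamic algorithm" Huang describes, here is a minimal sketch of multi-signal gating. The signal names, thresholds, and weights are invented for illustration; Apple has not published its actual logic.

struct SensorSnapshot {
    var peakImpactG: Double        // peak acceleration magnitude, in g
    var speedDropMPS: Double       // drop in estimated speed, in m/s
    var pressureDeltaHPa: Double   // cabin pressure change, in hPa
    var peakSoundDB: Double        // peak sound level, in dB
    var wasDrivingRecently: Bool   // output of a separate "in a car?" check
}

// Invented thresholds and weights; no single cue can trigger on its own.
func looksLikeCrash(_ s: SensorSnapshot) -> Bool {
    guard s.wasDrivingRecently else { return false }

    var score = 0.0
    if s.peakImpactG > 50 { score += 0.4 }       // severe deceleration spike
    if s.speedDropMPS > 15 { score += 0.3 }      // abrupt loss of speed
    if s.pressureDeltaHPa > 2 { score += 0.2 }   // pressure jump (weaker with windows open)
    if s.peakSoundDB > 110 { score += 0.1 }      // crash-level sound

    // Several independent cues have to agree before the feature fires.
    return score >= 0.6
}

The real system is clearly more elaborate, but the shape -- weighted, context-dependent evidence rather than one threshold -- matches the executives' description.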
Drance says that the company does not want "to be doing a lot of false calls to 911 when they're not necessary." She reports having had a minor collision that did not prompt the crash detection feature.
"My crash detection did not go off, because it's just one of those minor things where you just get out of your car and keep going," she said. "That's part of the sensor fusion and accuracy."
Huang and Drance also explained more about what happens when crash detection is triggered and the iPhone or Apple Watch tries to call for help. They first "attempt to dial" over the owner's network, then if that's not available, "we will try to route to any other available carrier."
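A minimal sketch of that fallback order, using hypothetical type and function names rather than any real Apple API:

enum EmergencyCallRoute {
    case homeCarrier
    case otherCarrier(name: String)
}

// Try the owner's own network first, then any other reachable carrier.
func placeEmergencyCall(otherCarriers: [String],
                        dial: (EmergencyCallRoute) -> Bool) -> Bool {
    if dial(.homeCarrier) { return true }
    for carrier in otherCarriers {
        if dial(.otherCarrier(name: carrier)) { return true }
    }
    return false
}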

Kaiann Drance, Apple's vice president of Worldwide iPhone Product Marketing.
Neither Apple executive would be drawn on the differences in accuracy between the iPhone 14 and the Apple Watch Series 8, but both acknowledged that they are not the same.
"There are differences," said Huang. "[Apple] Watch is on your wrist, and the kind of impact you see on your wrist during a crash will be very different."
"There are those differences, but, for example, barometer is very similar with the iPhone and Watch," he continued. "So there are differences based on how the devices are used, placed or worn."
Apple has previously released an overview video explaining how crash detection works.
Read on AppleInsider

Comments
You're right. My bad. I misread. My eyes saw "location of road" and I read "local road". I can still edit my post. Thanks.
Devices are not going to be strapped to the seat the way the driver of a road car is, so there will be some extra factors to take into account, but the team at Apple should probably talk to a number of motor racing people. F1 cars are most likely to have a better sensor suite, but I'm sure there's a huge corpus of data from the last 20 years across several formulae that has been collected by the racing teams and by the safety and medical teams. Working with car manufacturers wouldn't hurt, either: maybe the on-board electronics can be pinged for impact-related events like the airbags deploying.
There's a huge amount that can be detected from the environment at this stage, so I expect rapid improvement of this feature. A very simple heuristic would be to listen for screeching tyres and determine if the momentum of the device has changed (especially if the device is now motionless); for a phone, you might be able to detect if it's been dropped (a severe crash might knock the phone from its mount). No single measurement will determine if a crash has occurred, but I anticipate that it will be a rather small set of measurements once the refinement has happened.
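For what it's worth, that kind of "small set of measurements" heuristic could be sketched like this, with invented event names and timings:

import Foundation

enum CrashCue {
    case tyreScreech(at: TimeInterval)
    case suddenDeceleration(at: TimeInterval)
    case deviceMotionless(at: TimeInterval)
}

// Crash-like only if a screech and a hard deceleration precede stillness
// within a short window, in roughly that order. Timings are made up.
func matchesCrashSequence(_ cues: [CrashCue], windowSeconds: TimeInterval = 5) -> Bool {
    var screechAt: TimeInterval?
    var decelAt: TimeInterval?

    for cue in cues {
        switch cue {
        case .tyreScreech(let t):
            screechAt = t
        case .suddenDeceleration(let t):
            decelAt = t
        case .deviceMotionless(let t):
            if let s = screechAt, let d = decelAt,
               s <= d, d <= t, t - s <= windowSeconds {
                return true
            }
        }
    }
    return false
}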
Take the track that a coaster runs on. It’s designed to be very smooth so that the cars move freely. If you have a slight imperfection (say a dent or offset in the track) the cars could be subjected to a very high g-force/direction change for a split second while the car passes.
Have you ever ridden a subway or passenger train when they change tracks? The train car rocks side-to-side from the horizontal change in direction. Imagine that at a much higher speed.
I bet that while the park owners find it amusing their coasters triggered a “car crash” alert, the ride engineers are probably thinking of running a few tests on the track to make sure everything is within spec.
I did some research on the Pixel's crash detection (in use since 2019 on the Pixel 3 and up) and found several instances of 911 activation and saved lives, but not many reports of accidental activation over the years. I'd be pretty sure Google has had to tweak its algorithm too since the feature rolled out, as there's a fine line between registering a legitimate accident with likely injuries and over-aggressively reporting an accident that didn't happen. Give it a few months and I bet roller-coasters won't still be triggering it on your new iPhone 14 either.
- They’re not Apple and don’t generate traffic with clickbait headlines. Basically nobody cares.
- They sell a tiny fraction of devices compared to the iPhone so even false triggers at the same ratio would represent far fewer total reports.
Most of the transient noise classification studies that I've seen in other domains like antisubmarine warfare and gunfire detection have incorporated at least some sort of training or learning model involving both sources that are considered positives and those that are false positives. If Apple is continuously capturing a snapshot of the pre-triggered sensor data they can potentially improve the accuracy and reduce the false alarm rate of the feature as more incidents occur.
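A toy version of that feedback loop, with invented types and a deliberately crude threshold sweep standing in for a real learning model:

struct TriggerSnapshot {
    var features: [Double]    // e.g. peak g, speed drop, sound level
    var confirmedCrash: Bool  // user confirmed the SOS call vs. cancelled it
}

// Score every labeled snapshot, then sweep candidate thresholds and keep
// the one that best separates confirmed crashes from false alarms.
func retunedThreshold(from snapshots: [TriggerSnapshot],
                      score: ([Double]) -> Double) -> Double {
    let scored = snapshots.map { (value: score($0.features), isCrash: $0.confirmedCrash) }
    var best = (threshold: 0.0, correct: -1)

    for candidate in scored.map({ $0.value }) {
        let correct = scored.filter { ($0.value >= candidate) == $0.isCrash }.count
        if correct > best.correct {
            best = (threshold: candidate, correct: correct)
        }
    }
    return best.threshold
}

Whatever Apple actually uses is surely far richer, but the point stands: labeled positives and false positives are what let the trigger criteria improve over time.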