iPhone 14 won't always detect when you're in a crash
A new test of the Crash Detection feature in iPhone 14 models and the Apple Watch shows when it successfully triggers, and when it does not.

[Image: the second vehicle used in the iPhone 14 Crash Detection test]
In a video from the Wall Street Journal, a demolition derby driver repeatedly drove into two different vehicles. He wore an Apple Watch Ultra and had an iPhone 14 model in the car, along with a Google Pixel, which can also detect crashes.
An iPhone 14 and a Google Pixel were also placed side by side in each target vehicle.
Testing showed that the Apple Watch worn by the driver successfully detected the crashes. However, the iPhone inside the first target car didn't detect any of the impacts.
The Pixel did detect one crash inside the driver's vehicle, but neither phone triggered inside the car that was crashed into.
With the second car, the testers drove it around before crashing it so the smartphones would register that they were in a moving vehicle. But again, the crash did not trigger either phone.
When WSJ contacted Apple and Google with the results, Apple said a lack of data could be why the iPhone 14 didn't detect the crashes. The phones weren't connected to CarPlay, and the vehicles may not have traveled far enough before the impact.
Other crashes
Another YouTube channel tested the iPhone 14 Crash Detection feature on September 21. During that test, the iPhone did successfully detect the crash.
And on September 13, Apple published a video showing how the feature works. It relies on the high dynamic range gyroscope, accelerometer, GPS, barometer, and microphone.
Using that data, the device then uses complex algorithms to detect when a serious car crash has occurred. Crash Detection works on the iPhone 14, iPhone 14 Pro, second-generation Apple Watch SE, Apple Watch Series 8, and the Apple Watch Ultra.
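Apple hasn't published the algorithm itself, so any concrete version is guesswork, but the description suggests a sensor-fusion gate: one violent signal alone shouldn't be enough. Here is a minimal sketch in Swift of what that kind of threshold fusion could look like. The struct, its field names, and every threshold are invented for illustration; none of this is Apple's actual logic.

// Hypothetical snapshot of the sensor readings Apple says the feature uses.
struct CrashSignals {
    let peakAccelerationG: Double   // high-g accelerometer
    let rotationRateRadS: Double    // high dynamic range gyroscope
    let speedBeforeKmh: Double      // GPS speed prior to the event
    let pressureSpikeHPa: Double    // barometer (cabin pressure jump from airbags)
    let peakSoundLevelDb: Double    // microphone
}

// A naive fusion: require a violent impact plus at least two corroborating
// signals, so that a dropped phone alone does not look like a crash.
func looksLikeSevereCrash(_ s: CrashSignals) -> Bool {
    guard s.peakAccelerationG > 50 else { return false }   // no impact spike, no crash
    var corroboration = 0
    if s.speedBeforeKmh > 40 { corroboration += 1 }    // was moving at driving speed
    if s.pressureSpikeHPa > 5 { corroboration += 1 }   // airbag-like pressure jump
    if s.peakSoundLevelDb > 110 { corroboration += 1 } // crash-level noise
    if s.rotationRateRadS > 10 { corroboration += 1 }  // vehicle spun or rolled
    return corroboration >= 2
}

// A hard hit while stationary, with windows open (like the parked target car):
// the impact spike is there, but the corroboration is not.
let parkedCarHit = CrashSignals(peakAccelerationG: 60, rotationRateRadS: 2,
                                speedBeforeKmh: 0, pressureSpikeHPa: 1,
                                peakSoundLevelDb: 105)
print(looksLikeSevereCrash(parkedCarHit)) // false

A gate of this shape would also be consistent with the junkyard results: a stationary target car produces the impact spike but little of the corroborating evidence of driving.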
Comments
My AW catches some of my falls, but mistakes other things for falls, like hitting my hand on something. My dad once crashed an aircraft into trees. It folded the wings back flat (though he walked away). The Emergency Locator Transmitter mandated and certified by the FAA never went off. That’s the way it is with things made by human hands.
Unless there are testing standards and unambiguous criteria for evaluating marketing speak, running your own “tests” using whatever methods you choose is not going to yield anything more than anecdotal information.
It would be nice to know more details about Apple’s implementation of “crash detection” so we better understand the feature’s inherent limitations. Is it strictly based on accelerometers, on audio spectral analysis, or on a combination of the two? If it uses audio and is listening for glass breakage, a demo derby car would be a poor evaluation platform, because the glass is always removed from those vehicles. If it is looking for a rapid deceleration spike, how does the phone’s mount or stowage location in the vehicle impact this detection method? Is it directionally dependent, i.e., front impact vs. side impact vs. rollover?
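On the mounting question: if the detector keys on the magnitude of the acceleration vector rather than on a single axis, orientation shouldn’t matter for the impact spike itself (classifying front vs. side impact is another story). A toy Swift calculation, with made-up numbers, shows the distinction:

// Magnitude of the acceleration vector is the same no matter how the
// phone is oriented; a single-axis reading is not.
func gForceMagnitude(x: Double, y: Double, z: Double) -> Double {
    (x * x + y * y + z * z).squareRoot()
}

// The same hypothetical 55 g frontal impact, measured by a phone mounted
// upright versus one lying flat in a cupholder: different axes, same magnitude.
print(gForceMagnitude(x: 55, y: 0, z: 0)) // 55.0
print(gForceMagnitude(x: 0, y: 0, z: 55)) // 55.0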
Having some insight into a feature’s implementation would help users understand the feature’s limitations so they aren’t placing too much trust in the feature. Users need to recognize that the safety feature may not work in certain situations. Most of these newfangled safety features in cars and phones are pretty good and can be helpful, even lifesaving, but none of them are perfect.
Additionally, knowing the limitations of a feature can help avoid false alarms. For example, you may want to take off your Apple Watch while you’re working on your blacksmithing hobby or hammering nails. I once had a glass breakage sensor that would trigger an alarm whenever I hung pictures within the detection range of the sensor. Lots of people have issues with motion detectors getting triggered by pets.
So the crash detection system, which was designed to detect what you experience INSIDE a crashing car (close-proximity breaking glass, crunching metal, airbags going off, plus severe G-forces), didn’t experience any of that. The phones in the target car heard and felt what you’d experience if a crash occurred in front of you but you weren’t involved.
Big surprise they didn’t register a crash. /s
Take airbags as an example. The software inside an airbag computer uses multiple inputs to decide if an airbag should deploy: vehicle speed, whether the seatbelt is latched, direction of impact, seat position, throttle, brakes, and, most important of all, data programmed into it about the vehicle itself.
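Purely as an illustration of that multi-input gating, here’s a toy Swift version. Real airbag controllers run on vehicle-specific calibration data, and every name and threshold below is made up:

// All names and thresholds are invented; real controllers are
// calibrated per vehicle model.
struct ImpactEvent {
    let vehicleSpeedKmh: Double
    let decelerationG: Double
    let impactAngleDeg: Double   // 0 = head-on
    let seatbeltLatched: Bool
    let seatOccupied: Bool
}

func shouldDeployFrontAirbag(_ e: ImpactEvent, calibrationThresholdG: Double) -> Bool {
    guard e.seatOccupied else { return false }              // empty seat: never deploy
    guard abs(e.impactAngleDeg) < 30 else { return false }  // front bags are for frontal hits
    // Unbelted occupants are often given a lower deployment threshold.
    let threshold = e.seatbeltLatched ? calibrationThresholdG
                                      : calibrationThresholdG * 0.8
    return e.vehicleSpeedKmh > 25 && e.decelerationG > threshold
}

The point isn’t these particular numbers; it’s that no single input decides anything, which is exactly why a bystander can’t judge from outside whether non-deployment was a failure.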
We regularly had people filing defect reports with Transport Canada saying their airbags didn’t deploy when they were in a crash. Not sure why since they didn’t suffer anything beyond a very minor injury. Probably looking to blame the manufacturer and get a payout.
Another example: lots of new cars have safety features like automatic braking or lane keeping assist. One customer of a luxury company (can’t give the details) complained that it wasn’t working. They never had an accident, they just thought it should have activated during a close call while driving. They complained because the manufacturer told them there was no way to test the system. Like airbags it relies on numerous inputs and algorithms to make a decision to intervene. The company even has a policy for technicians that prohibits them from testing the system by driving as it would require highly aggressive driving to get the system to intervene. All you can do is scan the vehicle for any faults detected in a sensor/system. No faults means it’s considered functional.
Bottom line: amateurs and YouTubers aren’t in any position to test these systems or draw any conclusions from their tests other than they’re not qualified.
Pretty impressive that Apple’s algorithm wasn’t fooled by these fake crashes.
These staged crashes are not the real world, boss.
If it were so easily triggered, emergency services would be dialed all the time.
Sorry, I doubt Apple is that sloppy.
From your comment, I can’t tell if you watched the entire video, but your statement pretty much captures the level of rigor Stern employed in these staged crashes.
Quotes below are Stern’s, transcribed from the video.
“To find out if Apple’s new car crash detection really works, you just have to crash some cars.”
Fair enough. But you might want to understand the limitations of the tests before you jump in.
After a couple of crashes, the phones and watches in the stunt vehicle successfully began the process of calling emergency services; the phones in the target vehicle did not.
Stern posited that their test was flawed: “See, I figured that there were a few things off with the Taurus, notably that it wasn’t being driven, so the phones didn’t think they were in a car. Plus, the windows were open when the airbags deployed. This time, before the crash, to help the phones recognize they were in a moving vehicle, we drove the van around first.”
She failed to say whether any of the phones THEN positively acknowledged being in a moving vehicle. (This is pretty easy to confirm if you have a Driving Focus set up.)
The phones in the stopped vehicle continued to fail to call emergency services. At this point, no additional crashes were attempted. Afterwards, Stern contacted Apple and Google:
Apple: “The testing conditions in the junkyard didn’t provide enough signals to the iPhone to trigger the feature in the stopped cars. It wasn’t connected to Bluetooth or CarPlay which would have indicated that the car was in use, and the vehicles might not have travelled enough distance prior to the crash to indicate driving.”
A Google spokesman said that “The algorithm first needs to detect that you’re driving. Then the feature can detect crashes.”
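Both quotes describe the same two-stage design: crash detection is gated behind a “you are driving” state. Sketched roughly in Swift, with all states, names, and thresholds assumed rather than taken from either company:

// Stage 1 decides whether the phone is in a moving car at all;
// stage 2 (crash detection) only runs once stage 1 has latched.
enum MotionState {
    case idle
    case driving(traveledKm: Double)
}

struct CrashDetector {
    var state: MotionState = .idle

    // Returns true only when a crash is detected.
    mutating func update(speedKmh: Double, distanceKm: Double, impactG: Double) -> Bool {
        switch state {
        case .idle:
            // Must first look like driving: sustained vehicle speed.
            if speedKmh > 20 { state = .driving(traveledKm: 0) }
            return false  // no crash check until driving is established
        case .driving(let km):
            let traveled = km + distanceKm
            state = .driving(traveledKm: traveled)
            // Only now does an impact count, and only after enough
            // distance to rule out a spurious "driving" reading.
            return traveled > 1.0 && impactG > 50
        }
    }
}

// The junkyard target cars never leave .idle, so even a violent
// hit returns false.
var detector = CrashDetector()
print(detector.update(speedKmh: 0, distanceKm: 0, impactG: 60)) // false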
She wanted to “crash some cars?” Great, she did it. She wanted “To find out if Apple’s new car crash detection really works?” She didn’t REALLY find out, now did she?