Apple seeks 911 dispatcher feedback over Emergency SOS skier misfires
Apple is starting to collect information from dispatchers working in 911 call centers about incorrect calls from skiers' iPhones triggered by Emergency SOS and Crash Detection.

Crash Detection
The introduction of the Crash Detection feature for the iPhone 14 and Apple Watch has led to a rise in calls to emergency services, with false positives triggering unnecessary dispatches and pulling attention from real emergencies.
In December, this became a problem at ski resorts, with reports of automated crash notifications being placed by devices whose owners were not actually in an emergency. When dispatchers attempted to speak to the skiers and got no answer, ski patrollers had to be sent to the location just in case.
Apple is now looking into the situation, according to the New York Post, with a spokesperson confirming the iPhone maker is talking to 911 call centers that are seeing a spike in automated 911 calls due to the introduction of Crash Detection.
While adding that Apple is requesting feedback, the spokesperson declined to say how the feature could be tweaked to reduce instances of falsely detected car accidents.
The report reveals the impact on call centers is fairly high in areas covering ski destinations. New York's Greene County 911 center saw a 22% spike in hang-ups, open lines, and misdialed 911 calls in December 2022 versus 2021.
Jim DiPerna, 911 Communications Director for the county, reckoned there was a 15 to 25% year-on-year increase in calls "that very well could be generated by these Apple-generated and automated crash notifications."
In Pennsylvania's Carbon County Communications Center, there are up to 20 automated crash detection calls a day from local ski areas, which is described by Assistant 911 Manager Justin Markell as "taxing" for a team that is "already busy enough."
Read on AppleInsider

Comments
Well, duh. They haven’t gotten the data yet. No point in speculating on the fix until you know what the problem is.
“What also floats in water?”
Apple obviously came up with a set of criteria and algorithms that they assumed and verified would work for determining their “witch,” i.e., the occurrence of a crash. But perhaps they fell a bit short in probing the “What also floats in water?” part. Had they probed a little deeper, they could have found it necessary to provide further safeguards, processes, or refinements to help reduce the likelihood of an incorrect determination. Or maybe they did fully understand the lack of precision and decided that the human benefits of a less accurate determination far outweighed the costs in time, money, and disruption of limited first responder resources, so they went to market with what they believed was their best solution for their highest-level concern: saving people’s lives.
A measured, pragmatic approach that is biased toward saving lives is distinctly different from simply putting something out there, hoping for the best, and knowing that feedback from its performance in the field will allow you to refine it over time. I don’t think Apple ever intends for its customers to serve as crash test dummies, even though customers do end up being testers/crash test dummies to some degree anyway because Apple, like everyone, doesn’t know what they don’t know. The line of distinction really comes down to Apple’s intentions and motivations, which are cultural influences over the engineering and product development process but not part of the engineering process itself.
In my opinion, none of the false positives identified thus far in Apple’s deployed crash detection performance seem like radical departures from scenarios that would or should have been considered if Apple probed a tiny bit deeper into the “What also floats in water?” line of inquiry.
What you are describing is how engineers like to portray themselves, and how they are portrayed in the media. In the real world they try to take everything into account, test extensively, and put the best thing out there. But there are ALWAYS things they didn’t expect/didn’t understand. I worked for a couple of decades with several different teams of engineers in different industries, and of different kinds (mechanical, electrical, computer, software). No matter how carefully planned and tested a product was when it was introduced, unexpected things would come up, and they were always like Hagrid: “Huh. Shouldn’ta done that.”
The real world always has the last laugh. This is very typical of a new technology.
Mom was wise.
Someone else, or even me, in exactly the same or similar situation might have been more seriously injured. Thankfully I wasn’t.
A watch can detect a fall/crash, but cannot reasonably detect injury. The crash detect feature has been designed to err on the side of caution, as it should be.
If it leads to false alarms, that’s natural, and users should know to respond to their watches.
The fault lies with the folks who are not responding “I’m okay” to their watches. Maybe there should be fines for triggering false alarms.
Unless all ducks are witches.