Apple's fix for shaking iPhone 14 Pro cameras arriving next week
Apple will issue an iOS 16 update to fix a problem with the iPhone 14 Pro cameras that causes them to shake when users record with a third-party camera app.
On Sunday, reports circulated about an issue with the iPhone 14 Pro and iPhone 14 Pro Max cameras in which third-party apps using the camera would produce blurry, shaky footage. Apple now says it plans to address the issue.
According to a tweet from Bloomberg's Mark Gurman, Apple said it had identified a problem affecting the Pro models and that a "software update will be released next week." Apple acknowledged that the issue impacted third-party apps that use the camera.
When using some third-party apps, including Instagram, Snapchat, and TikTok, some users found that the camera module started to shake while recording video. The issue manifested in hardware, with visible movement and audible motor noise alerting users that something was going wrong with their smartphones.
Since the standard camera app wasn't affected by the issue, it appears the jitters can be solved in software, or at least by a change that applies to all apps that use the camera module and not just Apple's own.
While there is no official explanation for the shaking, the vibration and motor sounds suggest it is related to the built-in optical image stabilization systems. For the new 48-megapixel Main camera, Apple uses a second-generation Sensor Shift OIS, which moves the sensor around inside the camera bump.
Comments
All software development organizations have invested very heavily in testing over the past several years. They know that they need to increase test coverage and avoid releasing bugs into the field, not just for maintaining their reputation for quality, but because it's so damn expensive to fix bugs that are in the field. One of the techniques they've used to improve testing is to test early and test often - mostly using automated test tools that are less expensive and less manpower intensive.
Unfortunately, there are some features/functions that are not easy to test using automated test tools and techniques, notably those that require a human tester to manually interact with the system under test, usually in an in-lab test facility, and system-level tests. This camera OIS bug falls into both categories. In addition to a test engineer developing the test cases, a tester has to install at least one of the third-party apps on the test system and use it in a scenario that involves the OIS feature. Part of the test scenario would involve making sure the feature activates when it should and does not activate when it shouldn't.
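As a rough illustration, here is a minimal XCTest sketch of the software-visible half of such a test case: it builds an AVFoundation capture pipeline the way a typical third-party app might and checks that video stabilization is supported and actually engages. This is not Apple's test suite, OIS itself isn't directly exposed to apps, and the physical shaking would still need a human or an instrumented fixture to observe; treat it as an assumed shape of the check, not the real thing.

```swift
import XCTest
import AVFoundation

// Hypothetical on-device test case (not Apple's actual suite). It configures
// a capture session the way many third-party camera apps do and checks the
// stabilization state that is visible to software. Requires camera access on
// a real device; names here are illustrative only.
final class CameraStabilizationTests: XCTestCase {

    func testStabilizationEngagesForThirdPartyStylePipeline() throws {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else {
            throw XCTSkip("No back wide-angle camera on this device")
        }

        let session = AVCaptureSession()
        session.beginConfiguration()

        let input = try AVCaptureDeviceInput(device: camera)
        XCTAssertTrue(session.canAddInput(input))
        session.addInput(input)

        let output = AVCaptureMovieFileOutput()
        XCTAssertTrue(session.canAddOutput(output))
        session.addOutput(output)

        session.commitConfiguration()

        // Request stabilization the way many apps do: let the system decide.
        let connection = try XCTUnwrap(output.connection(with: .video))
        XCTAssertTrue(connection.isVideoStabilizationSupported)
        connection.preferredVideoStabilizationMode = .auto

        session.startRunning()
        defer { session.stopRunning() }

        // "Activates when it should": once the session is running, the active
        // mode should resolve to something other than .off. The physical OIS
        // behavior (no visible shaking) still needs a human or a fixture.
        XCTAssertNotEqual(connection.activeVideoStabilizationMode, .off)
    }
}
```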
Could Apple find an automated way to run tests that seem to require manual, hands-on human interaction? Sure. They could, for example, use a test fixture with a shaker table and an automated test script that runs through the various app combinations with OIS. Developing complex test fixtures is usually not cheap and has to be weighed against the cost of doing it manually and the frequency of use. Sometimes it makes sense to invest in test fixtures. Sometimes it doesn't.
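For that fixture-driven approach, the phone-side half could be as simple as a UI-automation sweep that walks each installed third-party camera app through a recording window while the shaker table and external sensors do the measuring. The sketch below assumes XCUITest and uses placeholder bundle identifiers; the fixture coordination is only hinted at in comments.

```swift
import XCTest
import Foundation

// Sketch of the on-device half of a fixture-driven sweep. The vibration
// measurement itself happens off-device (shaker table, accelerometers, or a
// high-speed camera watching the lens); this test just drives each app into
// a state where its capture session is live. Bundle IDs are placeholders.
final class ThirdPartyCameraSweep: XCTestCase {

    // Placeholder identifiers for whichever camera-using apps the lab installs.
    private let cameraApps = [
        "com.example.social-video-app",
        "com.example.photo-sharing-app",
        "com.example.messaging-camera-app",
    ]

    func testSweepThirdPartyCameraApps() {
        for bundleID in cameraApps {
            let app = XCUIApplication(bundleIdentifier: bundleID)
            app.launch()
            XCTAssertTrue(app.wait(for: .runningForeground, timeout: 10),
                          "\(bundleID) never reached the foreground")

            // Hold here while the external fixture logs vibration data for
            // this app; a real harness would signal the fixture rather than
            // sleep on a fixed timer.
            Thread.sleep(forTimeInterval: 20)

            app.terminate()
        }
    }
}
```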
Automated testing can make the overall test coverage numbers look really good and is especially good for regression testing, i.e., making sure that things that were working normally before changes were made are still working normally after the changes were made. But anything that requires manual testing or in-lab testing, involves new features or system-level interactions, or is only exposed to the public and the majority of third-party developers at a new product announcement only days before release is still high risk and should receive a higher level of scrutiny, regardless of the overall test coverage. Ideally, the resources freed up by investing in automated testing should be redeployed to address those high-risk areas of concern, but this is not always feasible.
In summary, this particular bug getting out into the field is not really a big surprise from a risk or probability standpoint. At least Apple can fix it quickly.