iOS 16.1 won't bring Adaptive Transparency to old AirPods Pro

Posted in iOS
Apple's Adaptive Transparency feature of the AirPods Pro 2 won't be retrofitted into the original model, with the inclusion of the feature in an iOS 16.1 beta said to be a mistake.

Apple's third beta of iOS 16.1 unexpectedly included an option within the Settings app that appeared to enable Adaptive Transparency on the first-generation AirPods Pro. However, despite appearing in the build of the operating system, the toggle did not actually function for users.

While it gave some hope that the feature would eventually work on the older personal audio accessories, Bloomberg's Mark Gurman posted on Twitter that his sources say "this is a bug." In clarifying comments on Monday, Gurman added "It's not meant to work," and that the option was probably only accessible by mistake.

Adaptive Transparency for the second-generation AirPods Pro is an enhancement of the original Transparency mode, which lets some environmental sound through while reducing the volume of droning and other ongoing noises. Adaptive Transparency adds the automatic quieting of sudden loud noises, while keeping the listener aware of their surroundings.

Apple has not mentioned the possibility of Adaptive Transparency being made available on older hardware, and it is unlikely to make the transition due to the component changes in the newest models.


Comments

  • Reply 1 of 5
JP234 Posts: 550 Member
    If this feature actually works as hyped, it would be a game changer for older people (like me) who don't need Rx hearing aids yet, but have some difficulty distinguishing conversation from ambient noise, and find loud noises actually uncomfortable.
  • Reply 2 of 5
elijahg Posts: 2,652 Member

    Apple has not mentioned the possibility of Adaptive Transparency being made available on older hardware, and it is unlikely to make the transition due to an artificial software limitation.
    FTFY.

    When Apple discontinues hardware, it often adds more features to it. The original HomePod had a few new software features added after it was discontinued, as did the AirPort routers. Apple often claims "new hardware" is needed for a feature when it's clearly a software lock. Stage Manager is another feature that was apparently not possible on older iPads, and suddenly now is.
    edited October 3
  • Reply 3 of 5
    I tested the iOS 16 beta this summer, and my Beats Fit Pro, which are essentially the same as the original AirPods Pro, blocked the sound of hammering on things in Transparency mode. Is this basically more of that?
  • Reply 4 of 5
chasm Posts: 2,579 Member
    JP234 said:
    If this feature actually works as hyped, it would be a game changer for older people (like me) who don't need Rx hearing aids yet, but have some difficulty distinguishing conversation from ambient noise, and find loud noises actually uncomfortable.
    Just in case you weren’t aware, there are two existing features specifically for people like you. Conversation Boost directs the focus of AirPods Pro to the person in front of you via beamforming (no music needs to be playing). There is also a feature called Live Listen, where AirPods Pro (at least, maybe the other models) can use the iPhone’s microphone to aid people with mild hearing loss. For example, in a meeting, just put the iPhone on the table closer to the person speaking (or the TV, in another use case), and the sound is relayed to your earbuds via the iPhone’s mic.

    https://appleinsider.com/articles/21/10/06/how-to-use-conversation-boost-with-airpods-pro

    https://support.apple.com/en-us/HT209082
    edited October 4
  • Reply 5 of 5
JP234 Posts: 550 Member
    chasm said:
    JP234 said:
    If this feature actually works as hyped, it would be a game changer for older people (like me) who don't need Rx hearing aids yet, but have some difficulty distinguishing conversation from ambient noise, and find loud noises actually uncomfortable.
    Just in case you weren’t aware, there are two existing features specifically for people like you. Conversation Boost directs the focus of AirPods Pro to the person in front of you via beam forming (no music needs to be playing). There is also a feature called Live Listen where AirPods Pro (at least, maybe the other models) can use the iPhone’s microphone to aid people with mild hearing loss. For example, in a meeting just put the iPhone on the table closer to the person speaking (or the TV, in another use case), and the sound is relayed to your earbuds via the iPhone’s mic.

    I'm well aware of both, and have used both. But neither offers the sophisticated algorithms necessary to effectively discriminate ambient noise from voice. At this point, I've been using my Sony WH-1000XM4 headphones even in noisy public areas like restaurants, airports, concerts, and such. The noise-cancelling tech is unmatched by any current Apple product, and it can be configured to prioritize voice in software, working with multiple microphones. It's like being 25 years old again. I use them when watching TV at home, and on airplanes (they completely cancel the drone of the engines. Try that with AirPods!).

    I'm sure the limitations of in-ear vs over-the-ear hardware are the reason for the less than insanely great features of Conversation Boost and Live Listen.