Apple details headphone jack improvements on new MacBook Pro

Posted in Current Mac Hardware
The new 14-inch MacBook Pro and revised 16-inch MacBook Pro feature an improved headphone jack, which Apple says adapts to low- and high-impedance headphones without external amps.

Apple has improved the headphone jack on the new MacBook Pro


During the launch of the 14-inch and 16-inch MacBook Pro, Apple noted that the improved audio features included a revised headphone jack. Now the company has detailed what those revisions are and how they benefit users.

"Use high-impedance headphones with new MacBook Pro models," is a new support document about the improvements.

"When you connect headphones with an impedance of less than 150 ohms, the headphone jack provides up to 1.25 volts RMS," it says. "For headphones with an impedance of 150 to 1k ohms, the headphone jack delivers 3 volts RMS. This may remove the need for an external headphone amplifier."

As well as this impedance detection and adaptive voltage output, the new 3.5mm headphone jack features "a built-in digital-to-analog converter." Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."
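The behavior the support document describes amounts to a simple rule: detect the headphone load, then pick the drive voltage. Here is a minimal Python sketch of that rule using only the figures Apple published; the function name and the explicit threshold check are illustrative, since the actual detection happens inside the jack's hardware.

```python
def max_output_voltage_rms(impedance_ohms: float) -> float:
    """Maximum headphone-jack output (volts RMS) for a given load,
    per the figures in Apple's support document."""
    if impedance_ohms < 150:
        return 1.25  # low-impedance headphones: standard output
    if impedance_ohms <= 1_000:
        return 3.0   # 150 ohm to 1k ohm: boosted output, may replace an external amp
    raise ValueError("above the documented 1k-ohm range")

print(max_output_voltage_rms(32))   # 1.25 -> typical earbuds
print(max_output_voltage_rms(250))  # 3.0  -> e.g. 250-ohm studio headphones
```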


Comments

  • Reply 1 of 17
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
  • Reply 2 of 17
mike1 Posts: 3,440 member
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

  • Reply 3 of 17
B_T Posts: 4 member
    The 3.5mm headphone jack on my ageing MacBook Pro 13 includes an audio line-in. 
    I assume that is not the case here?
    Thank you in advance.
  • Reply 4 of 17
sirdir Posts: 196 member
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
  • Reply 5 of 17
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
  • Reply 6 of 17
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
  • Reply 7 of 17
MplsP Posts: 4,038 member
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
    Yes, but encoding at close to 5x the maximum frequency should be more than adequate.

How do they calculate the bit rate? Is the encoder 96 kHz total, or 96 kHz per channel? If it's 96 kHz per channel, it would be 192 kHz 'combined' between the two channels. They also don't comment on the quality of the built-in D-A converter, since that can be more important than the actual bit rate.

    If you’re someone for whom the difference between 96 kHz and 192 kHz actually matters, you’re probably better off using a digital output and your own D-A converter.
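One clarification worth adding to the question above: sample rates are conventionally quoted per channel, so "96 kHz stereo" means 96,000 samples per second on each channel, not 192 kHz combined. A quick Python sketch of how the raw PCM data rate falls out of the sample rate, bit depth, and channel count (the 24-bit depth here is an assumption for illustration):

```python
# Raw PCM data rate = sample rate x bit depth x channels.
# The sample rate is per channel, so stereo at 96 kHz is still "96 kHz",
# not 192 kHz.
sample_rate_hz = 96_000
bit_depth_bits = 24
channels = 2

data_rate = sample_rate_hz * bit_depth_bits * channels
print(f"{data_rate / 1e6:.3f} Mbit/s")  # 4.608 Mbit/s of raw PCM
```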
  • Reply 8 of 17
    No help for AirPods Max owners even via the Lightning to 3.5mm Audio Cable. My cheaper wired headphones will be able to do lossless. Oh Boy!

    Do these hardware divisions even talk to each other?
  • Reply 9 of 17
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
    Ah, thanks for the correction. I have middle-aged hearing so it's all moot for me anyway.
  • Reply 10 of 17
elijahg Posts: 2,846 member
    ...the new 3.5mm headphone jack features "a built-in digital-to-analog converter." 
    You'll find, William, that "a built-in digital-to-analog converter" is "featured" in all 3.5mm headphone jacks, right back to the Apple //gs. It wouldn't be a headphone jack without an ADC, since human ears can't interpret digital audio too well.

96kHz isn't really of note either: my 2019 iMac has 96kHz sample-rate support both for audio in and audio out - and I'm pretty sure my 2015 MBP has 96kHz too. The notable point is solely that the jack can support high-impedance headphones.

Come on guys, this is supposed to be a tech blog, and you obviously don't understand what you're writing about; this is pretty basic stuff.
  • Reply 11 of 17
elijahg Posts: 2,846 member
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
Neither of you is right. It's not the maximum frequency that can be produced, nor is it the encoding bit rate. It's the sample rate. Completely different, and entirely unrelated to the encoding bit rate. It's the number of times per second that the audio signal is sampled; "sampled" meaning a measurement or snapshot of the signal level at that exact moment is taken (or generated, in the case of audio out).

Now, where this does relate to human hearing's maximum frequency is the fact that humans can't generally hear above 20kHz. Sampling at double that rate means there will be no aliasing errors in the audio - where parts of the audio could essentially be "missed", as the samples might fall on two sides of a frequency peak. This is known as the Nyquist rate. The sound between the samples is effectively interpolated (averaged), and of course higher sample rates mean there's less averaging going on. Audiophiles claim they can hear this, but double-blind tests have shown that almost no one can actually tell the difference. And the Nyquist rate says 44kHz is plenty high enough to accurately reconstruct a 20kHz signal, proving that high sample rates are pointless.

    The bit rate is inversely related to how much of the original audio is thrown away, and how much the MP3/AAC/whatever decoder has to "guess" to reconstruct the audio.
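To make the aliasing point above concrete, here is a minimal Python sketch (an editorial illustration, not from the post): a pure tone sampled at less than twice its frequency folds down to a phantom lower tone, while at or above twice it is captured faithfully.

```python
def alias_frequency(tone_hz: float, sample_rate_hz: float) -> float:
    """Frequency actually captured when sampling a pure tone.
    Tones above the Nyquist limit (sample_rate / 2) fold back down."""
    folded = tone_hz % sample_rate_hz
    return min(folded, sample_rate_hz - folded)

print(alias_frequency(20_000, 44_100))  # 20000.0 -> at CD rate, captured correctly
print(alias_frequency(20_000, 32_000))  # 12000.0 -> under-sampled, aliases to 12 kHz
```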
  • Reply 12 of 17
    B_T said:
    The 3.5mm headphone jack on my ageing MacBook Pro 13 includes an audio line-in. 
    I assume that is not the case here?
    Thank you in advance.
    On that assumption, you couldn't use a wired headset or Apple wired EarPods. That would be surprising.
  • Reply 13 of 17
    elijahg said:
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
Neither of you is right. It's not the maximum frequency that can be produced, nor is it the encoding bit rate. It's the sample rate. Completely different, and entirely unrelated to the encoding bit rate. It's the number of times per second that the audio signal is sampled; "sampled" meaning a measurement or snapshot of the signal level at that exact moment is taken (or generated, in the case of audio out).

Now, where this does relate to human hearing's maximum frequency is the fact that humans can't generally hear above 20kHz. Sampling at double that rate means there will be no aliasing errors in the audio - where parts of the audio could essentially be "missed", as the samples might fall on two sides of a frequency peak. This is known as the Nyquist rate. The sound between the samples is effectively interpolated (averaged), and of course higher sample rates mean there's less averaging going on. Audiophiles claim they can hear this, but double-blind tests have shown that almost no one can actually tell the difference. And the Nyquist rate says 44kHz is plenty high enough to accurately reconstruct a 20kHz signal, proving that high sample rates are pointless.

    The bit rate is inversely related to how much of the original audio is thrown away, and how much the MP3/AAC/whatever decoder has to "guess" to reconstruct the audio.
Excellently written explanation 👍👍👍

Regarding audio quality and this 192 kHz sample rate: earlier this year, Apple announced immediate availability of its music catalog in 192 kHz/24-bit Hi-Res Lossless format (although limited to a subset of the catalog at first)—at no extra cost!!

What is mind-boggling is that Apple's hardware is limited to 96 kHz—why?
—my antique 20+ year old Denon AV receiver happily supports uncompressed multi-channel audio in 192 kHz, but neither my Apple TV 4K 2nd gen. nor my Mac mini M1 is able to take advantage of Apple Music's reference-class format! AFAIK, there is no technical reason for this HDMI-output buzzkill 🤒
  • Reply 14 of 17
elijahg Posts: 2,846 member
    rundhvid said:
    elijahg said:
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
Neither of you is right. It's not the maximum frequency that can be produced, nor is it the encoding bit rate. It's the sample rate. Completely different, and entirely unrelated to the encoding bit rate. It's the number of times per second that the audio signal is sampled; "sampled" meaning a measurement or snapshot of the signal level at that exact moment is taken (or generated, in the case of audio out).

Now, where this does relate to human hearing's maximum frequency is the fact that humans can't generally hear above 20kHz. Sampling at double that rate means there will be no aliasing errors in the audio - where parts of the audio could essentially be "missed", as the samples might fall on two sides of a frequency peak. This is known as the Nyquist rate. The sound between the samples is effectively interpolated (averaged), and of course higher sample rates mean there's less averaging going on. Audiophiles claim they can hear this, but double-blind tests have shown that almost no one can actually tell the difference. And the Nyquist rate says 44kHz is plenty high enough to accurately reconstruct a 20kHz signal, proving that high sample rates are pointless.

    The bit rate is inversely related to how much of the original audio is thrown away, and how much the MP3/AAC/whatever decoder has to "guess" to reconstruct the audio.
Excellently written explanation 👍👍👍

Regarding audio quality and this 192 kHz sample rate: earlier this year, Apple announced immediate availability of its music catalog in 192 kHz/24-bit Hi-Res Lossless format (although limited to a subset of the catalog at first)—at no extra cost!!

What is mind-boggling is that Apple's hardware is limited to 96 kHz—why?
—my antique 20+ year old Denon AV receiver happily supports uncompressed multi-channel audio in 192 kHz, but neither my Apple TV 4K 2nd gen. nor my Mac mini M1 is able to take advantage of Apple Music's reference-class format! AFAIK, there is no technical reason for this HDMI-output buzzkill 🤒
    Thanks! 

I assume Apple has quite rightly decided that 96kHz is worth it. A 192kHz DAC (digital-to-analog converter; I mistakenly wrote ADC in my previous post when I meant DAC) is a lot more expensive for essentially no gain. But sending a few extra bits over the internet for the few that think they can hear the difference has a negligible cost.

I forgot to mention the difference between 8/16/24-bit audio above - it is the number of discrete signal levels possible per sample. 8-bit is 256 (2^8) and sounds terrible, 16-bit is 65,536 and sounds fine. 24-bit is 16,777,216.

    Much like high sample rates though, there is no way someone can actually discern 65,000 discrete audio levels, let alone 17 million. Not being able to discern individual levels is the key to audio sounding identical to the original. 
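Putting numbers on those levels (an editorial Python sketch; the roughly 6.02 dB-per-bit dynamic-range rule of thumb is standard, though it isn't in the post):

```python
# Discrete levels per sample is 2^bits; the usual rule of thumb adds
# roughly 6.02 dB of dynamic range per bit of depth.
for bits in (8, 16, 24):
    levels = 2 ** bits
    print(f"{bits:>2}-bit: {levels:>10,} levels, ~{6.02 * bits:.0f} dB dynamic range")

#  8-bit:        256 levels, ~48 dB dynamic range
# 16-bit:     65,536 levels, ~96 dB dynamic range
# 24-bit: 16,777,216 levels, ~144 dB dynamic range
```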
  • Reply 15 of 17
The 'hi-fi' audio spec of 20 Hz to 20 kHz is there for good reason - it exceeds the human hearing range! The chances are that if you are over 20, you won't hear much at all above around 14 kHz.

The Nyquist theorem states that the digital sampling frequency must be at least twice the maximum analogue frequency to be reproduced. Because higher analogue frequencies sampled may lead to 'aliasing', i.e. lower-frequency artefacts, a low-pass filter must be used at source. Hence 44.1 kHz was chosen for the CD standard - enough to reproduce audio up to 20 kHz with a little room for (rather harsh) low-pass filters.

Higher sampling rates are used for audiophile applications 1) in order to use less harsh low-pass filters when recording, and 2) because of the timing difference between the ears. At 20 kHz, the wavelength of sound is around 1.7cm. Although you may not be able to hear 20 kHz, your brain can certainly determine a 1.7cm phase difference from a source - at all frequencies. Hence high-resolution audio is exactly that - not for the frequency range but for the timing to the ears.

24-bit is much like HDR. You need to listen to it at very high volume in a quiet room to appreciate any difference from 16-bit, just as you need a display capable of great brightness in a dim room to appreciate it over SDR. Then, no microphone is capable of a 24-bit (144dB) signal-to-noise ratio, so any audio created to take full advantage of the extra dynamic range is going to be somewhat contrived and artificial - much like HDR movies!
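Both figures in that post can be checked in a couple of lines (an editorial Python sketch; the 343 m/s speed of sound and the textbook quantization-SNR formula are assumptions, not from the post):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C

# Wavelength of a 20 kHz tone: the inter-ear phase cue mentioned above.
wavelength_cm = SPEED_OF_SOUND_M_S / 20_000 * 100
print(f"{wavelength_cm:.1f} cm")  # 1.7 cm

# Theoretical SNR of n-bit quantization: 6.02*n + 1.76 dB.
print(f"{6.02 * 24 + 1.76:.0f} dB")  # 146 dB; the quoted 144 dB is the 6 dB/bit shorthand
```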
  • Reply 16 of 17
1983 Posts: 1,225 member
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
Actually, only children can hear up to 20kHz in most cases. Adults can usually only hear up to about 14kHz, I believe.
  • Reply 17 of 17
spheric Posts: 2,703 member
    rundhvid said:
    elijahg said:
    sirdir said:
    mike1 said:
    rundhvid said:
    Apple says this supports up to 96kHz, and means users "can enjoy high-fidelity, full-resolution audio."

—except Apple's own Hi-Res Lossless in 192 kHz 👀🤭
Soooo???? You're saying they therefore shouldn't have improved it at all then?

    Probably that you can't call something 'full resolution' if you yourself deliver a much higher resolution. 
The average human can detect sound in the 20 Hz to 20 kHz range. 96 kHz is way outside the range of human hearing.
96 kHz is not the upper frequency response - it is the encoding bit rate - higher = better resolution. But 192 kHz is more than necessary to do the job; that's audiophiles for you.
Neither of you is right. It's not the maximum frequency that can be produced, nor is it the encoding bit rate. It's the sample rate. Completely different, and entirely unrelated to the encoding bit rate. It's the number of times per second that the audio signal is sampled; "sampled" meaning a measurement or snapshot of the signal level at that exact moment is taken (or generated, in the case of audio out).

Now, where this does relate to human hearing's maximum frequency is the fact that humans can't generally hear above 20kHz. Sampling at double that rate means there will be no aliasing errors in the audio - where parts of the audio could essentially be "missed", as the samples might fall on two sides of a frequency peak. This is known as the Nyquist rate. The sound between the samples is effectively interpolated (averaged), and of course higher sample rates mean there's less averaging going on. Audiophiles claim they can hear this, but double-blind tests have shown that almost no one can actually tell the difference. And the Nyquist rate says 44kHz is plenty high enough to accurately reconstruct a 20kHz signal, proving that high sample rates are pointless.

    The bit rate is inversely related to how much of the original audio is thrown away, and how much the MP3/AAC/whatever decoder has to "guess" to reconstruct the audio.
Excellently written explanation 👍👍👍

Regarding audio quality and this 192 kHz sample rate: earlier this year, Apple announced immediate availability of its music catalog in 192 kHz/24-bit Hi-Res Lossless format (although limited to a subset of the catalog at first)—at no extra cost!!

What is mind-boggling is that Apple's hardware is limited to 96 kHz—why?
—my antique 20+ year old Denon AV receiver happily supports uncompressed multi-channel audio in 192 kHz, but neither my Apple TV 4K 2nd gen. nor my Mac mini M1 is able to take advantage of Apple Music's reference-class format! AFAIK, there is no technical reason for this HDMI-output buzzkill 🤒
    As an audio engineer (among other things):

    There is no practical benefit whatsoever from going above 96kHz. That sampling rate can accurately reproduce any signal up to 48 kHz. 

    Only some percussion produces audio up to that range, and virtually no microphones go beyond about 20 kHz. 

    Our hearing tops out at around 16 kHz at birth, dropping dramatically from there over time. 

    In fact, unless you are talking about super high-end converters, going to extremely high sample rates is likely to introduce ADDITIONAL intermodulation distortion that can affect the high frequencies. All double-blind tests where people have been able to distinguish super-high sample rates were cases where people were actually hearing distortion. 

    In the case of a built-in off-the-shelf $1.50 (if that) logic board D/A converter, going to 192kHz is almost *certainly* going to sound worse than staying with the already-overkill 96 kHz. 

    Even ignoring the fact that absolutely NOBODY with playback equipment capable of reproducing that resolution is going to be playing it from a 3.5mm minijack output on his laptop. 

    That’s just ridiculous. 