Mac with T2 Security Chip required to play 4K Netflix streams in macOS Big Sur

Posted in macOS, edited September 2020
Those looking forward to streaming Netflix content in 4K HDR with the upcoming macOS Big Sur operating system will need a Mac with Apple's T2 Security Chip, limiting the feature to recently released hardware.

Netflix recently updated a Help Center webpage with new guidance on viewing 4K HDR content in Safari on macOS Big Sur, noting both operating system and hardware requirements, reports Apple Terminal.

According to the support document, only "[s]elect 2018 or later Mac computer[s] with an Apple T2 Security chip" are compatible with Ultra HD streaming. Further, all external monitors must support 4K at 60Hz and use an HDCP 2.2-compliant connection.

Netflix does not explain why Macs need a T2 chip to play back 4K HDR streams when comparable Windows machines do not. The chip integrates a number of critical controllers, including the system's image signal processor and audio controller, and has been shown to lend a moderate boost to video encoding. It is possible that Netflix simply added the T2 requirement to ensure subscribers are using modern Macs with current graphics components.
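As an illustration of how such gating could work in practice (this is speculation, not anything Netflix or Apple documents), a streaming web app in Safari could refuse 4K unless a hardware-backed key system and an HDCP 2.2 output path are reported as available. The FairPlay key-system identifier, init data type, and codec string below are illustrative placeholders, and Netflix's real checks are not public.

```typescript
// Hedged sketch: one way a site could gate Ultra HD playback on DRM support.
// The key-system string, init data type, and HEVC codec string are assumptions.
async function canGateUltraHd(): Promise<boolean> {
  const config: MediaKeySystemConfiguration[] = [{
    initDataTypes: ["sinf"], // FairPlay-style init data (assumption)
    videoCapabilities: [{
      contentType: 'video/mp4; codecs="hvc1.1.6.L150.B0"', // illustrative HEVC string
    }],
  }];

  try {
    // Ask the browser for access to Apple's FairPlay key system (identifier assumed).
    const access = await navigator.requestMediaKeySystemAccess("com.apple.fps.1_0", config);
    const mediaKeys = await access.createMediaKeys();

    // Optional EME extension: check whether the output path satisfies HDCP 2.2.
    // Not implemented everywhere, so a missing method is treated as "unknown".
    const getStatus = (mediaKeys as any).getStatusForPolicy?.bind(mediaKeys);
    if (getStatus) {
      const status: MediaKeyStatus = await getStatus({ minHdcpVersion: "2.2" });
      return status === "usable";
    }
    return true; // key system available, HDCP status unknown
  } catch {
    return false; // no suitable DRM pipeline; a real player would fall back to 1080p
  }
}
```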

Apple, in a support document of its own, notes that HDR video playback is limited to 2018 or later MacBook Pro models, 2018 or later MacBook Air models, the 2020 iMac, the iMac Pro, the 2018 Mac mini, and the Mac Pro, with 4K restricted to the iMac and iMac Pro variants. T2 chip requirements are not mentioned.

Safari's forthcoming compatibility with Netflix Ultra HD content in Big Sur was first spotted in June, early in the beta testing period. At the time, testers discovered that the browser enables 4K video streams encoded with the HEVC codec.
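The browser's HEVC support can be checked from a web page in a couple of lines. A minimal sketch, using a generic HEVC codec string rather than Netflix's exact stream parameters:

```typescript
// Does this browser report HEVC support, both for Media Source Extensions
// and for a plain <video> element? The codec string is an illustrative example.
const hevcType = 'video/mp4; codecs="hvc1.1.6.L150.B0"';

const mseSupportsHevc =
  typeof MediaSource !== "undefined" && MediaSource.isTypeSupported(hevcType);

const probe = document.createElement("video");
const elementSupportsHevc = probe.canPlayType(hevcType) !== "";

console.log({ mseSupportsHevc, elementSupportsHevc });
```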

Apple is expected to debut macOS Big Sur in the coming weeks.

Comments

  • Reply 1 of 28
    The T2 chip is more than just a security chip. It incorporates everything from the System Management Controller (SMC), the SSD controller with hardware encryption and Secure Boot for system storage, the image signal processor (ISP), and the audio controller to the Secure Enclave.

  • Reply 2 of 28
    What about eschewing Safari and just using Chrome? I haven't tried Netflix 4K on macOS, but YouTube works at 4K in Chrome on Catalina, where Safari doesn't support 4K.
  • Reply 3 of 28
    chasm Posts: 3,296, member
    I note that this is specifically about 4K *HDR* content, so possibly the T2 is handling the image processing on that -- or, as the article posits, Netflix is over-generalizing to ensure that 2018 and later Macs are used when streaming Netflix to those devices.

    Just my own opinion, but I spend plenty of time on my Mac and would rather not continue to sit there to watch a film. I can access services like Netflix on an iPad, on my Apple TV (or other streaming box, if I had one), and on every smart TV model I'm aware of, Netflix is already built in directly. A computer screen is the second-least desirable screen I would use for movie-watching (the first being a smartphone, though they'll do if you're stuck on a flight in economy class or some such).
  • Reply 4 of 28
    F_Kent_D Posts: 98, unconfirmed member
    chasm said:
    I note that this is specifically about 4K *HDR* content, so possibly the T2 is handling the image processing on that -- or, as the article posits, Netflix is over-generalizing to ensure that 2018 and later Macs are used when streaming Netflix to those devices.

    Just my own opinion, but I spend plenty of time on my Mac and would rather not continue to sit there to watch a film. I can access services like Netflix on an iPad, on my Apple TV (or other streaming box, if I had one), and on every smart TV model I'm aware of, Netflix is already built in directly. A computer screen is the second-least desirable screen I would use for movie-watching (the first being a smartphone, though they'll do if you're stuck on a flight in economy class or some such).
    I may be completely incorrect, but I believe Apple had to come up with something (a chip) to handle heavy-hitting processes like these because of Intel's failures in chip development: its processors can't process said 4K content without totally overheating and cooking the computer and all its other components, or throttling. Hence the whole reason Apple is in the process of switching to its own silicon; Intel cannot handle all of it on its own. My opinion, that's all.
  • Reply 5 of 28
    Stupid things like this are the reason why dumb people make fun of Mac and iOS.
  • Reply 6 of 28
    Wonderful. I am sure people will still continue to rip 4K video streams on computers of their choice.
  • Reply 7 of 28
    I would reckon it has something to do with DRM if it involves the T2 chip. Worst chip, ever.
    I wish Apple would get rid of it.  
  • Reply 8 of 28
    sflocal Posts: 6,093, member
    I suppose it's an issue for someone who uses their Mac as their sole streaming device. I prefer to watch on a regular flatscreen TV (via Apple TV) from a comfy sofa rather than on an iMac or MacBook. When I do watch Netflix on my iMac, I always watch it in a window on my second monitor while doing work on the other, mainly for background noise.

    I guess some don't?
  • Reply 9 of 28
    viclauyyc said:
    Stupid things like this are the reason why dumb people make fun of Mac and iOS.
    Just because you can do something on Windows doesn't mean you will do it well. In my experience from decades of using Windows, it can do almost everything, but only a small subset of those things is done well. This is why people who use Windows eventually get frustrated and turn to Mac (if they have the money and aren't gamers).
  • Reply 10 of 28
    I would reckon it has something to do with DRM if it involves the T2 chip.  
    My guess is this also, especially since they also require HDCP-compliant cables and displays.

    Possibly they are trying to prevent content ripping and (in my opinion) are using some data from the T2 chip to encode tracing information into the displayed content. This could then be detected and the person charged.
  • Reply 11 of 28
    Just another reason not to sign up to Netflix (or any other streaming service for that matter). I do have a T2 equipped Mac but like some others have said, I spend far too long in front of it each day (yes, writing novels is work) to want to watch stuff on it when I have a 50in Sony sitting idle. FreeSat (UK) gives me 100+ channels of free TV (no subscriptions) and there is plenty on my HDR to catch up on.
    This Grumpy will carry on refusing subscriptions as long as possible.

  • Reply 12 of 28
    Apple's homegrown T2, which incorporates the System Management Controller (SMC) among other things, is also a hardware video decoder. But Intel's CPUs and AMD's GPUs are hardware video decoders too.
  • Reply 13 of 28
    Dan_Dilger Posts: 1,583, member
    The T2 chip includes Apple's hardware-accelerated video encoder. If Netflix were to "just use Chrome," it could perhaps "do" 4K video in software, but it would be lower quality and cause a major implosion of battery life while the fans ran at full tilt as you're trying to watch Netflix (and it would force users to run Chrome).

    Capture a 4K video on your iOS device and then try to play that on a pre-T2 Mac and it will be readily obvious. 

    As we have been noting for some time, T2 was the first step towards Apple Silicon. 
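
For what it's worth, a page can query roughly this: the Media Capabilities API reports whether a given stream would decode power-efficiently, i.e. in hardware rather than software. A rough sketch, with placeholder stream parameters rather than Netflix's real ones:

```typescript
// Ask whether a 4K HEVC stream would decode "power efficiently" (i.e., likely
// in hardware). Codec string, bitrate, and framerate are placeholder values.
async function hevc4kDecodesInHardware(): Promise<boolean> {
  const info = await navigator.mediaCapabilities.decodingInfo({
    type: "media-source",
    video: {
      contentType: 'video/mp4; codecs="hvc1.2.4.L153.B0"', // HEVC Main 10 style (assumption)
      width: 3840,
      height: 2160,
      bitrate: 16_000_000, // roughly 16 Mbps (assumption)
      framerate: 24,
    },
  });
  // powerEfficient is the spec's hint that decoding won't be done purely in software.
  return info.supported && info.powerEfficient;
}
```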
  • Reply 14 of 28
    Why would 'HDR' be left out of the headline for this article?
  • Reply 15 of 28
    MacPro Posts: 19,727, member
    The T2 chip includes Apple's hardware-accelerated video encoder. If Netflix were to "just use Chrome," it could perhaps "do" 4K video in software, but it would be lower quality and cause a major implosion of battery life while the fans ran at full tilt as you're trying to watch Netflix (and it would force users to run Chrome).

    Capture a 4K video on your iOS device and then try to play that on a pre-T2 Mac and it will be readily obvious. 

    As we have been noting for some time, T2 was the first step towards Apple Silicon. 
    Microsoft's Edge browser (I wouldn't let Chrome near my Macs) handles 4K on YouTube on a non-T2 27" i9 5K iMac where Safari is limited to 2K, so I wonder if it might be a Safari issue. After all, most of the things T2 does are handled elsewhere on a non-T2 modern Mac.
  • Reply 16 of 28
    F_Kent_D said:
    chasm said:
    I note that this is specifically about 4K *HDR* content, so possibly the T2 is handling the image processing on that -- or, as the article posits, Netflix is over-generalizing to ensure that 2018 and later Macs are used when streaming Netflix to those devices.

    Just my own opinion, but I spend plenty of time on my Mac and would rather not continue to sit there to watch a film. I can access services like Netflix on an iPad, on my Apple TV (or other streaming box, if I had one), and on every smart TV model I'm aware of, Netflix is already built in directly. A computer screen is the second-least desirable screen I would use for movie-watching (the first being a smartphone, though they'll do if you're stuck on a flight in economy class or some such).
    I may be completely incorrect, but I believe Apple had to come up with something (a chip) to handle heavy-hitting processes like these because of Intel's failures in chip development: its processors can't process said 4K content without totally overheating and cooking the computer and all its other components, or throttling. Hence the whole reason Apple is in the process of switching to its own silicon; Intel cannot handle all of it on its own. My opinion, that's all.
    Intel machines running the Windows OS handle Netflix’s 4K HDR content, so it’s not specifically an Intel issue. 
  • Reply 17 of 28
    I would reckon it has something to do with DRM if it involves the T2 chip.  
    My guess is this also, especially since they also require HDCP-compliant cables and displays.

    Possibly they are trying to prevent content ripping and (in my opinion) are using some data from the T2 chip to encode tracing information into the displayed content. This could then be detected and the person charged.
    Nope. Windows PCs display Netflix 4K HDR content and do not have a T2 chip inside them. 
  • Reply 18 of 28
    I would reckon it has something to do with DRM if it involves the T2 chip. Worst chip, ever.
    I wish Apple would get rid of it.  
    Windows PCs display Netflix 4K HDR content without a T2 chip. 
  • Reply 19 of 28
    I would reckon it has something to do with DRM if it involves the T2 chip. Worst chip, ever.
    I wish Apple would get rid of it.  
    Fascinating.  Do you have a top 10 list of worst chips ever?  How did the Pentium with math errors do?
  • Reply 20 of 28
    Mike Wuerthele Posts: 6,861, administrator
    This is 100% forward thinking, and about Apple Silicon, as Apple Silicon won't have the assorted TPM modules that the Intel chips do.