Apple TV 4K won't play 4K YouTube videos because of missing Google codec

124 Comments

  • Reply 61 of 86
    While VP9 may be the more open and less expensive standard for next-generation video encoding, it's probably going to lose the battle to HEVC.  HEVC will be the video standard for ATSC 3.0 (in addition to the Ultra HD Blu-ray mentioned above), which will bring 4K video to over-the-air viewers in the US as well as cable subscribers.  In addition to the hardware decoding of HEVC that you're going to see in ATSC 3.0-compliant televisions and set-top boxes, you're also not going to see networks and studios encode their data in multiple formats just for the limited use cases of VP9.

    I'm sure for a while they will compete, just like most dueling standards do, but in the end HEVC will win out, just because it's already part of the standard for next-generation video transmission and video discs.

    HEVC and VP9 are both standards.  You really can't blame them for choosing a standard that's being pushed by a number of their competitors.  I've also read a couple of articles that are a little over my head stating that HEVC is technically superior to VP9 in a couple of different ways, though I have no way to argue for or against that point with my limited knowledge and lack of an attributable source.

    If there were real major advantages to VP9 over HEVC besides licensing terms, I think Apple would have jumped on board with it despite the fact that Google is the primary sponsor of the standard.
  • Reply 62 of 86
    alandail said:
    gatorguy said:
    alandail said:
    gatorguy said:
    alandail said:
    netrox said:
    That is incredibly stupid of Apple to refuse to support free open standards.
    Apple supports the industry standard - H.265, and has hardware acceleration to support this standard since the A9 chip
    VP9 is not a standard.

    Google is the one refusing to support the industry standard, not Apple.
    It's not as tho VP9 doesn't have broad-based industry support. ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba and probably a few that missed mention are all supporting partners and VP9 is integrated with Chrome, Firefox, Opera and Edge browsers.  What would have been VP10 has now been contributed to the AV1 project being developed under the auspices of the Alliance for Open Media. That has some of the biggest players in mobile media on board including Adobe, Microsoft, Intel, Netflix, Hulu, the BBC, and Amazon, in addition to Google.

    This whole attempt to portray Google as a lone wolf supporter of open media codecs and an industry outlier is very obviously FUD meant to serve some purpose other than factual. MPEG has worn out their welcome with much of the tech community. With Apple all in with HEVC Advance (which requires licensing from no less than 4 patent pools if I read right) I can see another Betamax vs. VHS on the horizon. With that said I do support Apple's play to replace JPEG with HEIF as long as they include support for AV1.

    EDIT: For information on AV1, the membership and the goals, see here:

    The post I replied to criticized Apple for not implementing the standard. My reply simply said Apple did support the standard.  No FUD, just a statement of fact. 

    Google is the one trying to undermine the standard with their lack of support. 
    Well to be more accurate that would be Google. And Intel. And Microsoft. And Netflix. And AMD. And Hulu. And Adobe. And Samsung. And Nvidia. And Sony. And Firefox. And the BBC. And Amazon.  And Intel. And...
    A good part of that list is required to support VP9 because of Google undermining the industry standard.  Android devices are required to support VP9, and Chromecast Ultra doesn't support H.265.  So anyone building Android devices, and anyone building content that streams to Chromecast Ultra, has no choice but to support VP9 in addition to their support of the industry standard H.265, which is also the video format for UHD Blu-ray. 

    Again, it's Google, not Apple, who fails to support the industry standard.
    For the most part, this is because these companies don't want to pay 60-100 million a year to the HEVC licensing pools. 

    How much does it cost to use 20% more bandwidth for streaming videos?  From Netflix's own testing, H.265 is 20% more efficient than VP9.  And that was without using the best H.265 encoder.
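
    To put rough numbers on that trade-off (a back-of-the-envelope sketch; the streaming volume and CDN price below are made-up placeholders, and only the 20% figure comes from the Netflix comparison above):

        # Hypothetical: extra CDN spend if VP9 streams are ~20% larger than HEVC streams.
        PETABYTES_PER_MONTH = 1000   # assumed monthly 4K streaming volume
        CDN_COST_PER_GB = 0.01       # assumed blended CDN cost, dollars per GB
        VP9_OVERHEAD = 0.20          # "H.265 is 20% more efficient than VP9"

        gb_per_month = PETABYTES_PER_MONTH * 1_000_000
        extra_cost_per_year = gb_per_month * VP9_OVERHEAD * CDN_COST_PER_GB * 12
        print(f"${extra_cost_per_year:,.0f} per year")  # ~$24,000,000 with these inputs

    With placeholder inputs like these, the extra bandwidth lands in the same ballpark as the 60-100 million a year quoted above for the HEVC pools.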
  • Reply 63 of 86
    KBChicago said:
    tbsteph said:
    Who died and made GOOG the final arbiter of all that is video?
    Google.  When they bought one of the most popular video services--Youtube.
    Arguably, Netflix now carries more video traffic than Google (mostly because the quality is better, so the videos are bigger).
  • Reply 64 of 86
    gatorguy said:
    alandail said:
    netrox said:
    That is incredibly stupid of Apple to refuse to support free open standards.
    Apple supports the industry standard - H.265, and has hardware acceleration to support this standard since the A9 chip
    VP9 is not a standard.

    Google is the one refusing to support the industry standard, not Apple.
    It's not as tho VP9 doesn't have broad-based industry support. ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba and probably a few that missed mention are all supporting partners and VP9 is integrated with Chrome, Firefox, Opera and Edge browsers.  What would have been VP10 has now been contributed to the AV1 project being developed under the auspices of the Alliance for Open Media. That has some of the biggest players in media streaming on board including Microsoft, Intel, Netflix, and Amazon in addition to Google.

    This whole attempt to portray Google as a lone wolf supporter of open media codecs and an industry outlier is very obviously FUD meant to serve some purpose other than factual. MPEG has worn out their welcome with much of the tech community. With Apple all in with HEVC Advance I can see another Betamax vs. VHS on the horizon.
    Standard and support are not the same thing. When you comply with a standard, that means your content will be encoded and decoded on every certified hardware and software platform. Support is conjunctural, loose and uncontrolled. You will never know to what extent the supporting brands you mention have implemented that codec, or to what extent they support it. Intel discloses it, but the others may never disclose it. That means some media will play and some won’t; this is just “open” wilderness, as is the case with every “open” codec.
    That's not true - VP9 is royalty-free but the development of the codec has been tightly controlled by Google. 
    So, Google is a “standard”... OK I see your point.
    You are right - VP9 isn't a standard in the sense that it's not coordinated by the ITU.

    But it has a clearly defined bitstream and has been implemented by Intel, AMD and multiple ARM vendors - most new TVs have it, too. Moreover, the source code is open and on GitHub - so anyone can use it to write their own encoder and decoder. Looks like a lot of buy-in to me.
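
    For instance, encoding to VP9 with the open tooling is trivially scriptable (a minimal sketch; it assumes an ffmpeg build with libvpx available on the PATH, and the file names are just placeholders):

        # Transcode a clip to VP9/Opus in a WebM container using the open libvpx encoder via ffmpeg.
        # Assumes ffmpeg was built with libvpx support; the input/output names are hypothetical.
        import subprocess

        subprocess.run([
            "ffmpeg", "-i", "input_4k.mp4",
            "-c:v", "libvpx-vp9",        # VP9 via the open-source libvpx encoder
            "-b:v", "0", "-crf", "31",   # constant-quality mode
            "-c:a", "libopus",           # Opus audio, the usual WebM pairing
            "output_4k.webm",
        ], check=True)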

    Apple also guides the development of Swift while at the same time having open-sourced it in 2015 to give it more credibility as a programming language that can be taught at universities, etc. 

    Vice versa, a standard doesn't imply that different implementations necessarily work the same. The Microsoft Word format is an official ECMA standard but it has been extremely difficult to write viewers that exactly reproduce MS Word output. 

    VP9 and Swift are more de-facto standards than the Office XML format because Google and Apple have a deep interest in steering these projects in a way that guarantees buy-in from third parties. AV1 has even more broad-based support, with Intel, AMD, Netflix, Amazon and Microsoft all on board.
    yeh, "open" like Android I suppose...
  • Reply 65 of 86
    gatorguy said:
    alandail said:
    netrox said:
    That is incredibly stupid of Apple to refuse to support free open standards.
    Apple supports the industry standard - H.265, and has hardware acceleration to support this standard since the A9 chip
    VP9 is not a standard.

    Google is the one refusing to support the industry standard, not Apple.
    It's not as tho VP9 doesn't have broad-based industry support. ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba and probably a few that missed mention are all supporting partners and VP9 is integrated with Chrome, Firefox, Opera and Edge browsers.  What would have been VP10 has now been contributed to the AV1 project being developed under the auspices of the Alliance for Open Media. That has some of the biggest players in media streaming on board including Microsoft, Intel, Netflix, and Amazon in addition to Google.

    This whole attempt to portray Google as a lone wolf supporter of open media codecs and an industry outlier is very obviously FUD meant to serve some purpose other than factual. MPEG has worn out their welcome with much of the tech community. With Apple all in with HEVC Advance I can see another Betamax vs. VHS on the horizon.
    Standard and support are not the same thing. When you comply with a standard, that means your content will be encoded and decoded on every certified hardware and software platform. Support is conjunctural, loose and uncontrolled. You will never know to what extent the supporting brands you mention have implemented that codec, or to what extent they support it. Intel discloses it, but the others may never disclose it. That means some media will play and some won’t; this is just “open” wilderness, as is the case with every “open” codec.
    That's not true - VP9 is royalty-free but the development of the codec has been tightly controlled by Google. 
    So, Google is a “standard”... OK I see your point.
    You are right - VP9 isn't a standard in the sense that it's not coordinated by the ITU.

    But it has a clearly defined bitstream and has been implemented by Intel, AMD and multiple ARM vendors - most new TVs have it, too. Moreover, the source code is open and on GitHub - so anyone can use it to write their own encoder and decoder. Looks like a lot of buy-in to me.

    Apple also guides the development of Swift while at the same time having open-sourced it in 2015 to give it more credibility as a programming language that can be taught at universities, etc. 

    Vice versa, a standard doesn't imply that different implementations necessarily work the same. The Microsoft Word format is an official ECMA standard but it has been extremely difficult to write viewers that exactly reproduce MS Word output. 

    VP9 and Swift are more de-facto standards than the Office XML format because Google and Apple have a deep interest in steering these projects in a way that guarantees buy-in from third parties. AV1 has even more broad-based support, with Intel, AMD, Netflix, Amazon and Microsoft all on board.
    It seems we’re mixing up concepts here. A standard is a set of definitions, rules, protocols, and algorithms/formulas. A standard precedes implementation. “Open source” relates to source, i.e. implementation. Your open-source project may choose to comply with a standard or not. The “openness” of the code doesn’t make it a “standard”. Whether a standard tolerates differences in implementations, and the extent of that tolerance, is already defined in the standard itself. Differences that don’t fit within those definitions/tolerances will simply prevent certification. If viewers can’t exactly reproduce the MS Word output, that means their conformity to the ECMA standard is questionable, not that the ECMA standard itself is questionable. VP9 and Swift have only their “openness” in common, but that openness doesn’t make them “de facto” standards or not. Actually Apple doesn’t even care whether Swift is a standard or not; its openness is needed for scrutiny of its security model. If a need for the standardization of Swift arises, Apple will certainly lead that effort in compliance with industry standardization rules.
  • Reply 66 of 86
    Blunt Posts: 224member
    So I won't be able to see someone falling over a cat in 4K?
  • Reply 67 of 86
    Although this is a thread related to a codec, since that codec involves 4K content, a few remarks about 4K.

    Not all content can be made 4K. For true 4K, the video must be shot in 4K. I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail to be made 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may discard a whole cel-film epoch.

    DVDs are 480p; those cannot be made 4K either, unless they possess a 4K digital master. I suppose only a small percentage of DVDs may have 4K digital versions. So unfortunately we may discard the whole DVD epoch too.

    I suppose only recently shot Blu-ray movies may have higher-resolution digital masters. Why recently shot? Because the previous ones may already have been upscaled to be repackaged as Blu-ray. They may upscale those again as 4K, but not everyone may buy them this time.

    I am not in the film industry, so a professional’s comments about the resolution of the digital masters will be much appreciated.

    I suppose also that with the help of advanced machine learning techniques, it may be possible to add realistic detail to upscaled videos. Those may be qualified as true 4K then.
    edited September 2017
  • Reply 68 of 86
    maestro64 said:
    C'mon, Apple. Do the needed negotiations/due diligence to make sure things play nice with at least the top handful of software partners that your users really use, before releasing a major software update.

    Btw, in my organization, people have been told to not upgrade to iOS11 because it cannot send mail via Outlook/Exchange. Quoting from the memo, "...An error appears stating "Cannot Send Mail. The message was rejected by the server." Apple is working to resolve this, and expects a fix soon. [XYZ] recommends NOT upgrading to this version of iOS at this time."

    Pathetic.


    We've seen this problem in the past with Outlook and Exchange servers not working with new releases of iOS, which is why I usually hold off. The fact that Microsoft is migrating users and companies to 365 for mail could be the issue, versus a true Exchange issue, or something they have done on the 365 side. I can tell you that in the last week I had lots of serious issues with 365 and had an actual MS tech support person (versus the local support people) on the phone for 3 hours this week trying to resolve them. Only to find out it was a 365 server-side issue: they changed something which caused OneDrive for Business to stop syncing properly and to corrupt files or not allow them to save properly. I ended up having to create an entirely new user profile and set up my account again.

    Do not be so quick to blame Apple, since MS is notorious for blaming everyone else for problems they have on their end. In this case it may be easier for Apple to fix than for MS to try to fix it and break something else.

    I got a detailed reply from our IT administrator. He said that there is at least one identified case of a person updating to iOS11 working fine, but others were having issues. He pointed me to resources: the official confirmation of this problem by Apple is located at https://support.apple.com/en-us/HT208136 and Microsoft at https://support.microsoft.com/en-us/help/4043473/you-can-t-send-or-reply-from-outlook-com-office-365-or-exchange-2016-i Given all this, he did not feel there was enough confidence in being able to recommend adoption until additional testing was done.

    Most importantly, he noted that the following article may actually serve as a better reference in the context of the discussion: https://www.petri.com/ios-11-exchange-not-kissing-cousins

    Bottom line is that there is enough blame to go around. 

    My original point stands.
  • Reply 69 of 86
    gatorguy Posts: 24,213member
    maestro64 said:
    C'mon, Apple. Do the needed negotiations/due diligence to make sure things play nice with at least the top handful of software partners that your users really use, before releasing a major software update.

    Btw, in my organization, people have been told to not upgrade to iOS11 because it cannot send mail via Outlook/Exchange. Quoting from the memo, "...An error appears stating "Cannot Send Mail. The message was rejected by the server." Apple is working to resolve this, and expects a fix soon. [XYZ] recommends NOT upgrading to this version of iOS at this time."

    Pathetic.


    We've seen this problem in the past with Outlook and Exchange servers not working with new releases of iOS, which is why I usually hold off. The fact that Microsoft is migrating users and companies to 365 for mail could be the issue, versus a true Exchange issue, or something they have done on the 365 side. I can tell you that in the last week I had lots of serious issues with 365 and had an actual MS tech support person (versus the local support people) on the phone for 3 hours this week trying to resolve them. Only to find out it was a 365 server-side issue: they changed something which caused OneDrive for Business to stop syncing properly and to corrupt files or not allow them to save properly. I ended up having to create an entirely new user profile and set up my account again.

    Do not be so quick to blame Apple, since MS is notorious for blaming everyone else for problems they have on their end. In this case it may be easier for Apple to fix than for MS to try to fix it and break something else.

    I got a detailed reply from our IT administrator. He said that there is at least one identified case of a person updating to iOS11 working fine, but others were having issues. He pointed me to resources: the official confirmation of this problem by Apple is located at https://support.apple.com/en-us/HT208136 and Microsoft at https://support.microsoft.com/en-us/help/4043473/you-can-t-send-or-reply-from-outlook-com-office-365-or-exchange-2016-i Given all this, he did not feel there was enough confidence in being able to recommend adoption until additional testing was done.

    Most importantly, he noted that the following article may actually serve as a better reference in the context of the discussion: https://www.petri.com/ios-11-exchange-not-kissing-cousins

    Bottom line is that there is enough blame to go around. 

    My original point stands.
    Nicely researched.
  • Reply 70 of 86
    Marvin said:

    Actually things changed just a few months later: [...] Netflix changed those statements, stating that by making several changes to its VP9 encoding parameters, these “tunings can reduce or even reverse [the] gap between VP9 and HEVC.”

    http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/The-State-of-Codecs-2017-117269.aspx


    Latest statement from Netflix is: Aaron explained that Netflix currently deployed HEVC primarily on Smart TVs, and saw the codec as integral to its HDR strategy. For computers and mobile, however, H.264 and VP9 are Netflix's primary codecs, and the focus of most of its current research, which will soon include AV1.

    http://www.streamingmediaglobal.com/Articles/Editorial/Featured-Articles/NAB-17-Codec-Roundup-118062.aspx



  • Reply 71 of 86
    My understanding is that HEVC is much more efficient as a codec. It's also a widely accepted and deployed standard across multiple industries and markets. Google isn't really doing anyone any favors with their own 'standard', and Apple is under no obligation to use it. If the market is there for it, and it materially impacts Apple revenue, then they'll add it. Until then, it's up to Google to convince enough people that their standard is worth using. 
  • Reply 72 of 86
    nht Posts: 4,522member
    mike54 said:
    Google doesn't care about the youtube app on AppleTV. 
    Neither do I.  The only time YouTube gets watched is by the kids on iPads in the car.  If Google wants to try to sell me paid content for the TV they can support my preferred streamer, or I'll happily just stay with Netflix...
  • Reply 73 of 86
    nht Posts: 4,522member
    Although this is a thread related to a codec, since that codec involves 4K content, a few remarks about 4K.

    Not all content can be made 4K. For true 4K, the video must be shot in 4K. I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail to be made 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may discard a whole cel-film epoch.

    DVDs are 480p; those cannot be made 4K either, unless they possess a 4K digital master. I suppose only a small percentage of DVDs may have 4K digital versions. So unfortunately we may discard the whole DVD epoch too.

    I suppose only recently shot Blu-ray movies may have higher-resolution digital masters. Why recently shot? Because the previous ones may already have been upscaled to be repackaged as Blu-ray. They may upscale those again as 4K, but not everyone may buy them this time.

    I am not in the film industry, so a professional’s comments about the resolution of the digital masters will be much appreciated.

    I suppose also that with the help of advanced machine learning techniques, it may be possible to add realistic detail to upscaled videos. Those may be qualified as true 4K then.
    The commonly quoted industry numbers for 35mm film negative is around 6K, 70mm around 12K and IMAX around 18K.

    In actual testing, 4 perf 35mm negatives showed 2400x2400 lines of visible resolution...or around 5.76 MP,  inter positives 2100x2100 and release prints 1500x1500.  Using these numbers as a basis for estimation for non-tested formats 70mm is around 43MP.  So it's pretty close to the industry rules of thumb.

    Given that the testing process was viewing line pairs on a filmed MTF chart and determining which is the smallest discernible pair the BS about film grain is simply BS.  

    Ben Hur (1959) was scanned at 8K and remastered in 4K from its 65mm negative for the Blu-ray.  The Blu-ray is great, and presumably there will be an even better 4K release.
  • Reply 74 of 86
    nht said:
    Although this is a thread related to a codec, since that codec involves 4K content, a few remarks about 4K.

    Not all content can be made 4K. For true 4K, the video must be shot in 4K. I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail to be made 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may discard a whole cel-film epoch.

    DVDs are 480p; those cannot be made 4K either, unless they possess a 4K digital master. I suppose only a small percentage of DVDs may have 4K digital versions. So unfortunately we may discard the whole DVD epoch too.

    I suppose only recently shot Blu-ray movies may have higher-resolution digital masters. Why recently shot? Because the previous ones may already have been upscaled to be repackaged as Blu-ray. They may upscale those again as 4K, but not everyone may buy them this time.

    I am not in the film industry, so a professional’s comments about the resolution of the digital masters will be much appreciated.

    I suppose also that with the help of advanced machine learning techniques, it may be possible to add realistic detail to upscaled videos. Those may be qualified as true 4K then.
    The commonly quoted industry numbers for 35mm film negative is around 6K, 70mm around 12K and IMAX around 18K.

    In actual testing, 4 perf 35mm negatives showed 2400x2400 lines of visible resolution...or around 5.76 MP,  inter positives 2100x2100 and release prints 1500x1500.  Using these numbers as a basis for estimation for non-tested formats 70mm is around 43MP.  So it's pretty close to the industry rules of thumb.

    Given that the testing process was viewing line pairs on a filmed MTF chart and determining which is the smallest discernible pair the BS about film grain is simply BS.  

    Ben Hur (1959) was scanned at 8K and remastered in 4K from its 65mm negative for the Blu-ray.  The Blu-ray is great, and presumably there will be an even better 4K release.
    Ben Hur (1959) exists in iTunes in HD. Yes, it is a great scan, but it still reveals itself as a scan by the low level of detail that its 65mm negative could store. If it had been shot originally in 8K there would be a significant difference from the actual scan. Anyway, we'll live with that, since we cannot dismiss the whole of cinema history, and we'll hope that ML-based enhancements make some difference.
  • Reply 75 of 86
    maestro64 said:
    C'mon, Apple. Do the needed negotiations/due diligence to make sure things play nice with at least the top handful of software partners that your users really use, before releasing a major software update.

    Btw, in my organization, people have been told to not upgrade to iOS11 because it cannot send mail via Outlook/Exchange. Quoting from the memo, "...An error appears stating "Cannot Send Mail. The message was rejected by the server." Apple is working to resolve this, and expects a fix soon. [XYZ] recommends NOT upgrading to this version of iOS at this time."

    Pathetic.


    We've seen this problem in the past with Outlook and Exchange servers not working with new releases of iOS, which is why I usually hold off. The fact that Microsoft is migrating users and companies to 365 for mail could be the issue, versus a true Exchange issue, or something they have done on the 365 side. I can tell you that in the last week I had lots of serious issues with 365 and had an actual MS tech support person (versus the local support people) on the phone for 3 hours this week trying to resolve them. Only to find out it was a 365 server-side issue: they changed something which caused OneDrive for Business to stop syncing properly and to corrupt files or not allow them to save properly. I ended up having to create an entirely new user profile and set up my account again.

    Do not be so quick to blame Apple, since MS is notorious for blaming everyone else for problems they have on their end. In this case it may be easier for Apple to fix than for MS to try to fix it and break something else.

    I got a detailed reply from our IT administrator. He said that there is at least one identified case of a person updating to iOS11 working fine, but others were having issues. He pointed me to resources: the official confirmation of this problem by Apple is located at https://support.apple.com/en-us/HT208136 and Microsoft at https://support.microsoft.com/en-us/help/4043473/you-can-t-send-or-reply-from-outlook-com-office-365-or-exchange-2016-i Given all this, he did not feel there was enough confidence in being able to recommend adoption until additional testing was done.

    Most importantly, he noted that the following article may actually serve as a better reference in the context of the discussion: https://www.petri.com/ios-11-exchange-not-kissing-cousins

    Bottom line is that there is enough blame to go around. 

    My original point stands.
    Microsoft could have tested iOS against Exchange for months... So, no, the blame goes to Microsoft for not getting the bug solved before release.

    People blaming Apple for Microsoft's mess is expected though; that's why they like to control everything on their platform (despite some people wanting otherwise).
    Cause that is what people do.
  • Reply 76 of 86
    foggyhill said:
    maestro64 said:
    C'mon, Apple. Do the needed negotiations/due diligence to make sure things play nice with at least the top handful of software partners that your users really use, before releasing a major software update.

    Btw, in my organization, people have been told to not upgrade to iOS11 because it cannot send mail via Outlook/Exchange. Quoting from the memo, "...An error appears stating "Cannot Send Mail. The message was rejected by the server." Apple is working to resolve this, and expects a fix soon. [XYZ] recommends NOT upgrading to this version of iOS at this time."

    Pathetic.


    We've seen this problem in the past with Outlook and Exchange servers not working with new releases of iOS, which is why I usually hold off. The fact that Microsoft is migrating users and companies to 365 for mail could be the issue, versus a true Exchange issue, or something they have done on the 365 side. I can tell you that in the last week I had lots of serious issues with 365 and had an actual MS tech support person (versus the local support people) on the phone for 3 hours this week trying to resolve them. Only to find out it was a 365 server-side issue: they changed something which caused OneDrive for Business to stop syncing properly and to corrupt files or not allow them to save properly. I ended up having to create an entirely new user profile and set up my account again.

    Do not be so quick to blame Apple, since MS is notorious for blaming everyone else for problems they have on their end. In this case it may be easier for Apple to fix than for MS to try to fix it and break something else.

    I got a detailed reply from our IT administrator. He said that there is at least one identified case of a person updating to iOS11 working fine, but others were having issues. He pointed me to resources: the official confirmation of this problem by Apple is located at https://support.apple.com/en-us/HT208136 and Microsoft at https://support.microsoft.com/en-us/help/4043473/you-can-t-send-or-reply-from-outlook-com-office-365-or-exchange-2016-i Given all this, he did not feel there was enough confidence in being able to recommend adoption until additional testing was done.

    Most importantly, he noted that the following article may actually serve as a better reference in the context of the discussion: https://www.petri.com/ios-11-exchange-not-kissing-cousins

    Bottom line is that there is enough blame to go around. 

    My original point stands.
    Microsoft could have tested iOS against Exchange for months... So, no, the blame goes to Microsoft for not getting the bug solved before release.

    People blaming Apple for Microsoft's mess is expected though; that's why they like to control everything on their platform (despite some people wanting otherwise).
    Cause that is what people do.
    It's not a contest of who's a bigger corporate a$$hole. It's about working with key software partners since, presumably, we can both agree that a company like Apple is supremely interested in its user experience.

    You surely get that, don't you?
  • Reply 77 of 86
    nht Posts: 4,522member
    nht said:
    Although this is a thread related to a codec, since that codec involves 4K content, a few remarks about 4K.

    Not all content can be made 4K. For true 4K, the video must be shot in 4K. I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail to be made 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may discard a whole cel-film epoch.

    DVDs are 480p; those cannot be made 4K either, unless they possess a 4K digital master. I suppose only a small percentage of DVDs may have 4K digital versions. So unfortunately we may discard the whole DVD epoch too.

    I suppose only recently shot Blu-ray movies may have higher-resolution digital masters. Why recently shot? Because the previous ones may already have been upscaled to be repackaged as Blu-ray. They may upscale those again as 4K, but not everyone may buy them this time.

    I am not in the film industry, so a professional’s comments about the resolution of the digital masters will be much appreciated.

    I suppose also that with the help of advanced machine learning techniques, it may be possible to add realistic detail to upscaled videos. Those may be qualified as true 4K then.
    The commonly quoted industry numbers for 35mm film negative is around 6K, 70mm around 12K and IMAX around 18K.

    In actual testing, 4 perf 35mm negatives showed 2400x2400 lines of visible resolution...or around 5.76 MP,  inter positives 2100x2100 and release prints 1500x1500.  Using these numbers as a basis for estimation for non-tested formats 70mm is around 43MP.  So it's pretty close to the industry rules of thumb.

    Given that the testing process was viewing line pairs on a filmed MTF chart and determining which is the smallest discernible pair the BS about film grain is simply BS.  

    Ben Hur (1959) was scanned at 8K and remastered in 4K from its 65mm negative for the Blu-ray.  The Blu-ray is great, and presumably there will be an even better 4K release.
    Ben Hur (1959) exists in iTunes in HD. Yes, it is a great scan, but it still reveals itself as a scan by the low level of detail that its 65mm negative could store. If it had been shot originally in 8K there would be a significant difference from the actual scan. Anyway, we'll live with that, since we cannot dismiss the whole of cinema history, and we'll hope that ML-based enhancements make some difference.
    This is false. Don't believe me, believe Arri.

    65mm has 8746x3836 resolution at 5-perf, and more in the larger frame sizes.

    What you wrote was nonsense.

    --
    Results

    This test is admittedly an ideal case, but the ideal is the goal when testing the limits of image storage in film. In the test, the smallest resolvable detail is 0.006 mm large on the film, whether 35 mm or 16 mm. Thus, across the full film width there are 24.576 mm / 0.006 = 4096 details or points for 35 mm film and 12.35 mm / 0.006 = 2048 points for 16 mm film. These are referred to as points and not pixels because we are still operating in the analog world. These statements depend upon the following:

    (1) looking at the center of the image
    (2) the film sensitivity is not over 250 ASA
    (3) exposure and development are correct
    (4) focus is correct
    (5) lens and film don’t move against one another during exposure
    (6) speed <50 frames/sec

      ...




      http://www.arri.com/fileadmin/media/arri.com/downloads/Camera/Tutorials/SystemsTechnologyBrochure.pdf
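
    The arithmetic there is easy to check (a quick sketch using only the figures quoted above):

        # Reproduce the point counts from the quoted Arri brochure.
        smallest_detail_mm = 0.006     # smallest resolvable detail on the film, per the quote
        width_35mm_mm = 24.576         # full 35 mm film width used in the quote
        width_16mm_mm = 12.35          # full 16 mm film width used in the quote

        print(width_35mm_mm / smallest_detail_mm)  # 4096.0 points across a 35 mm frame
        print(width_16mm_mm / smallest_detail_mm)  # ~2058, i.e. roughly the 2048 quoted

    That is 4K-class horizontal detail on a plain 35 mm negative under ideal conditions, before you even get to 65/70 mm stock.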



    edited September 2017
  • Reply 78 of 86
    nht said:
    nht said:
    Although this is a thread related to a codec, since that codec involves 4K content, a few remarks about 4K.

    Not all content can be made 4K. For true 4K, the video must be shot in 4K. I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail to be made 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may discard a whole cel-film epoch.

    DVDs are 480p; those cannot be made 4K either, unless they possess a 4K digital master. I suppose only a small percentage of DVDs may have 4K digital versions. So unfortunately we may discard the whole DVD epoch too.

    I suppose only recently shot Blu-ray movies may have higher-resolution digital masters. Why recently shot? Because the previous ones may already have been upscaled to be repackaged as Blu-ray. They may upscale those again as 4K, but not everyone may buy them this time.

    I am not in the film industry, so a professional’s comments about the resolution of the digital masters will be much appreciated.

    I suppose also that with the help of advanced machine learning techniques, it may be possible to add realistic detail to upscaled videos. Those may be qualified as true 4K then.
    The commonly quoted industry numbers for 35mm film negative is around 6K, 70mm around 12K and IMAX around 18K.

    In actual testing, 4 perf 35mm negatives showed 2400x2400 lines of visible resolution...or around 5.76 MP,  inter positives 2100x2100 and release prints 1500x1500.  Using these numbers as a basis for estimation for non-tested formats 70mm is around 43MP.  So it's pretty close to the industry rules of thumb.

    Given that the testing process was viewing line pairs on a filmed MTF chart and determining which is the smallest discernible pair the BS about film grain is simply BS.  

    Ben Hur (1959) was scanned at 8K and remastered in 4K from its 65mm negative for the Blu-ray.  The Blu-ray is great, and presumably there will be an even better 4K release.
    Ben Hur (1959) exists in iTunes in HD. Yes, it is a great scan, but it still reveals itself as a scan by the low level of detail that its 65mm negative could store. If it had been shot originally in 8K there would be a significant difference from the actual scan. Anyway, we'll live with that, since we cannot dismiss the whole of cinema history, and we'll hope that ML-based enhancements make some difference.
    This is false. Don't believe me, believe Arri.

    65mm has 8746x3836 resolution at 5-perf, and more in the larger frame sizes.

    What you wrote was nonsense.

    --
    Results

    This test is admittedly an ideal case, but the ideal is the goal when testing the limits of image storage in film. In the test, the smallest resolvable detail is 0.006 mm large on the film, whether 35 mm or 16 mm. Thus, across the full film width there are 24.576 mm / 0.006 = 4096 details or points for 35 mm film and 12.35 mm / 0.006 = 2048 points for 16 mm film. These are referred to as points and not pixels because we are still operating in the analog world. These statements depend upon the following:

    (1) looking at the center of the image
    (2) the film sensitivity is not over 250 ASA
    (3) exposure and development are correct
    (4) focus is correct
    (5) lens and film don’t move against one another during exposure
    (6) speed <50 frames/sec

      ...

      http://www.arri.com/fileadmin/media/arri.com/downloads/Camera/Tutorials/SystemsTechnologyBrochure.pdf

    So, those specs are Ben Hur’s specs? Why do you paste haphazardly collected, irrelevant nonsense here? Those specs prove nothing about Ben Hur. It reveals itself as a scan with low detail, and anyone with access to iTunes can confirm that. Don’t try to answer my questions; you’re more ignorant than I am, posting garbage irrelevant to the issue.
  • Reply 79 of 86
    alandail said:
    alandail said:
    gatorguy said:
    alandail said:
    gatorguy said:
    alandail said:
    netrox said:
    That is incredibly stupid of Apple to refuse to support free open standards.
    Apple supports the industry standard - H.265, and has hardware acceleration to support this standard since the A9 chip
    VP9 is not a standard.

    Google is the one refusing to support the industry standard, not Apple.
    It's not as tho VP9 doesn't have broad-based industry support. ARM, Broadcom, Intel, LG, Marvell, MediaTek, Nvidia, Panasonic, Philips, Qualcomm, RealTek, Samsung, Sigma, Sharp, Sony and Toshiba and probably a few that missed mention are all supporting partners and VP9 is integrated with Chrome, Firefox, Opera and Edge browsers.  What would have been VP10 has now been contributed to the AV1 project being developed under the auspices of the Alliance for Open Media. That has some of the biggest players in mobile media on board including Adobe, Microsoft, Intel, Netflix, Hulu, the BBC, and Amazon, in addition to Google.

    This whole attempt to portray Google as a lone wolf supporter of open media codecs and an industry outlier is very obviously FUD meant to serve some purpose other than factual. MPEG has worn out their welcome with much of the tech community. With Apple all in with HEVC Advance (which requires licensing from no less than 4 patent pools if I read right) I can see another Betamax vs. VHS on the horizon. With that said I do support Apple's play to replace JPEG with HEIF as long as they include support for AV1.

    EDIT: For information on AV1, the membership and the goals, see here:

    The post I replied to criticized Apple for not implementing the standard. My reply simply said Apple did support the standard.  No FUD, just a statement of fact. 

    Google is the one trying to undermine the standard with their lack of support. 
    Well to be more accurate that would be Google. And Intel. And Microsoft. And Netflix. And AMD. And Hulu. And Adobe. And Samsung. And Nvidia. And Sony. And Firefox. And the BBC. And Amazon.  And Intel. And...
    A good part of that list is required to support VP9 because of Google undermining the industry standard.  Android devices are required to support VP9, and Chromecast Ultra doesn't support H.265.  So anyone building Android devices, and anyone building content that streams to Chromecast Ultra, has no choice but to support VP9 in addition to their support of the industry standard H.265, which is also the video format for UHD Blu-ray. 

    Again, it's Google, not Apple, who fails to support the industry standard.
    For the most part, this is because these companies don't want to pay 60-100 million a year to the HEVC licensing pools. 

    How much does it cost to use 20% more bandwidth for streaming videos?  From Netflix's own testing, H.265 is 20% more efficient than VP9.  And that was without using the best H.265 encoder.
    By revealed preference, Netflix must care about VP9 and AV1 enough that they deploy VP9 on many smartphones and are founding members of the Alliance for Open Media (AOMedia), which develops AV1.
  • Reply 80 of 86
    "Apple TV 4K won't play 4K YouTube videos" There is the only reason i need to not upgrade my Apple TV. I suppose there's a bunch of other apps that can't utilize 4K either. Maybe next year?