Apple TV 4K won't play 4K YouTube videos because of missing Google codec


Comments

  • Reply 81 of 86
    nht Posts: 4,522member
    nht said:
    nht said:
    Although this thread is about a codec, since that codec involves 4K content, a few remarks about 4K.

    Not all content can be made 4K. For true 4K, the video must be shot in 4K. I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail for 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may have to discard a whole cel-film epoch.

    DVDs are 480p; those cannot be made 4K either, unless they possess a 4K digital master. I suppose only a small percentage of DVDs have 4K digital versions. So, unfortunately, we may have to discard the whole DVD epoch too.

    I suppose only recently shot Blu-ray movies may have higher-resolution digital masters. Why recently shot? Because the earlier ones may already have been upscaled to be repackaged as Blu-ray. They may upscale those again as 4K, but not everyone will buy them this time.

    I am not in the film industry, so a professional’s comments about the resolution of the digital masters would be much appreciated.

    I also suppose that, with the help of advanced machine-learning techniques, it may be possible to add realistic detail to upscaled videos. Those might then qualify as true 4K.
    The commonly quoted industry numbers for 35mm film negative are around 6K, for 70mm around 12K, and for IMAX around 18K.

    In actual testing, 4-perf 35mm negatives showed 2400x2400 lines of visible resolution, or around 5.76 MP; interpositives 2100x2100; and release prints 1500x1500. Using these numbers as a basis for estimating untested formats, 70mm comes out around 43 MP. So it's pretty close to the industry rules of thumb.
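    As a sanity check, the megapixel figures above follow directly from the quoted line counts (the resolutions are the ones given in this post; the 70mm extrapolation is left as the poster's estimate):

```python
# Back-of-the-envelope check of the tested film resolutions quoted above.
# Each resolvable line is treated as one "point"; the figures are the poster's.
def megapixels(h_lines, v_lines):
    return h_lines * v_lines / 1e6

print(megapixels(2400, 2400))  # 4-perf 35mm negative -> 5.76 MP
print(megapixels(2100, 2100))  # interpositive        -> 4.41 MP
print(megapixels(1500, 1500))  # release print        -> 2.25 MP
```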

    Given that the testing process was viewing line pairs on a filmed MTF chart and determining the smallest discernible pair, the BS about film grain is simply BS.

    Ben Hur (1959) was scanned at 8K and remastered in 4K from its 65mm negative for the Blu-ray.  The Blu-ray is great, and presumably there will be an even better 4K release.
    Ben Hur (1959) exists on iTunes in HD. Yes, it is a great scan, but it still reveals itself as a scan by the low level of detail its 65mm negative could store. If it had been shot originally at 8K, there would be a significant difference from the actual scan. Anyway, we'll live with that, since we cannot dismiss the whole of cinema history, and we'll hope that ML-based enhancements make some difference.
    This is false. Don't believe me, believe Arri.

    65mm has 8746x3836 resolution at 5-perf, and more in the larger sizes.

    What you wrote was nonsense.

    --
    Results

    This test is admittedly an ideal case, but the ideal is the goal when testing the limits of image storage in film. In the test, the smallest resolvable detail is 0.006 mm large on the film, whether 35 mm or 16 mm. Thus, across the full film width there are 24.576 mm / 0.006 = 4096 details or points for 35 mm film and 12.35 mm / 0.006 = 2048 points for 16 mm film. These are referred to as points and not pixels because we are still operating in the analog world. These statements depend upon the following:

    (1) looking at the center of the image

    (2) the film sensitivity is not over 250 ASA

    (3) exposure and development are correct

    (4) focus is correct

    (5) lens and film don’t move against one another during exposure

    (6) speed <50 frames/sec

      ...

      http://www.arri.com/fileadmin/media/arri.com/downloads/Camera/Tutorials/SystemsTechnologyBrochure.pdf

    So, are those specs Ben Hur’s specs? Why do you paste haphazardly collected, irrelevant nonsense here? Those specs prove nothing about Ben Hur. It reveals itself as a scan with low detail, and anyone with access to iTunes can confirm that. Don’t try to answer my questions; you’re more ignorant than I am, posting garbage irrelevant to the issue.
    Yes, those numbers will be pretty close for Ben Hur, since it was shot on 65mm 5-perf. The difference will be in the film stock of the period versus the Kodak stock tested, and in how well the negative was preserved.  Given that they have a high-quality 8K scan and a 4K rather than 2K DI that was cleaned up and regraded for HD, the only thing they have to do for a top-quality UHD Ben Hur release is regrade for HDR.

    The "granularity of the chemical substance" of film stock allows for up to 80 lp/mm or details of .006mm size.

    It's not "haphazardly collected" but direct refutation of your dumb assertion that older film movies won't make good 4K source material from a top maker of digital and film movie cameras.  Even Super 35mm has enough resolution for 4K because science despite mostly being edited in 2K digital intermediate by film makers. As long as the negatives are still around they can rescan and remaster them in 4K with far more resolution than found on the best blu-ray.

    And if you are watching it on iTunes, you aren't watching it at the same bitrate as the Blu-ray, which in turn isn't going to be as good as the 4K master.  There will be a marked improvement when the 4K UHD release is made versus the 50th-anniversary BD set.  Frankly, I don't believe you've ever watched Ben Hur in any format if you think the Ben Hur BR is low detail.

    Do you even understand that many "digital 4K" movies were edited in a 2K digital intermediate even if shot at 5K?  And that the upconverted 2K DI is what ends up on the 4K stream or UHD Blu-ray?  Plus all the FX were rendered in 2K?  And it still looks way better than the HD Blu-ray because it got regraded into HDR and has a high-quality upconvert?
  • Reply 82 of 86
    nht said:
    The "granularity of the chemical substance" of film stock allows for up to 80 lp/mm or details of .006mm size.
    How do you know that? How can you be sure that all film offers a minimum grain of 0.006 mm, while you yourself note that "the difference will be in the film stock of the period versus the Kodak stock tested and how well the negative was preserved"? Your confusion is thinking that if today's scanners can offer resolutions as high as 80 lp/mm, then all film will provide "granularity" equal to or smaller than that. That is what I question, and from what I have watched from the '50s, I have sound reasons to question it.
    nht said:
    And if you are watching it on iTunes, you aren't watching it at the same bitrate as the Blu-ray, which in turn isn't going to be as good as the 4K master.  There will be a marked improvement when the 4K UHD release is made versus the 50th-anniversary BD set.  Frankly, I don't believe you've ever watched Ben Hur in any format if you think the Ben Hur BR is low detail.
    Bitrate doesn't matter for detecting whether the source is a digital recording or a digital scan of analog film. I worked for several years in the prepress industry; scanned slides are always discernible from digital shots. You can always increase the resolution of the scan; what you can't increase is the granularity of the source slide. On the final print, digital scans and digital shots are visually equivalent, but on screen an experienced eye recognizes a digital scan immediately. And yes, I watched so many Ben-Hurs in my childhood in the '60s, and during my adulthood, that I feel entitled to recognize a bad version.
    nht said:
    Do you even understand that many "digital 4K" movies were edited in a 2K digital intermediate even if shot at 5K?  And that the upconverted 2K DI is what ends up on the 4K stream or UHD Blu-ray?  Plus all the FX were rendered in 2K?  And it still looks way better than the HD Blu-ray because it got regraded into HDR and has a high-quality upconvert?
    No, I don't understand. What I gather from your account is that their editing tools do not work with 4K source material, so they need to downsample it to 2K to edit and upsample it back to 4K for release. Do not present a compromise (the kindest term I could find, to avoid "fraud") as a feature.
    edited September 2017
  • Reply 83 of 86
    nht Posts: 4,522member
    The scanner resolution doesn't matter in the test, because they looked at the filmed test pattern on the negative under a microscope, not at a digital scan.  And while that is modern stock, even if you assume a fairly large reduction in resolution, there's enough data in 65mm 5-perf negatives for 4K.

    Unless you can show otherwise, the film industry says 65mm = 8K, the tests on real-world 35mm stock show sufficient resolution for 4K, and all you have done is hand-wave with "sound reasons".  If you haven't seen the actual 4K master of Ben Hur you have zero idea of its actual quality, and we aren't dealing with stills resolution but with film resolution at 24 fps, with motion and blur.  Your vaunted experience with slides isn't nearly as meaningful as you think.

    Bitrate matters for the ability to display detail as opposed to artifacts.  Low-bitrate HD will show less detail than high-bitrate HD.  If it looks like mush on iTunes as you claim (I don't own it on iTunes) and looks great on BR (which I do have), then the problem is not the source material but the delivered material.

    Finally, the point of stating that much of the 4K available today is upconverted from the 2K DI is to show that 35mm film scanned at 8K, remastered into a 4K DI, and regraded to HDR will look very good, if not equivalent, on UHD Blu-ray and lower-bitrate 4K.  Probably as good as or better than the current set of pseudo-4K offerings of contemporary movies... which, if you are watching on Netflix, is about all the quality you get anyway, regardless of the original source material.
  • Reply 84 of 86
    OK, here is the thing: a digital scan and a digital shot differ significantly. Do you have any objection to that? Then go back to photography basics. Take a shot on the best film with a pro camera (ignore the size for the moment), then scan it at X resolution. Then take the same shot with a pro digital camera, again at X resolution. Compare the two. You will see that the digital shot is significantly better in detail than the best scan you can achieve (differences between the media in color accuracy, tonal range and the like remain).

    The starting point you're trying to smother with all those irrelevant general truths and knowledge showmanship was: "If it was shot originally at 8K there would be a significant difference with the actual scan". If you have any test on this, feel free to find it and study it again and again. I am done with this discussion.


  • Reply 85 of 86
    nht Posts: 4,522member
    1) 4K is 8.3 MP. 
    2) Film is shot at a 180-degree shutter angle, i.e. a 1/48 s exposure of moving subjects, for both digital and film, except for those movies shot at a blazing 1/60.
    3) Unless your photographers are generally taking low-res shots of moving subjects with motion blur with their DSLRs, your experience isn't necessarily applicable.  In fact, all of your intuition is likely wrong.

    If you take a scan using an 8000 DPI, 4.9 DMax scanner and compare it with an 8.3 MP downrez from a D810, it'll likely be pretty damn close when looking at an MTF test chart, and likely not noticeably different with real images, much less ones with motion blur.

    Plus you likely still haven't captured all the resolution detail available from the best still-film negative, even with a $25K scanner.  Kodak TMAX 100 film had 125 lp/mm resolution.  Even at high ISO, TMAX 3200 had 85 lp/mm.
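    For scale, a quick back-of-the-envelope check: UHD 3840x2160 works out to about 8.3 MP, and the quoted TMAX lp/mm figure, taken at face value on an ideal test chart, implies far more than that across a 36 x 24 mm still frame:

```python
# Rough scale check: what "4K" is in megapixels, and what the quoted
# TMAX lp/mm figure would imply for a 36 x 24 mm still frame on an
# ideal test chart (not real-world scene detail).
def mp(w, h):
    return w * h / 1e6

def film_points(width_mm, height_mm, lp_per_mm):
    # each line pair = 2 resolvable points
    return width_mm * 2 * lp_per_mm, height_mm * 2 * lp_per_mm

print(mp(3840, 2160))            # UHD "4K" -> 8.2944 MP
w, h = film_points(36, 24, 125)  # TMAX 100 at 125 lp/mm
print(w, h, mp(w, h))            # 9000 x 6000 -> 54.0 MP on an ideal chart
```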

    Finally, since you are a self-claimed "expert" who refuses to look at the MTF and spatial-frequency charts and continues to talk about "grain", I'm thinking you've confused noise with resolution, and the one who needs remedial lessons in photographic basics is you.

    And THIS is your dumb-assed "starting point":
    I suppose almost all of those classic films shot on cel film cannot be made 4K. Even if you scan the film at 4K, this is not enough, because the “granularity” of the chemical substance on the film must be fine enough to carry enough detail for 4K. Otherwise what you get will be enlarged grains, just like enlarged pixels. So we may have to discard a whole cel-film epoch.

    edited September 2017