Google promises to dramatically shrink 4K bandwidth with upcoming VP10 video codec


  • Reply 41 of 95
cali Posts: 3,494 member
    "Google Promises"

    Nice headline.
  • Reply 42 of 95
    Originally Posted by foggyhill View Post

    There are actual charts, and believe me, 4K makes no sense at a normal viewing distance unless yo

     

    I can see the pixels on a 70” 1080p TV at 15 feet, so I’ve never put much stock in those charts. They need to be tuned way down.

  • Reply 43 of 95
razorpit Posts: 1,796 member
    Quote:

    Originally Posted by cali View Post





    Ahh... I remember the Nintendo hating magazines claiming UMD was gonna be HUGE and destroy the Nintendo DS.

I'll never forget the article where they called it the biggest news of the year. 2nd place was the invention of the Wii remote.



    Is Memory Stick that annoying thing called Pro Stick II or something? The one they require for the PSP/Vita and their cameras?

    Or is that another failed format?



    I remember when Sony's Mavica used a 3.5" floppy.  Pictures were only 640 x 480 if I remember correctly.  The thing ate AA batteries like no one's business.  I want to say it was like 8-10 at a time?

     

Later, when they introduced the Memory Stick, I remember wondering WTF, and why they didn't just use a CF card like everyone else.

  • Reply 44 of 95
    I can see the pixels on a 70” 1080p TV at 15 feet, so I’ve never put much stock in those charts. They need to be tuned way down.

I'm no TVologist, but I think there are at least a couple of things at play. Your eyesight could be good enough that the "normal" range doesn't work for you, but I think there is a more complex mechanism in place that allows your brain to discern the pixels when a very similar image alters slightly between frames (i.e., a video), which you wouldn't be able to make out if it were just a 1920×1080 still image. My hypothesis is based on the numerous illusions we've seen our entire lives where color and size are rendered incorrectly by our brains but become clear when a side-by-side comparison is made.

    Here is one site I duckled*…

    "Believe it or not, but the pieces A/B/C all have the same color. Use any color picker, graphic program or simply cover the remainder with your hand to see for yourself."

[illusion image from the linked site]

    * Like googled, but with Duck Duck Go?
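
    If you'd rather not trust your eyes or fiddle with a graphics program, the same check can be scripted. A minimal sketch, assuming the illusion image has been saved locally; the filename and the pixel coordinates below are made up for illustration:

    ```python
    # Hypothetical check of the illusion: sample one pixel from each "piece"
    # and compare the raw RGB values. Requires Pillow (pip install pillow).
    from PIL import Image

    img = Image.open("illusion.png").convert("RGB")  # hypothetical local copy

    pixel_a = img.getpixel((120, 80))   # a point inside piece A (made-up coords)
    pixel_b = img.getpixel((300, 210))  # a point inside piece B (made-up coords)

    print(pixel_a, pixel_b, "identical!" if pixel_a == pixel_b else "different")
    ```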
  • Reply 45 of 95
    FORMAT WARS!!!!!!
  • Reply 46 of 95
    Quote:

    Originally Posted by foggyhill View Post

     

     

    There are actual charts, and believe me, 4K makes no sense at a normal viewing distance unless yo



Considering living room size, central vision vs. peripheral, and how wide our field of view is, I'd say being closer than 1.5 times the TV size makes no sense (even though in theory you could go closer, as the chart shows). That means only the upper level of the blue UHD line makes any sense for your eyes and how furniture is normally arranged.

     

This means 65 inches is the absolute minimum TV size one should contemplate if buying 4K, and only if you don't mind sitting 8 feet away (meaning the sofa is 6 feet from this big-ass TV). A more realistic scenario has the sofa at 8-10 feet minimum and a 75-85 inch TV as the target size. That's one hell of a big TV :-).

     

    For 3D content though, it would make more sense since you'd have half the original resolution, so you could sit closer.

     

8K makes no sense for movies (it takes a 200-inch screen where you're sitting 14 feet away to make a difference!)

It's fantastic, though, for a whole-wall multimedia display.




    Thanks for the helpful answer.
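
    For what it's worth, the arithmetic behind those viewing-distance charts is easy to reproduce. Here's a minimal sketch, assuming the usual ~1 arcminute (20/20) acuity figure those charts are built on and a 16:9 panel; the function name and the sample sizes are just illustrative:

    ```python
    import math

    def pixel_blend_distance_ft(diagonal_in, horizontal_px, acuity_arcmin=1.0):
        """Distance (feet) beyond which a single pixel subtends less than
        `acuity_arcmin` and so stops being individually resolvable (16:9 panel)."""
        width_in = diagonal_in * 16 / math.hypot(16, 9)
        pixel_pitch_in = width_in / horizontal_px
        distance_in = pixel_pitch_in / math.tan(math.radians(acuity_arcmin / 60))
        return distance_in / 12

    for diag in (50, 60, 65, 70):
        print(f'{diag}": 1080p blends past ~{pixel_blend_distance_ft(diag, 1920):.1f} ft, '
              f'4K past ~{pixel_blend_distance_ft(diag, 3840):.1f} ft')
    ```

    Run as-is, it puts the 1080p blend distance for a 70-inch panel at roughly 9 feet (which is why the charts treat seeing pixels at 15 feet as well outside normal acuity) and for a 60-inch panel at roughly 8 feet, right at the edge being discussed here.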

  • Reply 47 of 95
    Quote:

    Originally Posted by Tallest Skil View Post

     

     

    I can see the pixels on a 70” 1080p TV at 15 feet, so I’ve never put much stock in those charts. They need to be tuned way down.




Maybe. The question remains whether I would ever notice an improvement watching a movie at 8K vs. 4K at the same panel size and a reasonable distance.

I don't mean that in the "will we ever need more than 640kB" sense, as there might be applications where you want to get much closer or have a much bigger display. Eyes, and the physics connected to them, however, aren't expected to change that soon ;-)

Also, I still wonder whether turning an "old" movie into 4K or 8K counts as "native" or is just useless upscaling.

  • Reply 48 of 95
    foggyhill wrote: »
There are actual charts, and believe me, 4K makes no sense at a normal viewing distance unless yo [viewing-distance chart image]



Considering living room size, central vision vs. peripheral, and how wide our field of view is, I'd say being closer than 1.5 times the TV size makes no sense (even though in theory you could go closer, as the chart shows). That means only the upper level of the blue UHD line makes any sense for your eyes and how furniture is normally arranged.

This means 65 inches is the absolute minimum TV size one should contemplate if buying 4K, and only if you don't mind sitting 8 feet away (meaning the sofa is 6 feet from this big-ass TV). A more realistic scenario has the sofa at 8-10 feet minimum and a 75-85 inch TV as the target size. That's one hell of a big TV :-).

    For 3D content though, it would make more sense since you'd have half the original resolution, so you could sit closer.

8K makes no sense for movies (it takes a 200-inch screen where you're sitting 14 feet away to make a difference!)
It's fantastic, though, for a whole-wall multimedia display.

Are you saying that if you test people sitting 8' away from both a 60" 1080p set and a 60" 4K set with the same frame rate, color, and brightness, and with the same video, but with the 4K video re-encoded at 1920×1080, it would be impossible to tell the difference?
  • Reply 49 of 95
    I believe H.265 will be used in Ultra HD Blu-Ray. If so, then that will be the standard.
  • Reply 50 of 95
foggyhill Posts: 4,767 member
    Quote:
    Originally Posted by Tallest Skil View Post

     

     

    I can see the pixels on a 70” 1080p TV at 15 feet, so I’ve never put much stock in those charts. They need to be tuned way down.


     

Well, charts like those are for normal, average vision. I'm really tired of anecdote being used as justification, BTW.

    It's used by basically every science basher around.

  • Reply 51 of 95
foggyhill Posts: 4,767 member
    Quote:
    Originally Posted by SolipsismY View Post





Are you saying that if you test people sitting 8' away from both a 60" 1080p set and a 60" 4K set with the same frame rate, color, and brightness, and with the same video, but with the 4K video re-encoded at 1920×1080, it would be impossible to tell the difference?

     

If the source is exactly the same (which is often not the case, which makes testing this really hard), with the best possible encoding, and the original material actually had 4K (or more) of info to encode, then yes, that's the case for the AVERAGE viewer.

     

Some people with better-than-average vision would do a bit better than that, obviously, but someone able to see the difference from, say, 15 feet would be a very, very rare freak (rarer than 1 in 1,000).

     

This test is nearly impossible to do in real life for an LCD or AMOLED panel because there are so many factors other than resolution to muck up the evaluation.

     

One funny thing is that while the eye is technically able to see smaller details than that on printed, high-contrast, black-on-white, reflective and well-lit material, this doesn't per se translate to an LCD/OLED, which is typically not as high contrast, especially at high luminosity levels.

     

One other thing... Those studies are done with static material... Very few studies have been done on moving material, probably because doing an A vs. B test would be so hard and fraught with pitfalls.

     

People who say that yes, we can see up to 600 DPI routinely drag out studies using this kind of material, which is not representative of current TVs (or those of the next 5 years).

     

    Often, what people see as a difference in the resolution is in fact either a difference/artefact in the source, or merely a difference in other aspects of the TV.
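
    On the "same source, best possible encoding" condition: here's a rough sketch of how matched A/B clips could be prepared so that only resolution differs. It assumes ffmpeg with libx265 is installed, and the file names are hypothetical; nothing in the thread specifies the tooling.

    ```python
    # Sketch: produce a native-4K clip and a 1080p clip from the same master,
    # with the same codec and quality settings, so only resolution differs.
    import subprocess

    SOURCE = "master_4k.mov"  # hypothetical 4K master file

    # Native-resolution (2160p) encode.
    subprocess.run([
        "ffmpeg", "-i", SOURCE,
        "-c:v", "libx265", "-crf", "18", "-preset", "slow",
        "clip_2160p.mp4",
    ], check=True)

    # Same master downscaled to 1080p with a high-quality scaler.
    subprocess.run([
        "ffmpeg", "-i", SOURCE,
        "-vf", "scale=1920:1080:flags=lanczos",
        "-c:v", "libx265", "-crf", "18", "-preset", "slow",
        "clip_1080p.mp4",
    ], check=True)
    ```

    Both files could then be shown blind on the same panel, which sidesteps the panel-quality differences mentioned above.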

  • Reply 52 of 95
    foggyhill wrote: »
If the source is exactly the same (which is often not the case, which makes testing this really hard), with the best possible encoding, and the original material actually had 4K (or more) of info to encode, then yes, that's the case for the average viewer.

Some people with better-than-average vision would do a bit better than that, obviously, but someone able to see the difference from, say, 15 feet would be a very, very rare freak

This test is nearly impossible to do in real life for an LCD or AMOLED panel because there are so many factors other than resolution to muck up the evaluation.

One funny thing is that while the eye is technically able to see smaller details than that on printed, high-contrast, black-on-white, reflective and well-lit material, this doesn't per se translate to an LCD/OLED, which is typically not as high contrast, especially at high luminosity levels.

People who say that yes, we can see up to 600 DPI routinely drag out studies using this kind of material, which is not representative of current TVs (or those of the next 5 years).

    Often, what people see as a difference in the resolution is in fact either a difference/artefact in the source, or merely a difference in other aspects of the TV.

    There seems to be something missing here. I don't know what to call it but I would bet that our brains can discern pixels from a distance more easily if the images are moving than if they were still, and yet all the material and maths about these distances only seem to deal with a static formula. Even you flat-out stated "4K makes no sense at a normal viewing distance" which is not a complete answer.

Additionally, you wrote, "8K makes no sense for movies (it takes a 200-inch screen where you're sitting 14 feet away to make a difference!)" as if you're saying that a "normal viewing distance" is either much further than 14 feet away (which it isn't), or that if you sit closer to the screen it gets harder to discern pixels (which it doesn't). Something isn't jibing, and you seem to have accidentally deleted part of your original comment before you posted, so perhaps that's what's wrong.
  • Reply 53 of 95
mstone Posts: 11,510 member

     



    Quote:
    Originally Posted by Suddenly Newton View Post

     

    Can't wait for Adobe to glom VP10 for the next version of Flash.



Flash doesn't even encode. You use Media Encoder, which is a separate app. Flash no longer supports even the old FLV or F4V formats, and it also does not support WebM or VP9. It defaults to H.264.

     

     

    Quote:

    Originally Posted by hmm View Post

     

    Why worry about that? Adobe hasn't focused on Flash for some time. It's insignificant to their revenue at this point.


Flash is just part of the CC package. I find it to be the absolute fastest and easiest HTML5 authoring application out there. Adobe certainly hasn't given up on Flash as an authoring platform. There is an impressive list of new features released in CC compared to CS6, including WebGL support and 64-bit iOS export. In fact, they just rewrote the entire format from the ground up for AS3, with XML-based FLA files.

  • Reply 54 of 95
foggyhill Posts: 4,767 member
    Quote:
    Originally Posted by SolipsismY View Post





    There seems to be something missing here. I don't know what to call it but I would bet that our brains can discern pixels from a distance more easily if the images are moving than if they were still, and yet all the material and maths about these distances only seem to deal with a static formula. Even you flat-out stated "4K makes no sense at a normal viewing distance" which is not a complete answer.



Additionally, you wrote, "8K makes no sense for movies (it takes a 200-inch screen where you're sitting 14 feet away to make a difference!)" as if you're saying that a "normal viewing distance" is either much further than 14 feet away (which it isn't), or that if you sit closer to the screen it gets harder to discern pixels (which it doesn't). Something isn't jibing, and you seem to have accidentally deleted part of your original comment before you posted, so perhaps that's what's wrong.

     

    Well, it's the only scientific answer we got.

    Everything else is subjective bull crap.

TV makers obviously push 4K hard despite its near uselessness for most in smaller panels, especially right now, because there are more profits to be made in those new TVs than in improving other parts of 1080p (like gamut).

Marketing always goes for the simple nail to pound on.

     

Improvements in other aspects of 4K panels using new tech, though, often make them better for those reasons, not because of the higher resolution.

     

As I stated, decent studies on moving content that are reproducible are rare to non-existent, probably because of cost and how hard it is to ensure the experiment is unbiased.

     

There are other things besides resolution that affect how good something looks when moving.
Anyone who has seen a 1080p plasma vs. a 4K LCD would know that's the case.

     

     

As for 8K, I got a bit mixed up... Will come back to you on that one.

I do see the use of 8K, as I said, in full-wall media/info panels, where people would be looking at it much closer than a typical TV (more like a computer).

You could use them for the ultimate wallpaper :-).

  • Reply 55 of 95
mstone Posts: 11,510 member
    Quote:
    Originally Posted by SolipsismY View Post





    I don't know what to call it but I would bet that our brains can discern pixels from a distance more easily if the images are moving than if they were still, ...

    The motion has something to do with it. 24-bit color is 16 million something different colors but a 1080 video has almost 21 million pixels. Since there are more pixels than palette there is going to be interpolation and with moving images that can result in neighboring pixels shifting around a bit because each frame is being interpolated slightly differently.

     

Edit: There are professional displays that have 30-bit color, so that would be several million more colors.
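
    For anyone who wants to sanity-check the raw counts being discussed, the plain arithmetic (pixels per frame and palette sizes, with no display modelling) is below:

    ```python
    # Pixels per frame for common resolutions, and palette size per bit depth.
    resolutions = {"1080p": (1920, 1080), "UHD 4K": (3840, 2160)}
    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels per frame")

    for bits in (24, 30):
        print(f"{bits}-bit color: {2 ** bits:,} possible values per pixel")
    ```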

  • Reply 56 of 95
    mstone wrote: »
    The motion has something to do with it. 24-bit color is 16 million something different colors but a 1080 video has almost 21 million pixels. Since there are more pixels than palette there is going to be interpolation and with moving images that can result in neighboring pixels shifting around a bit because each frame is being interpolated slightly differently.

Edit: There are professional displays that have 30-bit color, so that would be several million more colors.

I'd think the brain could potentially see that, even at "normal viewing distances," while pixels 1/4 that size, even at 24-bit, could blend more subtly.
  • Reply 57 of 95
foggyhill Posts: 4,767 member
    Quote:
    Originally Posted by SolipsismY View Post





I'd think the brain could potentially see that, even at "normal viewing distances," while pixels 1/4 that size, even at 24-bit, could blend more subtly.

     

Yes, aliasing effects and gradation can benefit somewhat from 4K beyond what the eyes can see individually (through a kind of perceptual blending), but it still doesn't mean 4K is worthwhile in a 40-inch panel seen at 8 feet for the average viewer. This effect is not well quantified yet, not even by 4K proponents. And there are other motion artefacts in LCDs that may counter this effect.

     

Also, improving resolution without improving contrast, color gamut, dynamic range, etc. doesn't lead to a better image. My 27-inch 2.5K Acer IPS monitor certainly doesn't give me a better image for viewing movies than my 4-year-old 1080p Samsung plasma, because it has far worse contrast (though excellent color reproduction).

     

For now, most current 4K panels look better than older 1080p LCDs simply because they're using newer tech (the panel overall is better), not because 4K is inherently better. A top-end 1080p plasma will blow most of them away without breaking a sweat, and it probably was cheaper too.

     

Too bad panel makers couldn't make a profit out of plasma, because nobody really seemed to care for higher quality (so they couldn't price them higher and protect their margins). Can't believe the garish LCDs with crazy-high brightness/contrast I see at many houses I visit. The colors are just completely horrible. LCDs are slowly improving, but they haven't caught up with even 5-year-old plasmas!

  • Reply 58 of 95
    foggyhill wrote: »
...but it still doesn't mean 4K is worthwhile in a 40-inch panel seen at 8 feet for the average viewer.

I don't think anyone looking at a 4K UHD set who sits 8' or more from their TV is considering a 40" model in 2015 and onwards. What I was talking about was a still image as opposed to moving images that are within a window the brain can easily detect at 1080p vs. 2160p. I did link examples of optical illusions that the brain tries to correct in post, and noted that the math in your chart (and all others) seems to only consider a static differential of pixels via a basic math formula.

What I'd like to see are studies that show whether people can tell if they are watching 1080p or 2160p on a 60" set from 8' away with "normal" vision. Note that puts it completely inside, but not too far inside, the 1080p range of the chart. I hypothesize, based on some very rudimentary tests at stores that sell 1080p and 2160p sets, that I definitely would be able to see the difference and have a better experience with a 60" 4K UHD set over the 1080p, with everything else being the same, from an 8' distance… and I imagine that most people would, hence my concern with the graph's accuracy.
  • Reply 59 of 95
foggyhill Posts: 4,767 member
    Quote:
    Originally Posted by SolipsismY View Post





I don't think anyone looking at a 4K UHD set who sits 8' or more from their TV is considering a 40" model in 2015 and onwards. What I was talking about was a still image as opposed to moving images that are within a window the brain can easily detect at 1080p vs. 2160p. I did link examples of optical illusions that the brain tries to correct in post, and noted that the math in your chart (and all others) seems to only consider a static differential of pixels via a basic math formula.



What I'd like to see are studies that show whether people can tell if they are watching 1080p or 2160p on a 60" set from 8' away with "normal" vision. Note that puts it completely inside, but not too far inside, the 1080p range of the chart. I hypothesize, based on some very rudimentary tests at stores that sell 1080p and 2160p sets, that I definitely would be able to see the difference and have a better experience with a 60" 4K UHD set over the 1080p, with everything else being the same, from an 8' distance… and I imagine that most people would, hence my concern with the graph's accuracy.

     

One of the issues in stores is that often they all get the same source, which is then upscaled by the TV, or they get different sources which may or may not be the same. The issue of 1080p panels being made with older tech than newer panels blurs this test completely, making any assessment iffy at best.

     

    Yes, it would be interesting to see if there is an impact that comes from blending. But the great difficulty in actually doing this test without bias makes me think that it won't be done anytime soon.

     

Also, as I said, there are other effects that occur on LCDs that may make the point moot anyway. Not to mention a lot of post-processing that is activated in stores, which messes up the original signal further, making any judgment of what's on screen suspect.

     

I'd be ready to believe that higher resolution can have a perceptual effect even when pixels can't be seen individually, but only to extend the transition zone between "worth it to buy" areas.

     

Like you said, that may make the 60-inch 4K TV worthwhile at 8 feet even though in theory you shouldn't be able to see the pixels, but not make the 50-incher worthwhile. At 50 inches, buy a better 1080p TV instead. At 60 inches, it may or may not be worthwhile to buy 4K, depending on whether you're able to get a higher-quality 1080p TV at that size than a 4K. At 70 inches, it is probable that 4K panels will be in a top-notch quality TV, so you don't have to choose anything other than 4K.

  • Reply 60 of 95
ksec Posts: 1,569 member
    Quote:

One QT question regarding 4K content in general: Is 4K only available "natively" for current productions? How about old movies from the '80s, '90s, etc.? Is the source material analog, and hence nearly unlimited w.r.t. digitizing? Or is there a natural limit beyond which 4K, 8K, nK is just upscaling?


     

Both Korea and Japan have had native 4K content for a while. And they are planning on 8K for the 2020 Olympics.

     

In terms of viewing distance, you should be looking at the lower boundary of UHD rather than the upper.
