Originally Posted by melgross
I understand this my friend. I've been working with it for over 40 years, and have a couple degrees in biology. I know the limits. But you are wrong in your numbers. First of all resolution is defined as line pairs, not lines.
First of all, while spatial frequency is commonly measured in line pairs (lp/mm in photography), digital resolution is defined in lines (hence 1080). Moreover, human acuity can be measured by the ability to detect a SINGLE line, not just a line pair.
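For concreteness, here's a rough sketch of how those units relate (assuming the standard Snellen criterion that 20/20 vision resolves about 1 arcminute of detail; the distances are just examples):

```python
import math

# Line pairs per mm that a 20/20 eye resolves at a given distance,
# assuming the standard Snellen criterion: ~1 arcminute of detail,
# i.e. ~30 line pairs (cycles) per degree, or ~60 single lines per
# degree. One line pair spans 2 arcminutes (one light + one dark line).
ARCMIN = math.radians(1 / 60)

def lp_per_mm(distance_mm: float) -> float:
    cycle_mm = distance_mm * math.tan(2 * ARCMIN)
    return 1 / cycle_mm

print(f"{lp_per_mm(250):.1f} lp/mm")   # ~6.9 lp/mm at 25 cm reading distance
print(f"{lp_per_mm(3000):.2f} lp/mm")  # ~0.57 lp/mm at a 3 m TV distance
```

And note that single-line detection (hyperacuity) is commonly cited in the low arcseconds, far finer than the 1-arcminute resolution figure.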
If you had read and understood that, you wouldn't have made the statements you did above about how ridiculous it supposedly is to want higher than 1080p resolution.
"Along this line, NHK has developed an ultrahigh-definition video system with a
7680x4320/60p format to explore the future prospects of broadcasting systems (13). We
reported that the 7680x4320 system, which gives the viewers a viewing angle of 100˚ at an
optimum viewing distance, enhances the visual experience (14, 15, 16)."
If 1080p were all that is needed, then you wouldn't be able to enhance the visual experience with higher resolutions. What that document is saying is that 1080 is significantly better than SD while 720 is only marginally so, and therefore 1080p is far preferred. That was hardly news in 2005, and certainly not in 2011.
And it clearly indicates what you get from higher resolution: a greater viewing angle. That is the complete opposite of what you claimed, namely that greater resolution forces a tradeoff against field of view. In fact, as shown by your own reference, the opposite is true, just as I stated.
With 1080p we are nowhere near the resolution required to saturate human visual perception.
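A quick sanity check on the numbers (a rough sketch, assuming the classic 20/20 criterion of 1 arcminute per pixel, i.e. ~60 pixels per degree; the viewing angles are illustrative):

```python
# Angular pixel density, assuming a 20/20 threshold of ~60 px/deg
# (1 arcminute per pixel). Single-line (hyperacuity) detection is far
# finer -- commonly cited in the low arcseconds -- so visible detail
# does not simply stop at this line.
PX_PER_DEG_2020 = 60

for name, h_pixels, view_angle in [
    ("1080p at a 30 deg viewing angle", 1920, 30),
    ("1080p at NHK's 100 deg angle",    1920, 100),
    ("7680x4320 at 100 deg (NHK 8K)",   7680, 100),
]:
    ppd = h_pixels / view_angle
    verdict = "above" if ppd >= PX_PER_DEG_2020 else "below"
    print(f"{name}: {ppd:.0f} px/deg ({verdict} the 20/20 line)")
```

Widen the viewing angle to NHK's 100˚ and 1080p falls to roughly a third of the 20/20 threshold; it takes 8K-class resolution to get back above it. Higher resolution buys viewing angle, exactly as your reference says.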
Originally Posted by melgross
I know a hell of a lot more about this than you do. It's nice that you're going to the Internet for your answers, but they're simplified as well.
It's interesting that you can claim to know a hell of a lot more about something without knowing anything about the person you're making the claim against. Arrogant much?
In any case, your appeals to personal expertise would be more compelling if you hadn't written stuff that is clearly wrong.
Video is very different from what we can see in real life. In addition, while it's certainly true that we can, in theory, see something that is below our vision's ability, it's only under very special circumstances,
Special circumstances like "sky"? What an esoteric example! Why, I bet no one on this forum has ever seen the sky!
The other "special circumstance" that comes to mind are the fine lines inherent in digital displays. There's even a name for this for LCD front projectors: The Screen Door Effect. When you can see the LCD grid lines in scenes so much that the image appears to be behind a screen door.
Even at 1080p I can generally discern the grid in light-colored scenes on LCD projectors. DLP projectors have a tighter grid (a greater fill percentage), so the effect is weaker, and some LCD projectors use a diamond or other pixel pattern to reduce it.
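To put a number on that (a sketch; the image width, the 10% inter-pixel gap, and the viewing distance are illustrative assumptions, not measurements of any particular projector):

```python
import math

# Angular size of the inter-pixel gap on a projected 1080p image.
# Assumed setup: a 100-inch-wide image, ~10% of each pixel pitch lost
# to the gap, viewed from 3 m. Single dark lines can be detected down
# to a few arcseconds (hyperacuity), well below the classic
# 1-arcminute (60 arcsec) resolution limit.
image_width_mm = 100 * 25.4
pixel_pitch_mm = image_width_mm / 1920
gap_mm = 0.10 * pixel_pitch_mm
distance_mm = 3000

gap_arcsec = math.degrees(math.atan(gap_mm / distance_mm)) * 3600
print(f"grid-line width: {gap_arcsec:.0f} arcsec")  # ~9 arcsec
```

By the naive 1-arcminute rule a ~9-arcsecond line should be invisible, yet people see the screen door anyway, because detecting a single dark line is a hyperacuity task, not a resolution task.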
Contrast is extremely important for this, and the contrast ratios in the real world exceed those on film and in video by a great amount. The two aren't comparable.
The lower the contrast, the less resolution. So a very small bright light coming from a small hole in a black field, in a dark room, will be seen even though it's far smaller than our acuity allows. But make that a white spot in similar viewing conditions, and we won't see it. Make the spot larger, and when it equals our acuity, we will see it again. But make it a light grey, and we won't see it until it's made larger again.
Keep making that spot darker, and at some point, we won't be able to see it at all, no matter how large it becomes.
I read a lot of arguments about how much we can see, but rarely the practical realization that this is almost impossible with film and video. Even if a few objects can sometimes be seen on a screen because of some unusual circumstances of local contrast, it doesn't work for the picture overall. And it's not those specific things that make it correct.
Oh, bullshit. If folks have a term that describes the effect of seeing sub-pixel-width lines on LCD front projectors, then there's certainly enough contrast ratio to see the effect in real life, on real equipment that people have in their homes. It ALSO proves that many people can see lines narrower than a pixel even on 1080p projectors, which indicates that the desire for greater-than-1080p resolution is not as ridiculous as you portrayed it.
What you're trying to do here is throw out enough unrelated material to make it seem like what you wrote was misinterpreted. Unfortunately for you, scathing ridicule leaves little room for claiming that its nuances were misinterpreted.
There's a famous paper on vision that's been referred to over the years, but it's flawed, because it only used green pixels on a Commodore screen.
A famous paper that no one here has referenced.
The problem with this, of course, is that we're much more sensitive to green-yellow light than to other colors, and can see finer lines in green. Yet that study is referred to as being almost the definitive answer, when it's not. A lot of what we read comes from that study, though, even though it doesn't state its findings as definite. I had it bookmarked, but can't find it.
So famous and fundamental that Google can't find it. You really want people to believe that what we know about human visual acuity comes from some study that used Commodore computers with a max resolution of 320x200?
We're also talking about 20/20, not the few who have even better vision; otherwise we may as well talk about those with inferior vision. So it's the average person we're talking about, and that's what these standards refer to.
Too bad the medical literature says differently. Everything from the 1862 De Haan population study to the 1995 Elliot study of changes in human acuity as a function of age clearly shows that the average population has better than 20/20 vision until around age 55.
Oddly, I don't think they had Commodore computers in 1862.
You can find this in Duane's Clinical Ophthalmology, Volume 5.
Even folks who started with less-than-stellar vision often end up with 20/16 vision; correcting to 20/16 is common with LASIK procedures.
Originally Posted by melgross
Actually, resolution is only a part of the issue. In fact, to a certain extent, it's the least important part. It's why even the best high-resolution video games look fake, while a very low-res photo looks "real". You've got part of that right in your last sentence.
Too bad that resolution was the only part you ridiculed, and therefore the most relevant part of this discussion. I would argue that it's not the least important part, given that it's tied to what I (and many others) believe IS the most important aspect of immersive experiences: field of view.
Sufficient field of view trumps both contrast and color for immersive video presentation, but only if the image isn't so pixelated that it becomes distracting and pulls you out of the immersion.
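As a rough back-of-the-envelope (assuming the 20/20 figure of ~60 pixels per degree and treating pixel density as uniform across the field, which a flat screen only approximates):

```python
# Horizontal pixels needed just to hold the classic 20/20 limit
# (~60 px/deg) across a given field of view. These are floors, not
# saturation points: single-line detection is finer still.
PX_PER_DEG_2020 = 60

for fov_deg in (30, 60, 100):
    print(f"{fov_deg:>3} deg FOV needs >= {fov_deg * PX_PER_DEG_2020} horizontal pixels")
# 30 deg  -> 1800  (roughly 1080p)
# 60 deg  -> 3600  (roughly 4K)
# 100 deg -> 6000  (approaching NHK's 7680)
```

Which is why a wide, immersive field of view and greater-than-1080p resolution go hand in hand.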