
Apple selling half a million Apple TVs per quarter but no update planned for Q3 - Page 4

post #121 of 138
Quote:
Originally Posted by Tallest Skil View Post

But he said (and I agreed) that Super Hi-Vision IS, for all intents and purposes, the maximum resolution needed to be a retina display.

Until you turn your head.
post #122 of 138
Quote:
Originally Posted by e1618978 View Post

A human with 20/20 vision can see lines spaced 1 arc minute from each other, and 30 degrees of viewing angle provides a good immersive experience.

So a "future proof" TV would be able to display 3600 discrete lines horizontally - 7200 pixels across (one pixel for the line, one for the space between the lines). Something like a 3600 vertical by 7200 horizontal progressive scan at 75 Hz would be "future proof" until we develop bionic eyes - maybe 4000x8000 if you want to include the rare people with exceptional vision.

And so you intend to sit two feet in front of your large-screen TV? While throwing that number around, do you understand what it means for actual viewing? I don't think you do.

In order to see all the lines of a display, you need to sit a certain distance from it, and no further.

So, if you've got a small 30" TV and expect to get the full rez from it, you can't sit more than about 4 feet away, assuming 20/20 vision. If you have a (non-existent, so far) 1440 rez display, you have to sit no more than about 2.5 feet away.

It gets better with larger displays, of course. With a 50" model, it's about 6.5 feet for 1080p, and 5 feet for 1440.

And you want, what, 4000p? So you intend to watch with your nose pressed against the screen?
post #123 of 138
Quote:
Originally Posted by melgross View Post

And so you intend to sit two feet in front of your large-screen TV? While throwing that number around, do you understand what it means for actual viewing? I don't think you do.

In order to see all the lines of a display, you need to sit a certain distance from it, and no further.

So, if you've got a small 30" TV and expect to get the full rez from it, you can't sit more than about 4 feet away, assuming 20/20 vision. If you have a (non-existent, so far) 1440 rez display, you have to sit no more than about 2.5 feet away.

It gets better with larger displays, of course. With a 50" model, it's about 6.5 feet for 1080p, and 5 feet for 1440.

And you want, what, 4000p? So you intend to watch with your nose pressed against the screen?

My experience with NHK's UHDV aligns a lot more with e1618978's figures than yours.
post #124 of 138
Quote:
Originally Posted by JeffDM View Post

My experience with NHK's UHDV aligns a lot more with e1618978's figures than yours.

They aren't mine; they're the standard industry numbers, and his make no sense, so your experience can't align better with them.
post #125 of 138
Quote:
Originally Posted by melgross View Post

And so you intend to sit two feet in front of your large-screen TV?

I specified a 30 degree field of view, which will be uncomfortable on small screens but just fine with front projectors.

EDIT - I had an error in my math earlier: 30 degrees x 60 arc minutes is 1800 discrete lines, so 2000x4000 is all that is required for 20/20 vision; 4000x8000 is for 20/10 vision.
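
(A quick sketch of that arithmetic in Python; pixels_needed is just an illustrative helper, assuming the thread's convention of one pixel per resolvable line plus one per gap.)

```python
def pixels_needed(fov_deg, acuity_arcmin=1.0):
    # Lines a viewer can separate across the field of view.
    resolvable_lines = fov_deg * 60 / acuity_arcmin
    # One pixel per line plus one per gap, per the convention above.
    return 2 * resolvable_lines

print(pixels_needed(30))       # 3600.0 -> rounds up to the "2000x4000" frame (20/20)
print(pixels_needed(30, 0.5))  # 7200.0 -> the "4000x8000" frame (20/10)
print(pixels_needed(40))       # 4800.0 -> the 40-degree figure that comes up below
```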
post #126 of 138
Quote:
Originally Posted by e1618978 View Post

I specified a 30 degree field of view, which will be uncomfortable on small screens but just fine with front projectors.

EDIT - I had an error in my math earlier: 30 degrees x 60 arc minutes is 1800 discrete lines, so 2000x4000 is all that is required for 20/20 vision; 4000x8000 is for 20/10 vision.

I'd just like to see you get more specific, and mention the actual distance and the diagonal of the screen.
post #127 of 138
Quote:
Originally Posted by melgross View Post

I'd just like to see you get more specific, and mention the actual distance and the diagonal of the screen.

The viewer sits 1.87 times as far from the screen as the screen is wide. So for a 100" wide screen, you would have a 30 degree field of view if you sat 15.58 feet away.
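
(That ratio is plain flat-screen trigonometry; a minimal sketch, with distance_for_fov as a hypothetical helper.)

```python
import math

def distance_for_fov(screen_width, fov_deg=30.0):
    # Half the width over the tangent of half the viewing angle.
    return (screen_width / 2) / math.tan(math.radians(fov_deg / 2))

print(distance_for_fov(1.0))       # ~1.866 screen widths (the 1.87 above, rounded)
print(distance_for_fov(100) / 12)  # ~15.5 feet for a screen 100 inches wide
```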

post #128 of 138
Quote:
Originally Posted by e1618978 View Post

The viewer sits 1.87 times as far from the screen as the screen is wide. So for a 100" wide screen, you would have a 30 degree field of view if you sat 15.58 feet away.


That's not the way you measure performance from a resolution perspective. You are talking about the motion picture standard of field of view. Unfortunately, field of view and resolution standards conflict. If you sit that far from the screen with a 1080p screen, you won't get all of the resolution. The situation gets worse the higher the resolution is. You have to decide whether field of view or resolution is more important.
post #129 of 138
Quote:
Originally Posted by melgross View Post

That's not the way you measure performance from a resolution perspective. You are talking about the motion picture standard of field of view. Unfortunately, field of view and resolution standards conflict. If you sit that far from the screen with a 1080p screen, you won't get all of the resolution. The situation gets worse the higher the resolution is. You have to decide whether field of view or resolution is more important.

It does not matter how far away the screen is; the limit of your vision is 3600x1800. That is basically the definition of 20/20 vision: if you can't see that, then you don't have 20/20 vision. Somebody with 20/20 vision can see two lines 1 arc minute away from each other, and there are 1800 arc minutes in 30 degrees (i.e. 3600 pixels). If the screen is 10 feet away, the two lines must be closer together to be 1 arc minute apart, but the maximum resolution of a 20/20 eye is still 3600x1800.
post #130 of 138
Quote:
Originally Posted by e1618978 View Post

It does not matter how far away the screen is; the limit of your vision is 3600x1800. That is basically the definition of 20/20 vision: if you can't see that, then you don't have 20/20 vision. Somebody with 20/20 vision can see two lines 1 arc minute away from each other, and there are 1800 arc minutes in 30 degrees (i.e. 3600 pixels). If the screen is 10 feet away, the two lines must be closer together to be 1 arc minute apart, but the maximum resolution of a 20/20 eye is still 3600x1800.

I understand this very well, and your explanation isn't correct. The numbers you are giving are exclusively for 1080p. They are NOT for a higher resolution system.

It should also be noted that for THX viewing, a 40 degree angle is recommended. Pick your poison.

At any rate, there is a formula that must be used to find the viewing distance if the greatest resolution at the eye is wanted:

VD = (DS x NVR) / (CVR x sqrt(NHR^2 + NVR^2) x tan(1/60))

Where:
VD: Viewing distance
DS: Display's diagonal size
NHR: Display's native horizontal resolution (in pixels)
NVR: Display's native vertical resolution (in pixels)
CVR: Vertical resolution of the video being displayed (in pixels)
Note: The 1/60 in the tangent is one arc minute expressed in degrees, so make sure the angle mode is set to degrees when calculating it. If using Excel, you must multiply the angle by PI()/180. If DS is given in inches, VD will be in inches. If VD in meters is desired, multiply VD by 2.54 and divide by 100.
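
(A sketch of that formula in code; viewing_distance is an illustrative name, and the printed values reproduce the distances quoted earlier in the thread.)

```python
import math

def viewing_distance(ds, nhr, nvr, cvr):
    # Native pixel pitch: diagonal size over the diagonal pixel count.
    pixel_pitch = ds / math.sqrt(nhr**2 + nvr**2)
    # Scale the pixel size when the content is below native resolution.
    effective = pixel_pitch * nvr / cvr
    # Distance at which that pixel subtends one arc minute (1/60 degree).
    return effective / math.tan(math.radians(1 / 60))

print(viewing_distance(50, 1920, 1080, 1080) / 12)  # ~6.5 ft for a 50" 1080p set
print(viewing_distance(30, 1920, 1080, 1080) / 12)  # ~3.9 ft, the "about 4 feet" 30" figure
```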

These are some of the factors that must be taken into account when doing this. You have to plug in the resolution of the system in question when you do it. For 2K, you must use that instead of 1080.

There are numerous articles on this.

This is a pretty good, well-known, and mathematically fairly simple article about this. Notice that all the numbers are based on 720p and 1080p. As resolution increases, so does the arc of view. For 1080p, it's about 30 degrees. For 720p, it's about 20. Higher resolutions will require a greater arc. You can't use 30 degrees for anything higher than 1080p.

Use the formula to derive that arc, and thus, the distance for full resolution viewing.
post #131 of 138
mel - I just don't think you are understanding my point. Your formulas (from the movie industry) don't match mine (which are from optometry). I tend to trust an optometrist over a movie tech when trying to figure out the maximum resolution that a person can see.

The definition of 20/20 vision is based on angles: if you have a 40 degree field of view, then the maximum resolution for a 20/20 set of eyes should be 40 (degrees) * 60 (arc minutes/degree) * 2 (one pixel for the line, one for the space between lines) = 4800 pixels wide.

If the movie industry has a different formula that they use, I think that they are underestimating the maximum resolution of the human eye.
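
(To see the size of the disagreement in one number: a back-of-the-envelope sketch comparing pixels per degree, assuming a 1920-wide 1080p image filling a 30-degree view.)

```python
needed_ppd = 60 * 2          # px/degree for 20/20 by the line-plus-gap convention
hdtv_ppd = 1920 / 30         # what 1080p delivers across a 30-degree view
print(needed_ppd, hdtv_ppd)  # 120 vs 64.0 -- roughly a 2x gap between the two camps
```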
post #132 of 138
Quote:
Originally Posted by melgross View Post

And so you intend to sit two feet in front of your large-screen TV? While throwing that number around, do you understand what it means for actual viewing? I don't think you do.

In order to see all the lines of a display, you need to sit a certain distance from it, and no further.

Sure, but that limit is not 1080. In fact the resolution required to reproduce reality even for folks with 20/20 vision far exceeds 1080.

You cannot, for example, perceive very fine power lines against the sky on a 1080p display even when you can see them in reality. 1 arc minute is a simplification/rule of thumb. In reality we can see a black line on a bright background if the line is at least 0.5-0.8 arc seconds in width; hence our ability to see power lines against the sky. When the limits of resolution are exceeded, the lines stop appearing thinner and are instead perceived as lighter, but the human visual system can still meaningfully perceive them. When that information isn't present, we can sense its absence.

Groups of lines are harder to resolve, but the eye can resolve two black lines on a bright background if the distance between the lines is at least 40 arc seconds.

Then you take into account that 20/20 vision isn't actually the average human ability but the LOWER limit of normal. Normal vision for healthy eyes is around 20/16 to 20/12.
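
(A rough sketch of what that single-line threshold implies; the 1 cm wire is a made-up example, the 0.5 arc second figure is from the post above.)

```python
import math

wire_width = 0.01                     # a hypothetical 1 cm power line, in meters
threshold = math.radians(0.5 / 3600)  # 0.5 arc seconds, converted to radians
print(wire_width / threshold)         # ~4125 m: detectable from kilometers away
```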

Quote:
So, if you've got a small 30" TV and expect to get the full rez from it, you can't sit more than about 4 feet away, assuming 20/20 vision. If you have a (non-existent, so far) 1440 rez display, you have to sit no more than about 2.5 feet away.

This is false, as indicated above. And if you've seen a Gigapxl image in real life, you realize that we can sense detail that is missing from any 1080p HD image. That lack makes it less lifelike.

1440 displays are not non-existent. The Dell U2711 is a 2560x1440 monitor; it's $999 and you can buy one today. And you don't need to sit closer than 2.5 feet away in order to gain a benefit from having one.

Quote:
It gets better with larger displays, of course. With a 50" model, it's about 6.5 feet for 1080p, and 5 feet for 1440.

And you want, what, 4000p? So you intend to watch with your nose pressed against the screen?

No, he wants a display that can actually replicate what you see with your eyeballs. Perhaps your ridicule would have more merit if you actually knew what the hell you were talking about, as opposed to THINKING you know what the hell you're talking about.

Oh, and from a strictly home theater perspective, the ability to sit 5 feet from a 1440p display vs 6.5 feet for a 1080p one means you aren't sitting in the equivalent of the FURTHEST seat in a movie theater but somewhere closer to the middle. The 30-degree horizontal viewing angle is the MINIMUM that the NHK study showed would induce immersion. Being able to get to 40+ is far better, even if this means a closer seating placement than most folks are comfortable with for a 50" HDTV.

A 4K display means that you could achieve the theater experience in the home if you don't normally sit in the furthest row of the theater but rather close to the front.

A 65" Bravia is only $4499 so it isn't unreasonable to expect that folks will get ever larger HDTV sets where closer seating (as a function of screen height) becomes more common place. Frankly, my kids like sitting way close to both the HDTV and the 100" HT FP setup I have. And yeah, at 5 feet from a 100" screen the 1080 rez is pretty craptastic but they like it there.
post #133 of 138
Quote:
Originally Posted by melgross View Post

That's not the way you measure performance from a resolution perspective.

It is one way you can. The higher the FOV, the higher the required resolution. We typically speak of it differently, but the basic relationship can be described that way.

Quote:
You are talking about the motion picture standard of field of view. Unfortunately, field of view and resolution standards conflict.

No they don't. They are completely interrelated and the opposite of conflicted. The minimum acceptable resolution (1080) was determined by the minimum desired field of view (30 degrees HVA).

Quote:
If you sit that far from the screen with a 1080p screen, you won't get all of the resolution. The situation gets worse the higher the resolution is. You have to decide whether field of view or resolution is more important.

This is completely wrong. You have to decide whether field of view or resolution is more important WHEN THE RESOLUTION IS TOO LOW.

The higher the resolution, the BETTER the situation, because you can get a better field of view than the minimum and STILL have enough resolution to meet the desired minimum for 20/20 vision... which, as I already showed in the previous post, isn't actually sufficient to recreate reality in full detail.

It'll be a long time before displays are actually high enough in resolution that you can use an HDTV to replace a window and not be able to tell the difference, resolution-wise. After that you have contrast and color to deal with.

1080p is a starting point, and a far cry from an end point.
post #134 of 138
Quote:
Originally Posted by e1618978 View Post

mel - I just don't think you are understanding my point. Your formulas (from the movie industry) don't match mine (which are from optometry). I tend to trust an optometrist over a movie tech when trying to figure out the maximum resolution that a person can see.

The definition of 20/20 vision is based on angles: if you have a 40 degree field of view, then the maximum resolution for a 20/20 set of eyes should be 40 (degrees) * 60 (arc minutes/degree) * 2 (one pixel for the line, one for the space between lines) = 4800 pixels wide.

If the movie industry has a different formula that they use, I think that they are underestimating the maximum resolution of the human eye.

I understand this, my friend. I've been working with it for over 40 years, and I have a couple of degrees in biology. I know the limits. But you are wrong in your numbers. First of all, resolution is defined as line pairs, not lines.

I also forgot to include the link to the article, which I will add here.

http://www.nhk.or.jp/digital/en/tech...pdf/02_1_1.pdf
post #135 of 138
Quote:
Originally Posted by nht View Post

Sure, but that limit is not 1080. In fact the resolution required to reproduce reality even for folks with 20/20 vision far exceeds 1080.

You cannot, for example, perceive very fine power lines against the sky on a 1080p display even when you can see them in reality. 1 arc minute is a simplification/rule of thumb. In reality we can see a black line on a bright background if the line is at least 0.5-0.8 arc seconds in width; hence our ability to see power lines against the sky. When the limits of resolution are exceeded, the lines stop appearing thinner and are instead perceived as lighter, but the human visual system can still meaningfully perceive them. When that information isn't present, we can sense its absence.

Groups of lines are harder to resolve, but the eye can resolve two black lines on a bright background if the distance between the lines is at least 40 arc seconds.

Then you take into account that 20/20 vision isn't actually the average human ability but the LOWER limit of normal. Normal vision for healthy eyes is around 20/16 to 20/12.



This is false, as indicated above. And if you've seen a Gigapxl image in real life, you realize that we can sense detail that is missing from any 1080p HD image. That lack makes it less lifelike.

1440 displays are not non-existent. The Dell U2711 is a 2560x1440 monitor; it's $999 and you can buy one today. And you don't need to sit closer than 2.5 feet away in order to gain a benefit from having one.



No, he wants a display that can actually replicate what you see with your eyeballs. Perhaps your ridicule would have more merit if you actually knew what the hell you were talking about, as opposed to THINKING you know what the hell you're talking about.

Oh, and from a strictly home theater perspective, the ability to sit 5 feet from a 1440p display vs 6.5 feet for a 1080p one means you aren't sitting in the equivalent of the FURTHEST seat in a movie theater but somewhere closer to the middle. The 30-degree horizontal viewing angle is the MINIMUM that the NHK study showed would induce immersion. Being able to get to 40+ is far better, even if this means a closer seating placement than most folks are comfortable with for a 50" HDTV.

A 4K display means that you could achieve the theater experience in the home if you don't normally sit in the furthest row of the theater but rather close to the front.

A 65" Bravia is only $4499 so it isn't unreasonable to expect that folks will get ever larger HDTV sets where closer seating (as a function of screen height) becomes more common place. Frankly, my kids like sitting way close to both the HDTV and the 100" HT FP setup I have. And yeah, at 5 feet from a 100" screen the 1080 rez is pretty craptastic but they like it there.

I know a hell of a lot more about this than you do. It's nice that you're going to the Internet for your answers, but they're simplified as well.

Video is very different from what we can see in real life. In addition, while it's certainly true that we can, in theory, see something that is below our vision's ability, it's only under very special circumstances, and it doesn't apply to general viewing. Contrast is extremely important for this, and the contrast ratios in the real world exceed those on film and in video by a great amount. The two aren't comparable.

The lower the contrast, the less resolution. So a very small bright light coming from a small hole in a black field, in a dark room, will be seen, even though it's far smaller than our acuity allows. But make that a white spot in similar viewing conditions, and we won't see it. Make the spot larger, and when it equals our acuity, we will see it again. But make it a light grey, and we won't see it until it's made larger again.

Keep making that spot darker, and at some point, we won't be able to see it at all, no matter how large it becomes.

I read a lot of arguments about how much we can see, but rarely the practical realization that this is almost impossible with film and video. Even if a few objects can sometimes be seen on a screen because of some unusual circumstances of local contrast, it doesn't work for the picture overall. And those specific cases don't make the general claim correct.

There's a famous paper on vision that's been referred to over the years, but it's flawed, because it only used green pixels on a Commodore screen. The problem with this, of course, is that we're much more sensitive to green-yellow light than to other colors, and can see finer lines in green. Yet that study is referred to as being almost the definitive answer, when it's not. A lot of what we read comes from that study, though, even though it doesn't state its findings as definite. I had it bookmarked, but can't find it.

We're also talking about 20/20, not the few who have even better vision; otherwise we may as well talk about those with inferior vision too. So it's the average person we're talking about, and that's what these standards refer to.
post #136 of 138
Quote:
Originally Posted by nht View Post

It is one way you can. The higher the FOV, the higher the required resolution. We typically speak of it differently, but the basic relationship can be described that way.



No they don't. They are completely interrelated and the opposite of conflicted. The minimum acceptable resolution (1080) was determined by the minimum desired field of view (30 degrees HVA).



This is completely wrong. You have to decide whether field of view or resolution is more important WHEN THE RESOLUTION IS TOO LOW.

The higher the resolution, the BETTER the situation, because you can get a better field of view than the minimum and STILL have enough resolution to meet the desired minimum for 20/20 vision... which, as I already showed in the previous post, isn't actually sufficient to recreate reality in full detail.

It'll be a long time before displays are actually high enough in resolution that you can use an HDTV to replace a window and not be able to tell the difference, resolution-wise. After that you have contrast and color to deal with.

1080p is a starting point, and a far cry from an end point.

Actually, resolution is only part of the issue. In fact, to a certain extent, it's the least important part. It's why even the best high-resolution video games look fake, while a very low-rez photo looks "real". You've got part of that right in your last sentence.
post #137 of 138
Quote:
Originally Posted by melgross View Post

I understand this, my friend. I've been working with it for over 40 years, and I have a couple of degrees in biology. I know the limits. But you are wrong in your numbers. First of all, resolution is defined as line pairs, not lines.

First of all, while spatial frequency is commonly measured in line pairs, digital resolution is defined in lines (hence 1080), as opposed to the lp/mm used for photography, and human acuity can be measured by the ability to resolve a SINGLE line, not a line pair.

Quote:
I also forgot to include the link to the article, which I will add here.

http://www.nhk.or.jp/digital/en/tech...pdf/02_1_1.pdf


If you had read and understood it, you wouldn't have made the statements you did above about how ridiculous it is to want higher than 1080 resolution.

"Along this line, NHK has developed an ultrahigh-definition video system with a
7680x4320/60p format to explore the future prospects of broadcasting systems (13). We
reported that the 7680x4320 system, which gives the viewers a viewing angle of 100˚ at an
optimum viewing distance, enhances the visual experience (14, 15, 16)."

If 1080p were all that is needed, then you wouldn't be able to enhance the visual experience with higher resolutions. What that document is saying is that 1080 is significantly better than SD, and 720 only marginally so... therefore 1080p is far preferred. This was hardly news in 2005, and certainly not in 2011.

And it clearly indicates what you get from higher resolution: greater viewing angle. That is the complete opposite of what you claimed: that with greater resolution you have to make greater field-of-view tradeoffs. In fact, as shown by your own reference, the opposite is true... just as I stated.

We are nowhere near the resolution required to saturate human visual perception with 1080p.
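
(For scale, a sketch using the NHK figures quoted above and the thread's line-plus-gap convention; treat it as illustrative arithmetic, not NHK's own analysis.)

```python
arcmin_per_pixel = 100 * 60 / 7680  # ~0.78 arc minutes per pixel at a 100-degree view
pixels_for_2020 = 100 * 60 * 2      # 12000: the 20/20 target by the line-plus-gap rule
print(arcmin_per_pixel, pixels_for_2020)  # even 7680 across falls short of that target
```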

Quote:
Originally Posted by melgross View Post

I know a hell of a lot more about this than you do. It's nice that you're going to the Internet for your answers, but they're simplified as well.

It's interesting that you can claim to know a hell of a lot more about something without knowing anything about the person you're making the claim against. Arrogant much?

In any case, your appeals to personal expertise would be more compelling if you hadn't written stuff that is clearly wrong.

Quote:
Video is very different from what we can see in real life. In addition, while it's certainly true that we can, in theory, see something that is below our vision's ability, it's only under very special circumstances,

Special circumstances like "sky"? What an esoteric example! Why, I bet no one on this forum has ever seen the sky!

The other "special circumstance" that comes to mind are the fine lines inherent in digital displays. There's even a name for this for LCD front projectors: The Screen Door Effect. When you can see the LCD grid lines in scenes so much that the image appears to be behind a screen door.

Even with 1080p, I can generally discern the grid in light-colored scenes on LCD projectors. DLP projectors are tighter (greater percentage fill), so it's not as noticeable, and some LCD projectors use a diamond or other kind of pattern to reduce this effect.

Quote:
Contrast is extremely important for this, and the contrast ratios in the real world exceed those on film and in video by a great amount. The two aren't comparable.

The lower the contrast, the less resolution. So a very small bright light coming from a small hole in a black field, in a dark room, will be seen, even though it's far smaller than our acuity allows. But make that a white spot in similar viewing conditions, and we won't see it. Make the spot larger, and when it equals our acuity, we will see it again. But make it a light grey, and we won't see it until it's made larger again.

Keep making that spot darker, and at some point, we won't be able to see it at all, no matter how large it becomes.

I read a lot of arguments about how much we can see, but rarely the practical realization that this is almost impossible with film and video. Even if a few objects can sometimes be seen on a screen because of some unusual circumstances of local contrast, it doesn't work for the picture overall. And those specific cases don't make the general claim correct.

Oh, bullshit. If folks have a term that describes the effect of seeing sub-pixel-width lines on LCD front projectors, there's certainly enough contrast ratio that you can see the effect in real life, on real-life equipment that people have in their homes. It ALSO proves that many people can resolve lines smaller than pixel size even on 1080p projectors, which indicates that the desire for greater-than-1080p resolution is not as ridiculous as you portrayed it.

What you're trying to do here is throw out enough unrelated material to make it seem like what you wrote was misinterpreted. Unfortunately for you, scathing ridicule leaves little margin for claiming misinterpretation of the nuances.

Quote:
There's a famous paper on vision that's been referred to over the years, but it's flawed, because it only used green pixels on a Commodore screen.

A famous paper that no one here has referenced.

Quote:
The problem with this, of course, is that we're much more sensitive to green-yellow light than to other colors, and can see finer lines in green. Yet that study is referred to as being almost the definitive answer, when it's not. A lot of what we read comes from that study, though, even though it doesn't state its findings as definite. I had it bookmarked, but can't find it.

So famous and fundamental that Google can't find it. You really want folks to believe that what we know about human visual acuity comes from some study that used Commodore computers with a max resolution of 320x200?

Quote:
We're also talking about 20/20, not the few who have even better vision; otherwise we may as well talk about those with inferior vision too. So it's the average person we're talking about, and that's what these standards refer to.

Too bad the medical literature says differently. Everything from the 1862 De Haan population study to the 1995 Elliot study of changes in human acuity as a function of age clearly shows that the average person has better than 20/20 vision until around age 55.

Oddly, I don't think they had Commodore computers in 1862.

You can find this in Duane's Clinical Ophthalmology, Volume 5.

Even folks that started with less than stellar vision often end up with 20/16 vision. Correcting to 20/16 is common for LASIK procedures.

Quote:
Originally Posted by melgross View Post

Actually, resolution is only part of the issue. In fact, to a certain extent, it's the least important part. It's why even the best high-resolution video games look fake, while a very low-rez photo looks "real". You've got part of that right in your last sentence.

Too bad that resolution was the only part you ridiculed, and therefore the most relevant part of the discussion. I would argue that it's not the least important part, given that it's tied to what I (and many others) believe IS the most important aspect of immersive experiences: field of view.

Sufficient field of view trumps both contrast and color for immersive video presentation, but only if the resolution isn't so pixelated as to be distracting and pull you out of the immersion effect.
post #138 of 138
Quote:
Originally Posted by e1618978 View Post

It does not matter how far away the screen is; the limit of your vision is 3600x1800. That is basically the definition of 20/20 vision: if you can't see that, then you don't have 20/20 vision. Somebody with 20/20 vision can see two lines 1 arc minute away from each other, and there are 1800 arc minutes in 30 degrees (i.e. 3600 pixels). If the screen is 10 feet away, the two lines must be closer together to be 1 arc minute apart, but the maximum resolution of a 20/20 eye is still 3600x1800.

Like I mentioned before, we don't necessarily watch TV with a neck brace attached to our head that keeps us looking in one direction. We can eventually benefit from higher resolution than this.