HD Resolution

Posted in General Discussion, edited January 2014
What exactly is the resolution of high-definition video, in terms of pixels? Apple's web page for the 23-inch Cinema HD Display says that its 1920 x 1200 resolution is "more than enough to support high-definition (HD) content in its native format," but it never states what the exact resolution of HD is.

Comments

  • Reply 1 of 11
    cubedude Posts: 1,556 member
    In television, there are three resolutions:

    1080i
    720p
    480p/i

    P = progressive
    I = interlaced

    HD is 1080, and that's about all I know about it.
  • Reply 2 of 11
    eugene Posts: 8,254 member
    Both 720p (typically 1280x720) and 1080i (typically 1920x1080) are HD resolutions.
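    To tie this back to the original question, here is a minimal sketch (plain Python, using only the figures already given in this thread) of why a 1920 x 1200 panel covers both HD formats pixel-for-pixel:

    ```python
    # Does a 1920x1200 panel fit each HD frame natively?
    display_w, display_h = 1920, 1200  # Apple 23" Cinema HD Display

    hd_formats = {
        "720p": (1280, 720),
        "1080i": (1920, 1080),
    }

    for name, (w, h) in hd_formats.items():
        fits = w <= display_w and h <= display_h
        spare_rows = display_h - h
        print(f"{name}: {w}x{h} fits natively: {fits} ({spare_rows} rows to spare)")
    # 720p: 1280x720 fits natively: True (480 rows to spare)
    # 1080i: 1920x1080 fits natively: True (120 rows to spare)
    ```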
  • Reply 3 of 11
    cosmonut Posts: 4,872 member
    Here's some other basic info. I got it from doing a very SIMPLE GOOGLE SEARCH.



    http://www.digitalconnection.com/FAQ/HDTV_6.asp



    ahem.
  • Reply 4 of 11
    1080i is an interlaced resolution where, if both odd and even fields were being displayed at the exact same time (which they never are), you would have a pixel count of about 1920x1080. Since it is interlaced though, at any given point in time, you only have an effective resolution of about 960x1080. This is why a lot of people are happier with 720p, which has a progressively scanned resolution of 1280x720 at any given point in time. Your frame rates differ between the standards as well.



    So Apple's Cinema HD display is more than ample. If you were working on a 1080i composite in combustion, you would be working with a canvas that was 1920x1080, but when that gets tossed back out to tape for broadcast, your animation is split into the odd and even fields.
  • Reply 5 of 11
    eugene Posts: 8,254 member
    Quote:

    Originally posted by LoCash

    Since it is interlaced though, at any given point in time, you only have an effective resolution of about 960x1080.



    Er...



    That would be 1920x540. The interlacing is done with the horizontal scan lines as expected.
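    To make the field arithmetic concrete, here is a minimal sketch (plain Python with NumPy; the 1920x1080 frame size comes from the posts above) of how interlacing splits a frame into odd and even fields of horizontal scan lines, which is why each field carries 1920x540:

    ```python
    import numpy as np

    # One full 1080-line frame: rows x columns (the pixel values don't matter here).
    frame = np.zeros((1080, 1920))

    # Interlacing alternates horizontal scan lines between the two fields.
    even_field = frame[0::2, :]  # lines 0, 2, 4, ... drawn in one pass
    odd_field = frame[1::2, :]   # lines 1, 3, 5, ... drawn in the next pass

    print(even_field.shape)  # (540, 1920) -> 1920x540 per field
    print(odd_field.shape)   # (540, 1920)
    ```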
  • Reply 6 of 11
    Quote:

    Originally posted by Eugene

    Er...



    That would be 1920x540. The interlacing is done with the horizontal scan lines as expected.




    Crap, thanks, dunno how I botched that one.
  • Reply 7 of 11
    matsu Posts: 6,558 member
    And a quick calculation shows that 1080i pumps out more pixels per second than 720p (see the sketch at the end of this post):

    62,208,000 pixels/sec (1080i)
    55,296,000 pixels/sec (720p)



    I don't think a lot of people are happier with 720p; they're just happier with anything above NTSC/PAL. Even digital SD gets raves when shown next to the average analogue signal.



    The increased horizontal res of 1080i and the efficiency of the interlacing just destroy any other broadcast format to date. To really see 1080i right now, you need to view a D-VHS source. It looks good, particularly on larger screens.



    Blu-ray or whatever will bring it to consumers in a few years.
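    As a quick sanity check of the per-second figures above, here is a minimal sketch (plain Python; the 30 and 60 full-frames-per-second rates for 1080i and 720p are the usual broadcast values and are assumed here, since the post only gives the totals):

    ```python
    # Pixels pushed per second by each broadcast format.
    # 1080i: 1920x1080, 60 interlaced fields/sec = 30 full frames/sec (assumed rate).
    # 720p:  1280x720, 60 progressive frames/sec (assumed rate).
    formats = {
        "1080i": (1920, 1080, 30),  # width, height, full frames per second
        "720p": (1280, 720, 60),
    }

    for name, (w, h, fps) in formats.items():
        print(f"{name}: {w * h * fps:,} pixels/sec")
    # 1080i: 62,208,000 pixels/sec
    # 720p: 55,296,000 pixels/sec
    ```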
  • Reply 8 of 11
    barto Posts: 2,246 member
    Quote:

    Originally posted by Matsu

    I don't think a lot of people are happier with 720p; they're just happier with anything above NTSC/PAL. Even digital SD gets raves when shown next to the average analogue signal.



    Maybe when compared to the average NTSC signal. The difference in quality between PAL and digital SD is pretty slim.



    Barto
  • Reply 9 of 11
    eugene Posts: 8,254 member
    Yes, 1080i does pump out more data (19 Mbps vs 15 Mbps), but it's really a matter of what type of content you're watching. ABC and FOX rely on sports programming more than NBC and CBS, so they decided against interlacing...



    BUT...

    As many HDTV monitor manufacturers have forgone fully native 720p support anyway, the benefits of broadcasting with that standard are moot to most viewers.
  • Reply 10 of 11
    I don't think most consumers know the difference between progressive and interlaced well enough to be happier with one over the other. From a technical standpoint, you have people in the industry who want to see a progressively scanned picture of slightly lesser resolution, for very obvious reasons.



    When I went out to the NAB for Turner in 1998, broadcast HD kit was first really being widely shown. I remember looking at two sets: one was 1080i and the other 720p. I personally found the 1080i to look better, but there are obvious technical reasons why I would much, MUCH rather have a progressive signal.
  • Reply 11 of 11
    eugene Posts: 8,254 member
    There's also another issue here. Most local stations here are multicasting off the same frequency. For example, a local independent station has a traffic camera pointed at the Golden Gate Bridge on one sub-channel. The ABC affiliate has its HD channel, an SD DTV channel, and a Doppler weather radar + news radio stream. The local PBS station has 5 sub-channels, each showing different programs during the day.



    This cuts up the available bandwidth to the point where macro-blocking becomes a big issue. Because 720p takes up less bandwidth, it's not as noticeable.



    It's not the format that's to blame, but the stations trying to send us quantity over quality...
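    To put a rough number on this, here is a minimal sketch (plain Python) of how many bits per pixel each format gets once sub-channels eat into the multiplex. The 12 Mbps share for the HD stream is purely an assumed example figure; the pixel rates are the ones worked out earlier in the thread:

    ```python
    # Bits available per pixel when the HD stream only gets part of the multiplex.
    hd_share_bps = 12_000_000  # assumed example: 12 Mbps left for the HD stream

    pixel_rates = {
        "1080i": 62_208_000,  # pixels/sec, from the calculation earlier in the thread
        "720p": 55_296_000,
    }

    for name, pps in pixel_rates.items():
        print(f"{name}: {hd_share_bps / pps:.3f} bits per pixel")
    # 1080i: 0.193 bits per pixel
    # 720p: 0.217 bits per pixel
    # At the same bit rate, 720p gets slightly more data per pixel, so compression
    # artifacts like macro-blocking are somewhat less severe.
    ```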