Comparing DVD to HDTV pixel resolution
So I did the math, this being the Year of HD as our iLeader has decreed. The maths is quite 'quick and dirty', so it's a good chance for all of us geeks to do some math and discuss.
I was itching for access to high-definition content, so I hacked a sweet broadcast-graphic LightWave 3D tutorial to get a clean computer-generated graphic at up to 1920x1200. Reply to this post if you want me to link to .zips of the graphics so you can see it for yourself... I'll work on an 'illustrative example' and post back here when my broadband is back up.
My best approximation for DVD anamorphic encoded material is
784x448 pixels (16:9 approx; 784 and 448 are both divisible by 16)
-->0.35 megapixels
Even the lower-spec HDTV blows this away with
1280x720 pixels
-->0.92 megapixels
High-spec HDTV is amazing:
1920x1080 pixels. That's over 2.0 megapixels, people!
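The 'quick and dirty' maths above can be sanity-checked in a few lines (a sketch; the 784x448 figure is my own DVD approximation from above):

```python
def megapixels(width, height):
    """Return the pixel count of a frame in megapixels."""
    return width * height / 1_000_000

dvd    = megapixels(784, 448)    # anamorphic DVD approximation
hd720  = megapixels(1280, 720)   # lower-spec HDTV
hd1080 = megapixels(1920, 1080)  # full-spec HDTV

print(f"DVD:   {dvd:.2f} MP")    # ~0.35 MP
print(f"720p:  {hd720:.2f} MP")  # ~0.92 MP
print(f"1080p: {hd1080:.2f} MP") # ~2.07 MP
```

So even 720p carries roughly 2.6x the pixels of an anamorphic DVD frame, and 1080p nearly 6x.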
I've also realised a couple of things with regards to the marketing of HDTV in my country:
1. 23" is the minimum size for any display before you really start to appreciate the benefits of 720p over DVD. Edit: at 14"-17", DVD quality is fine; 19"-20" is where one will start to really 'feel' the impact of 720p.
2. People will start to be more impressed once the stores actually *show high-def material* rather than just popping in a DVD and sticking a tag on the plasma saying 'HDTV ready'. Of course, without Blu-ray, they have to run a proprietary HD signal to the demo plasma/LCD screen... at the computer shops a dude was at least using Windows Media High Definition (WMV HD), and even at just 720p it was impressive.
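One rough way to put numbers on the screen-size point in (1) is pixel density: the same 1280x720 frame spreads over more inches on a bigger panel. A quick sketch (my own, assuming a 16:9 panel running 720p at its native resolution):

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_in):
    """Pixel density of a panel given its native resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_px / diagonal_in

# A 720p frame at a few diagonal sizes
for size in (17, 20, 23):
    print(f'{size}": {pixels_per_inch(1280, 720, size):.0f} ppi')
```

At 23" the density drops to roughly 64 ppi, which is where the extra pixels of 720p over DVD start to be worth having.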
Welcome to the year of HD
Bring on Tiger, and H.264 or Pixlet encoded 720p and 1080p !!!! YEAH !!!!11!11!! (sorry, couldn't resist the 1s and !!s). Edit: I'm a bit miffed that Windows Media HD beat us Mac users to the punch, so to speak, but then Pixlet has been out for quite a while, mainly in the pro scene... Apple doesn't really have high-def examples at the moment, despite this being the "Year of HD".
My iBook G4 933MHz with 256MB RAM can't handle high-def video very well, though. DVD resolution encoded in MPEG-2 or XviD is the best for now, which I'll call "448p" because it makes me feel that bit closer to 720p HDTV... I can go around saying my iBook G4 does 448p enhanced-definition TV... which is a totally made-up term.
I'm definitely going to try my best to make sure that when my dad gets a big screen, he has a Mac-based home theatre strategy/roadmap in place. Big screens made just for TV signals would be a foolish play; betting on the computer side of TV-computer convergence is where the smart money is.
Comments
Originally posted by ipodandimac
Yes, HD resolution is much better than DVD res. This is really, really old news. Your comparison of monitors or displays is a bit misleading, because big TVs don't always have a good resolution; this is very apparent if you go into Best Buy and browse through the HDTVs. The other thing is that simply assigning quality to sizes (in inches) is weak, because things like viewing angle and color vary greatly right now in HDTVs. But yes, H.264 and stuff will be nice.
Yes, I may be behind the 8-ball, as it were, but I've been doing some research and needed to rant.
I agree with you: regular people see a big TV marked 'HDTV ready' and they haven't got a clue, aside from whether they can afford the monthly payments. Especially if, like I mentioned, the store is demoing the big display with just a DVD instead of an HD source of some kind. How would Joe/Jane Average know the difference between 480p, 720p, 1080i and 1080p?
I was actually just making the point that for 720p and above, it seems you need 23" or bigger for it to really be worthwhile. Said 23"-or-bigger display would of course need to be spec'ed at 720p or better.
Yup, what I said was confusing: with 14" to 20" I was mainly referring to computer displays, not television displays.
Super High Definition TV: 3840 x 2160
High Definition is 1920x1080 pixels.
"Enhanced" definition includes anything between HD and standard definition. There are 9 different versions of Enhanced definition TV.
Digital Video (including DVD): 720x480.
Standard definition: 640x480.
VHS: 220-240 lines of resolution
FOX says it has HD content, but it actually does NOT! They use 1280x720. They call any resolution greater than Standard a High Definition broadcast.
"HDTV Ready" does not necessarily mean a screen is capable of full HDTV resolution. It may merely accept a high-definition signal and down-convert it to the display resolution.
EDIT: Added UHDTV/SHDTV.
1280 x 720p
1920 x 1080i/p
The Holy Grail is 1080p, and there are TVs from Mitsubishi and Samsung that can do this already. More on the way this year, too.
NTSC is roughly 720x480, thus:
720p represents a 50% increase in vertical resolution, and 1080p represents a 125% increase.
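The percentages above fall straight out of the line counts (a quick sketch, taking the 480 visible lines of NTSC as the baseline):

```python
def vertical_gain(lines, ntsc_lines=480):
    """Percentage increase in vertical resolution over NTSC's ~480 visible lines."""
    return (lines / ntsc_lines - 1) * 100

print(vertical_gain(720))   # 50.0
print(vertical_gain(1080))  # 125.0
```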
(Changed post above...)
Originally posted by Playmaker
Can someone break down exactly what designates something as truly "HD"? You may have covered this somewhere in this thread, but it's only done more to confuse me. I'm just looking for a concise explanation that details exactly what makes something HD. Thanks.
And Sunrail, I'd love to see the LightWave render.
Well, I'm glad us 'noobs' can discuss HDTV; 'leets' like ipodandimac are too far ahead to bother with such trivialities.
HDTV resolution is generally considered to be either
1280x720 pixels OR
1920x1080 pixels
1280x720 is a resolution within the HDTV standard, and some broadcasters and manufacturers will use this and claim it to be HDTV; they're not being too misleading in saying so.
The 'full'-spec HDTV is 1920x1080 pixels.
Yup, I'd love to post the LightWave renders here, including one at the 30" Cinema Display resolution; it's a great visual example. I know I'm being a bit of a tease; I swear I'll do my best to get round to it... when I get my f8cking broadband up again... damn crap wireless routers... they suck...
(I'm using Mozilla on a Pentium II 333MHz, 196MB RAM, Red Hat Linux, dial-up connection.) This machine is definitely NOT 'HDTV-ready'.
Originally posted by sunilraman
1280x720 is a resolution within the HDTV standard, and some broadcasters and manufacturers will use this and claim it to be HDTV; they're not being too misleading in saying so.
Maybe it is just me, but when I compare #1 and #2, I'd say they are trying to mislead people. I'm a diehard it's-full-HDTV-or-else-it's-DTV fan. 1280 just doesn't make the cut, IMO.
Originally posted by Ebby
Maybe it is just me, but when I compare #1 and #2, I'd say they are trying to mislead people. I'm a diehard it's-full-HDTV-or-else-it's-DTV fan. 1280 just doesn't make the cut, IMO.
Ebby, bad example. That 720 pic is bad quality; it looks bad even on an iBook screen, whereas the full-HD pic is flawless, studio/computer generated. To determine the real difference you should take a massive picture and downscale it both to 1080 and to 720, no?
BTW, what do you guys think about 720p versus 1080i? Sure, the resolution of 1080i is so much greater that even at half the refresh speed it pushes about 12.5% more pixels at you, but would you trade the consistency of the progressive picture for that?
Because scaling hurts image quality on flatscreens, it would obviously be a good idea to prefer the lower resolution that fits your screen if your screen doesn't really do the higher resolution. Even better if that choice also nets you a better refresh rate.
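Gon's pixel-throughput comparison can be checked with a rough sketch: count pixels delivered per second, where an interlaced signal sends only half the lines in each field (this ignores deinterlacing entirely and assumes 60 fields/s for 1080i):

```python
def pixels_per_second(width, height, rate, interlaced=False):
    """Pixels delivered per second; interlaced signals carry half the lines per field."""
    lines = height // 2 if interlaced else height
    return width * lines * rate

p720  = pixels_per_second(1280, 720, 60)         # 720p at 60 frames/s
i1080 = pixels_per_second(1920, 1080, 60, True)  # 1080i at 60 fields/s

print(p720, i1080, f"{i1080 / p720 - 1:.1%}")    # 55296000 62208000 12.5%
```

By this count, 1080i60 delivers about 12.5% more pixels per second than 720p60, though of course only half of each 1080i frame is fresh at any instant.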
Originally posted by Gon
Ebby, bad example. That 720 pic is bad quality; it looks bad even on an iBook screen, whereas the full-HD pic is flawless, studio/computer generated. To determine the real difference you should take a massive picture and downscale it both to 1080 and to 720, no?
BTW, what do you guys think about 720p versus 1080i? Sure, the resolution of 1080i is so much greater that even at half the refresh speed it pushes about 12.5% more pixels at you, but would you trade the consistency of the progressive picture for that?
Because scaling hurts image quality on flatscreens, it would obviously be a good idea to prefer the lower resolution that fits your screen if your screen doesn't really do the higher resolution. Even better if that choice also nets you a better refresh rate.
Ah, f8ck it... I've got a nice .zip of TGA files y'all can open in Preview, 13MB, and I'm going to upload it to my server this evening. Then y'all can compare 720 vs 1080; both files are natively generated in 3D at each particular resolution, not just scaled down, etc.
Gon, I'm an elitist bastard; interlacing is just not on anymore, it just doesn't cut it. Faced with a choice between 720p and 1080i, I would take the 720p. But between 720p and 1080p, I say gimme gimme gimme the 1080p...!
Edit: also because I think interlaced video tends to feel a bit too 'harsh' and 'sharp'... if and when I can afford an HD camera, I would definitely choose one with a 720-progressive 'film-like' mode rather than a similarly priced/featured HD camera that can do 1080, but only in 1080-interlaced mode.
Gon, yes, good points with the screen-size thing...
Edit 2 (slightly off topic): it now seems that if you have a video camera that can do 720-60p or 1080-60p, you're getting a full 60 frames per second. That offers some sweet opportunities for up to 2.5x (60/24) SLOW MOTION, like the film pros do it, when you 'convert' it to 720-24p or 1080-24p in your edit suite of choice... interesting...
Originally posted by sunilraman
I'm an elitist bastard; interlacing is just not on anymore, it just doesn't cut it. Faced with a choice between 720p and 1080i, I would take the 720p.
I think it's not an either/or choice... progressive scan rocks for fast movement and action sequences, but for things like slow dramas you'd probably appreciate the higher resolution. It's not like flatscreens will flicker at low Hz... and I think interlacing is a bit better than plain low Hz.
Originally posted by Gon
....but for things like slow dramas you'd probably appreciate higher resolution....
Well, Bridget in Bold & Beautiful is kinda yummy in a 'Dana Scully in her younger years' way... she's got a bit of a similar accent... you USAians might know what I'm talking about; it's a sexy accent, not too hoarse, with a certain way of saying words with 's's in them... yes, for Bold & Beautiful or Fear Factor episodes with women in bikinis, I guess I would 'lower my standards' and go with 1080 interlaced.
back on point:
The example uncompressed still frames are here:
Download, unzip (the file, you pervert!!) and enjoy.
Rendered in LightWave 3D from a modified version of Dan Ablan's superb tutorials.
http://www.phatcraft.com/high_res_examples.zip
Get a feel for 720p and 1080p high-def compared to DVD resolutions, then fire up the 23" and 30" Apple Cinema HD renders to use as a background to show off your display, or for others to realise what they're missing.
Especially for images bigger than your screen res, I would suggest: open the file in Preview, hit Command-0 (view actual size), maximise the window, then hit Command-1 to get the 'hand' tool, and click-and-drag to scroll around.
All .tga files are rendered and anti-aliased at high res natively out of LightWave 3D, and uncompressed, so you get a really good comparison from all of them.
PS. No, I am not going to do any video or animation; it took like 30 minutes for just one frame of the 30" Cinema HD-res render on my poor iBook 933MHz G4 with 256MB RAM. Plus H.264 vs. Pixlet is a whole 'nother debate on HD video compression...
LightWave 7 on the iBook is sweet, though, for static designs and DVD-res animations. Enjoy
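That 30-minutes-per-frame figure makes it easy to see why animation is off the table. A back-of-the-envelope sketch (assuming the per-frame render time stays roughly constant, which it wouldn't exactly):

```python
def render_days(seconds_of_animation, fps=24, minutes_per_frame=30):
    """Rough total render time in days for an animation at a fixed cost per frame."""
    total_minutes = seconds_of_animation * fps * minutes_per_frame
    return total_minutes / (60 * 24)  # minutes in a day

print(f"{render_days(10):.1f} days for a 10-second clip")  # 5.0 days
```

Five solid days of rendering for ten seconds of footage on the iBook. Stills it is.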
Originally posted by Gon
BTW, what do you guys think about 720p versus 1080i? Sure, the resolution of 1080i is so much greater that even at half the refresh speed it pushes about 12.5% more pixels at you, but would you trade the consistency of the progressive picture for that?
I think 720p is going to be replaced by 1080i and 1080p. I'm already hearing that some HDTV set manufacturers will have models that may not support 720p. I know... shocked me too. I gather they will be pushing consumers on 1080 being the larger number.
Me personally, I'm eyeballing who does 1080p the best. I've heard reports over on AVS Forum that 1080p does indeed look phenomenal with excellent source material. My goal in 5 years is a capable front-projection system in a dedicated room, so that's where I'm looking.
Doggone it Steve was right! This "is" the year of High Def
However, let us not get confused: this is a sleight of hand in some ways. The TI chip has 960 mirrors horizontally and, I think, 1080 mirrors vertically. This is called "wobulation", where one mirror corresponds to two pixels.
The ideal situation is a 1:1 mapping, but we're a few years away from that becoming affordable. I'll take the wobulation right now, as it's still an improvement.
Info on Wobulation
edit: corrected the pre wobulated pixels after further research.
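The mirror arithmetic behind the wobulation claim works out neatly (a sketch based only on the figures quoted in the post above: 960x1080 mirrors, two pixels per mirror):

```python
# Wobulation sketch: each physical mirror is flashed twice per frame at a
# slight offset, so one mirror covers two addressable horizontal pixels.
MIRRORS_H, MIRRORS_V = 960, 1080   # figures quoted above for the TI chip
PIXELS_PER_MIRROR = 2

effective_h = MIRRORS_H * PIXELS_PER_MIRROR
print(f"effective resolution: {effective_h}x{MIRRORS_V}")  # 1920x1080
```

So half the physical mirrors of a true 1920x1080 panel produce a nominally 1080p image, which is why it's "not quite" 1:1.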
Originally posted by Gon
Ebby, bad example. That 720 pic is bad quality; it looks bad even on an iBook screen, whereas the full-HD pic is flawless, studio/computer generated. To determine the real difference you should take a massive picture and downscale it both to 1080 and to 720, no?
Now, hold on a second... that's my whole point! These shots were taken from actual broadcasts. #1 is a true HDTV signal from UPN, and #2 is an "HDTV" broadcast from FOX (the Super Bowl, no less). You could scale whatever larger images down and get better quality, but that is not what is being broadcast today. I don't like #2 and don't consider it HDTV. Re-read my statement if you must. As for CGI, I'll get a pic of a Vulcan or Klingon or something and you can see the makeup. There is no excuse, because the image quality is there no matter what the subject.
Originally posted by Ebby
Now, hold on a second... that's my whole point! These shots were taken from actual broadcasts. #1 is a true HDTV signal from UPN, and #2 is an "HDTV" broadcast from FOX (the Super Bowl, no less). You could scale whatever larger images down and get better quality, but that is not what is being broadcast today. I don't like #2 and don't consider it HDTV. Re-read my statement if you must. As for CGI, I'll get a pic of a Vulcan or Klingon or something and you can see the makeup. There is no excuse, because the image quality is there no matter what the subject.
The pics are good for a very specific demonstration of the respective channels airing their respective programming, but that's it. In the post I replied to, you were complaining specifically about format vs. format, and for that purpose the pics are useless, because the source material is so poor in the 720p picture.
Also, I don't think you can deny that live news or similar footage and a highly polished movie have very different standards for image quality. You can't produce the same thing in a live, uncontrolled filming environment as in a clean studio or outdoor set.
Originally posted by Ebby
Now, hold on a second... that's my whole point! These shots were taken from actual broadcasts. #1 is a true HDTV signal from UPN, and #2 is an "HDTV" broadcast from FOX (the Super Bowl, no less). You could scale whatever larger images down and get better quality, but that is not what is being broadcast today. I don't like #2 and don't consider it HDTV. Re-read my statement if you must. As for CGI, I'll get a pic of a Vulcan or Klingon or something and you can see the makeup. There is no excuse, because the image quality is there no matter what the subject.
ABC, Fox and ESPN are using 720p. Prior to this summer/fall, Fox was only using 480p for their "HD" channel. CBS and NBC use 1080i.
Possibly that screen capture #2 was from source recorded on Fox's older 480p equipment, not 720p, even though it was broadcast at 720p.
Source material does make a difference, and yes, one should think about it. It also gets complex because that UPN shot is a computer-generated effects shot, whereas #2 is 'live' material recorded with cameras in the stadium, which adds some complexity to the mix.
I think if you can afford a big-a55 screen, get something rated at 1080p and save yourself a lot of worry down the line. My 'litmus test' is going to be hooking the display up to a computer running at 1920x1080 and seeing how the display responds; then I'll know the HDTV can handle whatever signals are thrown at it. (Of course, I'd also want to see how the HDTV handles the different Fox, ABC, ESPN, CBS, NBC, cable and satellite broadcasts of various shows and movies.)
Suffice to say, the big-screen dealers have only sales targets and profit margins on their minds. Well, most of them, anyway.
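The 'litmus test' idea could be sketched as checking which broadcast formats get a 1:1 pixel mapping on a given panel (a hypothetical sketch; the format-to-network pairings are the ones quoted earlier in the thread, and `check_display` is my own made-up helper name):

```python
# Which common broadcast formats map 1:1 onto a panel's native resolution,
# and which force the set's internal scaler to kick in.
BROADCAST_FORMATS = {
    "480i/480p (DVD, SD)":   (720, 480),
    "720p (ABC, Fox, ESPN)": (1280, 720),
    "1080i (CBS, NBC)":      (1920, 1080),
    "1080p":                 (1920, 1080),
}

def check_display(native_w, native_h):
    """Print whether each format maps 1:1 or must be scaled on this panel."""
    for name, (w, h) in BROADCAST_FORMATS.items():
        fit = "1:1 mapping" if (w, h) == (native_w, native_h) else "scaled"
        print(f"{name:24s} -> {fit}")

check_display(1920, 1080)  # a 'full HD' panel
```

A native 1920x1080 panel gives 1:1 mapping for the 1080 formats and scales everything else, which is the whole argument for buying 1080p up front.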
Originally posted by hmurchison
Wow speak of the Devil!! TI announces 1080p chip
Doggone it Steve was right! This "is" the year of High Def
However, let us not get confused: this is a sleight of hand in some ways. The TI chip has 960 mirrors horizontally and, I think, 1080 mirrors vertically. This is called "wobulation", where one mirror corresponds to two pixels.
The ideal situation is a 1:1 mapping, but we're a few years away from that becoming affordable. I'll take the wobulation right now, as it's still an improvement.
Info on Wobulation
edit: corrected the pre wobulated pixels after further research.
DLP front/rear projection, I have found, is IMHO a decent way to get into the big-screen HDTV scene without shelling out big bucks for LCD or plasma displays.
Well, wobulation is not 'true' 1080p, but I guess the wobulation rate is such that it gives you something close to 1080p and far enough away from 1080i...
Wow, it's an HDTV techno-ZOO out there. I pity the older folks and non-tech people (no offense to anyone here that's 60+).
Edit: from the PopSci link I learnt that DLP maxes out at 1280x720 without wobulation; they might find a non-wobulating way around this somewhere down the line...