Apple's 27" iMac only supports native or 720p video input, no 1080p


Comments

  • Reply 21 of 47
    zoetmb Posts: 2,654 member
    Quote:
    Originally Posted by SendMe View Post


    Nobody really needs 1080. With that size display, you can't see the difference. You can't even get 1080p at the iTunes store.



    I really don't care one bit about 1080p; it is all marketing hype. 720 IS HD.



    Actually, based on the original definition of HD, which was >1000 vertical lines, 720 is NOT HD. The only reason they get away with calling it HD is that, since it's progressive scan, they double the 720 to 1440 and say, "well, that's more than 1000, so it's HD."



    Quote:
    Originally Posted by SendMe View Post


    Apple was smart to do it that way instead of having 1080p as their native resolution.



    I don't think that anybody can really see the supposed difference anyway. It's like those golden ears who used to say they could hear all kinds of differences in stereos when really they all pretty much sound the same.



    Only if you're deaf. There are substantial differences in audio based on the quality of the equipment that you have. If you're only listening in the background, it doesn't matter. But for traditional foreground listening at moderate to loud levels, it makes a big difference. I'm not claiming that the $50,000 systems are 50x better than a $1000 system, but there are substantial differences in audio systems based on a large number of factors. One of the most important factors in digital reproduction is the quality of the filters that limit the frequency response to 22 kHz (just under half of the CD sampling rate of 44.1 kHz). Phasing is also a big issue - if audio from the woofer gets to you slower than the audio from the tweeter, for example.



    And I'm very surprised that people can't tell the difference because I can spot whether something is playing back at 720p as opposed to 1080p in just a second or two on a large monitor. It might not matter on a laptop, but it matters everywhere else.
  • Reply 22 of 47
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by acrobratt View Post


    Can someone please just tell me what most people use an Apple display for? To replace a TV? As a second monitor for design stuff (as if you needed more space if you have a 27" monitor)? This is one of those Apple products that I simply can't get my mind around.



    In this case, it's the display built into the iMac; you seem to be talking about the separate LED Cinema Display. The HDMI input offers a way to simplify tight quarters (think dorm or small apartment), so the iMac can stand in for a wider range of uses rather than needing to also have a TV.



    It is also an LED backlit IPS or PVA display, which offers better colors and much better viewing angles than most consumer displays. You don't say what features you wanted that aren't offered in Apple's displays.



    Quote:
    Originally Posted by SendMe View Post


    Nobody really needs 1080. With that size display, you can't see the difference. You can't even get 1080p at the iTunes store.



    I really don't care one bit about 1080p; it is all marketing hype. 720 IS HD.



    Few people are saying 720p isn't HD. Nobody "needs" 1080p, but just because you can't see the difference doesn't mean others can't. 720p is currently the most practical for internet use, but cable, satellite and Blu-ray offer 1080 because they have sufficient bandwidth to make it work. Incidentally, the iMac has a higher-than-1080p screen as well, so somebody must be able to see higher than 1080p for it to be useful that way.



    Quote:
    Originally Posted by Avidfcp View Post


    I will say this though, for sitting fairly close on a desk, when I had HD cable hooked up, some channels were 720p, others 1080i, and nearly every time the 720p looked better. It was like looking through glass. 1080i seemed grainy.



    You're right, 1080i can be bad. 1080i is a difficult beast: it can look better than 720p, but for movies it requires a good deinterlacer to line up the fields. Less complicated equipment might just use the "bob" deinterlacing method, which is roughly equivalent to displaying the video as 540p.
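

    To make the "bob" versus proper deinterlacing point concrete, here is a minimal sketch in Python (toy data, not any player's actual code): "weave" interleaves the two 540-line fields back into a full 1080-line frame, while "bob" simply line-doubles a single field, which is why bobbed 1080i carries only about 540 lines of real detail.

        # Minimal deinterlacing sketch. A 1080i frame arrives as two 540-line
        # fields (even lines, then odd lines). The tiny 4-line "fields" below
        # are fabricated purely for illustration.

        def weave(even_field, odd_field):
            """Interleave both fields into one full-height frame (all the
            detail, but combing artifacts if there is motion between fields)."""
            frame = []
            for even_line, odd_line in zip(even_field, odd_field):
                frame.append(even_line)
                frame.append(odd_line)
            return frame

        def bob(field):
            """Line-double a single field: effectively 540p from a 1080i source,
            which is less vertical detail than 720p."""
            frame = []
            for line in field:
                frame.append(line)
                frame.append(line)  # repeat the line to fill the gap
            return frame

        even = ["E0", "E2", "E4", "E6"]
        odd = ["O1", "O3", "O5", "O7"]

        print(weave(even, odd))  # ['E0', 'O1', 'E2', 'O3', ...]  full vertical detail
        print(bob(even))         # ['E0', 'E0', 'E2', 'E2', ...]  half the detail, doubled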
  • Reply 23 of 47
    jragosta Posts: 10,473 member
    Quote:
    Originally Posted by wdw1234 View Post


    ...why would you imagine that, since you can't hear it, it is untrue...are you that silly?



    Quote:
    Originally Posted by zoetmb View Post


    Only if you're deaf. There are substantial differences in audio based on the quality of the equipment that you have. If you're only listening in the background, it doesn't matter. But for traditional foreground listening at moderate to loud levels, it makes a big difference. I'm not claiming that the $50,000 systems are 50x better than a $1000 system, but there are substantial differences in audio systems based on a large number of factors. One of the most important factors in digital reproduction is the quality of the filters that limit the frequency response to 22 kHz (just under half of the CD sampling rate of 44.1 kHz). Phasing is also a big issue - if audio from the woofer gets to you slower than the audio from the tweeter, for example.



    And I'm very surprised that people can't tell the difference because I can spot whether something is playing back at 720p as opposed to 1080p in just a second or two on a large monitor. It might not matter on a laptop, but it matters everywhere else.



    Quote:
    Originally Posted by maciekskontakt View Post


    Those are blind experts. I see a HUGE difference on a 46-inch TV set. If someone cannot see it on a high-quality monitor with a certain angular resolution (proper distance), then he/she must have problems with vision. I would suggest a visit to an optician.



    Yes, I can see the difference on a 20-inch monitor as well. As long as the monitor's resolution is equal to or higher than a given HDTV standard, with proper ambient light and good color reproduction, one should see the difference in detail between 720p and 1080p. Otherwise it is like saying that it is hard to hear the difference between surround and stereo systems.



    You're confusing a tiny percentage of people with 'most people'. Apple isn't going to make a product that only 0.01% of people can benefit from. For the majority of people, there's little or no difference between 720p and 1080p on a 27" screen. With the emphasis on 'NO difference'.



    Heck, lots of people have a hard time telling the difference between DVD and HD even on a 50" screen. It's just not the overwhelming difference that you're pretending. The majority of people can enjoy a movie equally well either way. It's not like the transition from B/W to color, or over-the-air broadcasts to cable, or VHS to DVD. It's a very subtle difference.



    I love the way people bring in the $50K audio systems. That's a great example: I'm willing to bet that 99% of the population can't even tell the difference between a $50K system and a good $2K system - and most of the rest would have to struggle to identify the differences. You will notice, for example, that when someone does a double-blind comparison of the different systems, it almost always comes out 'no difference'. These are the same nuts who brought us green magic markers around the edges of CDs.



    Admittedly, there may be an insignificant number of people who really do have the hearing (or visual) acuity to distinguish the kind of minuscule differences you're talking about, but why should Apple build a system that only a few hundred people can appreciate? Go ahead and find someone who's willing to make systems that meet your quality standards on a 27" screen for $20,000. Perhaps they'll sell a few dozen systems.
  • Reply 24 of 47
    2992 Posts: 202 member
    Quote:
    Originally Posted by bdblack View Post


    old news...



    ...also slightly inaccurate. Apple didn't build a scaler into the iMac capable of converting 1080p.



    But you can have 1080p support with this device...



    http://www.atlona.com/atlona-hdmi-mi...-switcher.html



    Basically you're adding the scaler Apple left out.



    @ ONLY $279.00

    Amazing, isn't it?
  • Reply 25 of 47
    bdblack Posts: 146 member
    Quote:
    Originally Posted by 2992 View Post


    @ ONLY $279.00

    Amazing, isn't it?



    Really, it's not worth it. The only real application where anyone would need 1080p on the 27-inch iMac is with Blu-ray players. Almost all Xbox and PlayStation games are limited to 720p, and broadcast television is generally 720p or 1080i, and 720p will look much better on that display because it scales evenly (1 pixel is displayed as 4 pixels).



    Making 1080p fit a 1440p display really isn't all that easy. LCDs only work well at their native resolutions. Even when you use a high-quality $300 scaler to convert 1080 to 1440, you're going to lose sharpness in the image because it doesn't scale evenly.



    Most consumer LCDs I've seen usually look really ugly when trying to upscale lower resolutions. Basically they double a percentage of the scan lines to fill the gaps. This makes some pixels bigger than others and looks rough. I think it's really a good thing that the iMac's display can upscale 720p so nicely. The only other option would have been to make the display 1080p natively, which would lower the DPI significantly and compromise its usability as a computer display.
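

    To put rough numbers behind the "scales evenly" point, here is a small Python sketch (toy one-dimensional scanlines, crude nearest-neighbour scaling rather than whatever filtering Apple's scaler actually uses): a 1280-pixel 720p row maps onto the 2560-pixel-wide iMac panel by an exact factor of 2, so every source pixel is repeated the same number of times, while a 1920-pixel 1080p row needs a 4:3 stretch, so some pixels get repeated and others don't, which is the unevenness that softens the image.

        from collections import Counter

        # Why 720p maps cleanly onto a 2560x1440 panel and 1080p does not.
        # One-dimensional rows and nearest-neighbour scaling, for illustration only.

        def nearest_neighbour_scale(line, target_len):
            """Stretch a row of pixels by picking the nearest source pixel."""
            src_len = len(line)
            return [line[int(i * src_len / target_len)] for i in range(target_len)]

        row_720 = list(range(1280))    # one row of a 1280x720 frame
        row_1080 = list(range(1920))   # one row of a 1920x1080 frame

        scaled_720 = nearest_neighbour_scale(row_720, 2560)    # exact 2x scale
        scaled_1080 = nearest_neighbour_scale(row_1080, 2560)  # 4/3 (1.333...) scale

        # With the 2x scale every source pixel appears exactly twice; with the
        # 4/3 scale some pixels appear twice and others once, so detail is uneven.
        print(set(Counter(scaled_720).values()))   # {2}
        print(set(Counter(scaled_1080).values()))  # {1, 2}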
  • Reply 26 of 47
    mr. me Posts: 3,221 member
    Quote:
    Originally Posted by zoetmb View Post


    Actually, based on the original definition of HD, which was >1000 vertical lines, 720 is NOT HD. The only reason they get away with calling it HD is that, since it's progressive scan, they double the 720 to 1440 and say, "well, that's more than 1000, so it's HD."



    ...



    Completely and totally wrong. HDTV in most of North America is defined by the ATSC. The ATSC defines 1080i and 1080p with a ratio of 16:9 as HD. 720p is also defined as HD, with specifications that I won't get into. In other countries, other standards bodies have different but similar definitions. There is no such thing as Full HD except as a marketing term.



    As to the merit of 720p, it is great for rapid motion, making it the choice for broadcasters that concentrate on sports. This is why ABC/ESPN and FOX chose it. CBS and NBC chose 1080i because it places more pixels on the screen each second. Because it is interlaced, however, 1080i may suffer dot-crawl like NTSC.



    Now to the issue of scalers: HDTV is TV. Whether 1080i or 720p, all HDTV assumes that the image may be scaled between the camera and the display. You simply have no guarantee that video that was recorded in 1080i will be displayed in 1080i. It may very well be scaled to 720p, rescaled to 1080p, and rescaled again. You don't even have a guarantee that the display monitor will be limited to 720p or 1080i or some integral multiple or division thereof. For example, many 720p displays are actually 1366 x 768 and require scaling to display 720p video.
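

    As a toy illustration of that scaling chain (Python, with nearest-neighbour resampling standing in for whatever the broadcast and display hardware actually do), once 1080-line video has been scaled down to 720p somewhere along the way, scaling it back up to 1080p does not bring the lost detail back:

        def resample(line, target_len):
            """Crude nearest-neighbour resample of one row of pixels."""
            src_len = len(line)
            return [line[int(i * src_len / target_len)] for i in range(target_len)]

        original_1080 = list(range(1920))               # one row of 1080-line video
        broadcast_720 = resample(original_1080, 1280)   # downscaled somewhere in the chain
        displayed_1080 = resample(broadcast_720, 1920)  # upscaled again by the display

        # The round trip does not restore the original: detail lost in the
        # downscale stays lost, no matter how many rescales follow.
        print(displayed_1080 == original_1080)                    # False
        print(len(set(original_1080)), len(set(displayed_1080)))  # 1920 vs 1280 distinct values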
  • Reply 27 of 47
    rob55 Posts: 1,291 member
    Quote:
    Originally Posted by jragosta View Post


    I love the way people bring in the $50K audio systems. That's a great example: I'm willing to bet that 99% of the population can't even tell the difference between a $50K system and a good $2K system - and most of the rest would have to struggle to identify the differences. You will notice, for example, that when someone does a double-blind comparison of the different systems, it almost always comes out 'no difference'. These are the same nuts who brought us green magic markers around the edges of CDs.



    Ok, the green magic marker thing was pretty ridiculous as were a number of those other audiophile "tweaks", but I seriously doubt that the results from a "double-blind" listening test between a $50k system and a $2k system would be "no difference". Back in the day when I worked for a high-end A/V outfit, many of my customers would tell me "I can't hear the difference between different speakers" before even listening to them. After I played a demo for them, every single one (literally) said that they heard a difference and most of the time preferred the more expensive speaker. Not saying that the more expensive one always sounded better, but just that they always heard a difference.

    What I found was that this difference was most pronounced between the "cheap" or low-end speakers and the mid-line stuff. Typically, this was between the $500-$1,000 speakers and the $2,000 and up speakers. As you went up in price, you typically got proportionally less and less improvement for the extra dollars spent. I would have a hard time telling the difference between $5,000 and $10,000 speakers. However, the difference between a $2,000 system and a $50,000 system would still be noticeable and yes, the majority would agree that the $50,000 system sounded better. It wouldn't be night and day, but there would be a difference and I'd bet most people, whether they admit it or not, could hear the difference.

    I'm not saying you are wrong, just that I'd shift the example to one between a $7,500-$10,000 system and a $50,000 system. The other thing to consider is that, though I think most people could hear the difference, most wouldn't care or be willing to justify the price. This is where we agree: Apple just can't be bothered catering to a minuscule percentage of buyers who want the extra bit of performance that the majority of buyers either don't care about or can't appreciate/take advantage of.
  • Reply 28 of 47
    rob55 Posts: 1,291 member
    Quote:
    Originally Posted by Mr. Me View Post


    CBS and NBC chose 1080i because it places more pixels on the screen each second. Because it is interlaced, however, 1080i may suffer dot-crawl like NTSC.



    1080i video may suffer from a number of artifacts, but dot-crawl is not one of them. Dot-crawl is the result of crosstalk between the luminance and chrominance parts of a composite video signal, where the two are mixed together. Since the signal for 1080i HD is either in component form (if analog) or digital (HDMI), this is simply not an issue. Typically, the artifacts that are most noticeable with 1080i are motion-related, due to improper de-interlacing of the signal.
  • Reply 29 of 47
    I'd like to offer my theory to explain the "I can't see the difference" crowd (of which I am a member).



    1. Everybody has their own "comfortable viewing distance" for a given screen size. I'm going to say that given 1 screen and 1 chair in an empty room, you will move that chair so that the screen occupies a certain percentage of your field of view. You'll move closer to a small screen and farther from a large screen, but you'll choose a distance that occupies roughly the same portion of your retina (and brain). So let's call that the "comfortable viewing size". Just a theory.



    2. We all know some people like to sit closer to the same screen than others. If you like to sit closer, your "comfortable viewing size" will allow you to appreciate pixel definition better than I can.



    3. This seems to me to explain why some people MUST have Blu-ray and others CAN'T see the point (the rough viewing-distance arithmetic sketched after this list makes the same point).



    4. "Don't sit so close - You'll ruin your eyes!"
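

    For what it's worth, here is the rough viewing-distance arithmetic mentioned in point 3: a small Python sketch using the common (approximate) rule of thumb that 20/20 vision resolves about one arcminute, applied to a 27-inch 16:9 screen. The numbers are only meant to illustrate the theory above, not to settle the argument.

        import math

        # Distance at which one pixel subtends about 1 arcminute (a common
        # approximation for 20/20 acuity). Sit closer than this and individual
        # pixels are resolvable; sit farther and extra resolution is wasted.
        ARCMINUTE = math.radians(1 / 60)

        def max_useful_distance_inches(diagonal_in, horiz_px, aspect=(16, 9)):
            w, h = aspect
            width_in = diagonal_in * w / math.hypot(w, h)  # screen width in inches
            pixel_in = width_in / horiz_px                 # width of one pixel
            return pixel_in / math.tan(ARCMINUTE)

        for name, px in [("720p", 1280), ("1080p", 1920)]:
            feet = max_useful_distance_inches(27, px) / 12
            print(f"27-inch screen, {name}: pixels resolvable out to ~{feet:.1f} ft")

        # Roughly ~5.3 ft for 720p and ~3.5 ft for 1080p: at typical desk
        # distances the extra 1080p detail is visible, but from a few feet
        # farther back the two look much the same.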
  • Reply 30 of 47
    chris_ca Posts: 2,543 member
    Quote:
    Originally Posted by zoetmb View Post


    Actually, based on the original definition of HD, which was >1000 vertical lines, 720 is NOT HD. The only reason they get away with calling it HD is that, since it's progressive scan, they double the 720 to 1440 and say, "well, that's more than 1000, so it's HD."



    ???

    720p has 720 lines, not 1440.

    1280x720 & up is HD.
  • Reply 31 of 47
    mrstep Posts: 514 member
    Quote:
    Originally Posted by SendMe View Post


    Apple was smart to do it that way instead of having 1080p as their native resolution.



    I don't think that anybody can really see the supposed difference anyway. It's like those golden ears who used to say they could hear all kinds of differences in stereos when really they all pretty much sound the same.



    I find the claim that 'only experts can tell the difference' between 720p and 1080p either total crap or... well, I guess I do graphics stuff a lot, so maybe I'm in that expert category and maybe it's true. But on the audio side, to give an example, I switched between a 'new' Denon receiver and driving the same pair of speakers through an older Aragon amp (everything else the same, Denon as pre-amp) and my wife commented that the old Aragon sounded much better. If you actually A/B compare, you'll find cases where it is obvious. On its own, AAC through whatever cheap setup sounds satisfactory; it's when you switch between that and a good source on decent speakers/amp that your jaw drops. Whether that's worth spending extra is then your personal choice.



    And I'm not talking about cryo-frozen electrical outlets or odd things for which nobody can provide a measurement (the voodoo stuff - which you may be referring to... ), but about different amps, speakers, and D/A stages, which make huge differences... try checking out the difference from AAC audio to CD to SACD sometime and even you'll agree.
  • Reply 32 of 47
    mrstep Posts: 514 member
    Quote:
    Originally Posted by Rob55 View Post


    Ok, the green magic marker thing <snip>.... I'm not saying you are wrong, just that I'd shift the example to one between a $7,500-$10,000 system and a $50,000 system. The other thing to consider is that, though I think most people could hear the difference, most wouldn't care or be willing to justify the price. This is where we agree: Apple just can't be bothered catering to a minuscule percentage of buyers who want the extra bit of performance that the majority of buyers either don't care about or can't appreciate/take advantage of.



    Exactly. Better components and design will make a difference, definitely with diminishing returns, and at some point it crosses a threshold where it's not worth the difference.



    People immediately saying 'well, anyone who claims to hear a difference is lying'... well, I think some cases (green marker, frozen outlets, etc.) are placebo effect, but on actual components that actively take part in making the sound, I think it's crazy to write it off. And for people who do, please actually go to a high-end shop sometime and listen for yourself. You may find that it ruins your current setup for you, or you may find you wouldn't spend that much for the difference, but unless you damaged your hearing, you should notice a difference from a $200 to $2000 pair or from a $2000 to $50,000 pair.



    If you try this (or if you have done so in the past) and don't hear a difference and have verified that your hearing is still OK, then please feel free to post about how all that audio stuff is a lie as far as you're concerned, but until then it's just some urban legend thing being repeated.
  • Reply 33 of 47
    ...Wilson Audio Sophia 3...made in the USA and is one of the Wilson family of speakers which would convince even the most crusty flat earther about the potential for exceptionally good sound at higher prices...revered the world over.



    ...everyone brings out the old chestnuts about cables or green markers but the fundamental fact remains:



    From MP3 to CD to 96/24 audio, the bit rate gets progressively higher, offering increasing levels of resolution, each one placing greater demands on the playback system.
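

    Putting approximate numbers on that (nominal rates only; lossy MP3/AAC rates and raw PCM rates aren't strictly comparable, but the scale of the gap is the point):

        # Nominal data rates for the formats mentioned above. CD and 96/24 are
        # uncompressed PCM; the MP3/AAC figures are compressed, lossy streams.

        def pcm_kbps(sample_rate_hz, bits_per_sample, channels=2):
            """Raw stereo PCM bit rate in kilobits per second."""
            return sample_rate_hz * bits_per_sample * channels / 1000

        formats = {
            "128 kbps MP3/AAC": 128,
            "256 kbps AAC": 256,
            "CD (44.1 kHz / 16-bit PCM)": pcm_kbps(44_100, 16),    # ~1411 kbps
            "96/24 (96 kHz / 24-bit PCM)": pcm_kbps(96_000, 24),   # ~4608 kbps
        }

        for name, kbps in formats.items():
            print(f"{name}: ~{kbps:.0f} kbps")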

    Blu-ray is far superior to DVD, but since we've grown up with low-resolution systems we have not had the opportunity to see what we've been missing. I feel sorry for the kiddywinks who have only known 128/256 or whatever lossy-bit-rate audio...no idea what music can sound like and so have no real passion to know.
  • Reply 34 of 47
    rob55 Posts: 1,291 member
    Quote:
    Originally Posted by mrstep View Post


    ...I think it's crazy to write it off. And for people who do, please actually go to a high-end shop sometime and listen for yourself. You may find that it ruins your current setup for you, or you may find you wouldn't spend that much for the difference, but unless you damaged your hearing, you should notice a difference from a $200 to $2000 pair or from a $2000 to $50,000 pair.



    Ah yes, the "ignorance is bliss" effect. It's what makes Bose so successful. If you cater to the masses who want "good enough", you can just watch the $$$ roll in.



    Quote:
    Originally Posted by wdw1234 View Post


    ...I feel sorry for the kiddywinks who have only known 128/256 or whatever lossy-bit-rate audio...no idea what music can sound like and so have no real passion to know.



    So true. A whole generation of listeners constantly being conditioned to settle for less, usually for the sake of convenience.
  • Reply 35 of 47
    Oh, come on! When I was a kid, 90% of the music we listened to was on absolutely HORRID cassette tapes. 128k MP3 is head and shoulders above that. At least there's no wow and flutter and degraded magnetic tape etc...
  • Reply 36 of 47
    rob55 Posts: 1,291 member
    Quote:
    Originally Posted by tonton View Post


    Oh, come on! When I was a kid, 90% of the music we listened to was on absolutely HORRID cassette tapes. 128k MP3 is head and shoulders above that. At least there's no wow and flutter and degraded magnetic tape etc...



    Yes, but the thing you are forgetting is that when cassettes were popular, our frame of reference was LPs or even 8-tracks, not 128k MP3/AAC files. In many ways, they were superior to their predecessors. By your logic, 78 RPM records were just fine because they were head and shoulders above the quality of the beeswax cylinders used on the original Edison phonographs, inasmuch as they were less crappy. At the time they were, but by today's standards, they are unlistenable (except from a historical perspective). Today, the 128k MP3/AAC file is the "horrid cassette tape".
  • Reply 37 of 47
    So my 42" Panasonic plasma is just 720p/1080i and the TV shows no sign of breaking.



    I was tempted to get a Blu-ray player, but I heard how slow the players are at reading and playing both DVDs and Blu-ray discs. I also figured I might not get that much value from watching Blu-ray content (as opposed to watching upscaled DVD discs), since I don't have 1080p.



    thanks

    Jim
  • Reply 38 of 47
    mr. me Posts: 3,221 member
    Quote:
    Originally Posted by elasticmedia View Post


    So my 42" Panasonic plasma is just 720p/1080i and the TV shows no sign of breaking.



    I was tempted to get a Blu-ray player, but I heard how slow the players are at reading and playing both DVDs and Blu-ray discs. I also figured I might not get that much value from watching Blu-ray content (as opposed to watching upscaled DVD discs), since I don't have 1080p.



    thanks

    Jim



    OK. You are conflating a lot of issues. First off, all flat panel displays are progressive-scan irrespective of the source material. Only CRT-based HDTVs--the few that are left--interlace half-frames in real time. All flat panels buffer the first half-frame, interlace it with the second half-frame in memory, and then display each complete frame progressively.



    To your larger point: Blu-ray players do not play DVDs or Blu-ray discs slowly. Whoever said that they do is full of it. Blu-ray discs take longer to boot up for a number of reasons. One of the reasons is that the discs contain substantially more data and the data may be more highly compressed. Blu-ray players and DVD players show no noticeable difference in the speed of booting DVDs. Once booted, however, Blu-ray and DVD players play content at the same speed.
  • Reply 39 of 47
    Thanks, Mr. Me, for correcting me. What I meant to say was that I heard the players are slow to start playing Blu-ray and that they are also slow to start playing DVDs (maybe I am wrong about this last point).



    I don't understand the first point MM makes: with my current 1080i TV, a DVD is upscaled to 1080i. With a Blu-ray player, my 1080p Blu-ray disc would only be displayed as 1080i - my whole question was "how much worse would this be than true 1080p, and is it smart to wait until I had an actual 1080p TV?"



    Quote:
    Originally Posted by Mr. Me View Post


    OK. You are conflating a lot of issues. First off, all flat panel displays are progressive-scan irrespective of the source material. Only CRT-based HDTVs--the few that are left--interlace half-frames in real time. All flat panels buffer the first half-frame, interlace it with the second half-frame in memory, and then display each complete frame progressively.



    To your larger point: Blu-ray players do not play DVDs or Blu-ray discs slowly. Whoever said that they do is full of it. Blu-ray discs take longer to boot up for a number of reasons. One of the reasons is that the discs contain substantially more data and the data may be more highly compressed. Blu-ray players and DVD players show no noticeable difference in the speed of booting DVDs. Once booted, however, Blu-ray and DVD players play content at the same speed.



  • Reply 40 of 47
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by elasticmedia View Post


    Thanks, Mr. Me, for correcting me. What I meant to say was that I heard the players are slow to start playing Blu-ray and that they are also slow to start playing DVDs (maybe I am wrong about this last point).



    I don't understand the first point MM makes: with my current 1080i TV, a DVD is upscaled to 1080i. With a Blu-ray player, my 1080p Blu-ray disc would only be displayed as 1080i - my whole question was "how much worse would this be than true 1080p, and is it smart to wait until I had an actual 1080p TV?"



    Do you really have a 1080i plasma that can't take & display a 1080p signal? If so, I didn't know those existed. Do you know the native resolution of the panel?



    Some BDs are slow to load the first time you play them. I think most of the slow loaders speed up the second time around. My Blu-Ray player doesn't load DVDs any slower than a regular DVD player.



    Maybe there are some slow players; I know my HD-DVD player takes 30 seconds just to open the tray.