
Apple's 27" iMac only supports native or 720p video input, no 1080p

post #1 of 48
Thread Starter 
Apple's revised 27" iMac, introduced this spring, debuted a new feature: a Mini DisplayPort video port that works in both directions, enabling external video sources to output video to the screen. That feature is largely limited to 720p in HDTV applications, however.

Excitement about the potential for using the iMac and its large, high quality display as an HDMI HDTV or display for a game console such as the Xbox or PlayStation 3 fizzled after it was revealed that the port only supports DisplayPort input signals, and not the VGA or DVI/HDMI video signals that most external devices use.

This limitation effectively restricts the iMac to accepting video input from recent MacBooks or other computers that produce DisplayPort video, which works significantly differently from earlier analog VGA or digital formats such as DVI/HDMI.

That technical chasm can be bridged by a converter box that accepts a DVI/HDMI signal (the two video standards are essentially the same in different packaging), transforms it to DisplayPort signaling, and scales it to the output resolution of the iMac.

A simple physical adapter won't work for video input due to the iMac's DisplayPort-only input limitation; cheap Mini DisplayPort-to-HDMI adapters can only extract the HDMI output signal the iMac generates and pushes through its Mini DisplayPort connector. They do not do any signal translation.

Two products that can do this translation work are the AV360 Mini DisplayPort Converter and Kanex HD, both of which cost $150. However, while those products appear to be capable of generating both 720p and 1080p output, the 27" iMac only accepts 720p video or its native 2560x1440 resolution.

EDID limitation

It appears the 27" iMac could accept 1080p input, and it can certainly display that resolution, which falls well within its 2560x1440 native resolution. A similar problem affects Apple's 24" LED Cinema Display, which has a native 1920x1200 resolution but only supports that resolution via its DisplayPort input. Like the 27" iMac, it won't accept a 1080p (1920x1080) signal, the common format of higher end HD equipment such as HDMI set top boxes and the PS3, even though it appears it should be able to.

The problem is that Apple's EDID (Extended Display Identification Data) on the iMac and LED Cinema Display doesn't advertise 1080p as an option. EDID is a simple data structure a display sends to output devices that outlines what video formats and settings it knows how to support. Both devices appear capable of 1080p but simply don't advertise that capability in a way that external devices like the AV360 and Kanex HD can take advantage of.
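To make that concrete, here is a minimal sketch (not a full parser) of how a source device might check the standard timing section of a raw 128-byte EDID block for a 1920x1080 mode. The field offsets follow the published EDID 1.3 layout; in practice 1080p is often advertised via a detailed timing descriptor or a CEA extension block instead, which this sketch ignores, and the `edid` bytes are assumed to come from the display:

```python
# Sketch: scan an EDID block's standard timings (bytes 38-53) for 1080p.
# EDID 1.3 aspect-ratio codes for the upper two bits of the second byte.
ASPECT = {0b00: (16, 10), 0b01: (4, 3), 0b10: (5, 4), 0b11: (16, 9)}

def standard_timings(edid: bytes):
    """Yield (width, height, refresh_hz) for each used standard timing slot."""
    for i in range(38, 54, 2):
        b1, b2 = edid[i], edid[i + 1]
        if (b1, b2) == (0x01, 0x01):           # 0x0101 marks an unused slot
            continue
        width = (b1 + 31) * 8                  # stored as width/8 - 31
        ar_w, ar_h = ASPECT[(b2 >> 6) & 0b11]
        yield width, width * ar_h // ar_w, (b2 & 0x3F) + 60

def advertises_1080p(edid: bytes) -> bool:
    return any((w, h) == (1920, 1080) for w, h, _ in standard_timings(edid))
```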

It appears Apple could update the firmware for these displays to enable support of 1080p input, allowing users to input full 1080p video from devices such as the PS3. Users might not notice a major difference, as experts say it's hard to see a real difference between 720p and 1080p on screens smaller than 50 inches.

However, some devices are hardwired to only support 1080p, and can't scale their output to the Cinema Display's slightly higher resolution or the 27" iMac's much higher resolution, forcing them to downscale to 720p or not work at all (as is the case with the LED Cinema Display, which is really only intended to work with Apple's Mini DisplayPort-equipped MacBooks and modern desktop Macs).
post #2 of 48
I discovered this after receiving my iMac27 a couple of months ago... Not a deal breaker for me personally, but the 'game loving' kids weren't so forgiving.
"Why iPhone"... Hmmm?
post #3 of 48
old news...

...also slightly inaccurate. Apple didn't build a scaler into the iMac capable of converting 1080p.

but you can have 1080p support with this device...

http://www.atlona.com/ATLONA-HDMI-MI...-SWITCHER.html

Basically you're adding the scaler Apple left out.
post #4 of 48
I have a small issue with the claim that experts say you can't notice the difference between 720 and 1080 on screens under 50".

The claim misses the other half of the equation, which is viewing distance from the screen. The 50"+ requirement is based on normal lounge room viewing distances, not sitting at a desk with a screen no more than 1.5 feet from your eyes.

The formula is known as the Lechner Distance. A calc table is available at http://hdguru.com/lechner-distance-t...ng-an-hdtv/21/

The table lists (for 1080) 37.66" as the optimal viewing distance for a 24" screen and 42.37" for a 27" screen. Both are beyond the 1.5-2' viewing distance of an iMac or Display on a desk, which means you can easily see all the detail. At 27" you can be a metre back and still see 1080p.

That line in this article is incorrect.
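For the curious, here is a quick sketch of the visual-acuity arithmetic behind those table figures, assuming the usual one-arcminute-per-pixel (20/20 vision) criterion and a 16:9 panel; it lands close to the 37.66"/42.37" numbers quoted above:

```python
# Sketch: acuity-limited viewing distance for a 1080p image on a 16:9 panel.
import math

def acuity_distance_inches(diagonal_in: float, horiz_pixels: int) -> float:
    """Farthest distance at which a 20/20 eye still resolves single pixels."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from diagonal
    pixel_pitch = width_in / horiz_pixels             # inches per pixel
    one_arcmin = math.radians(1 / 60)                 # 20/20 resolving angle
    return pixel_pitch / math.tan(one_arcmin)

for size in (24, 27):
    print(f'{size}": {acuity_distance_inches(size, 1920):.1f} in')  # ~37.5, ~42.1
```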
you only have freedom in choice when you know you have no choice
post #5 of 48
So...

I like Apple products...a lot. But I've never really understood the value of an Apple display. I like that they seem to be made from better components than other monitors, but it seems like there's a whole helluva lot of stuff that they don't do.

At first, I thought, well, the trade off is that they are made from such nice materials and offer such a great display. Lately, it seems like the displays aren't even that great.

Can someone please just tell me what most people use an Apple display for? To replace a tv? As a second monitor for design stuff (as if you needed more space if you have a 27" monitor)? This is one of those Apple products that I simply can't get my mind around.

Thanks in advance.
post #6 of 48
"Users might not notice a major difference, as experts say it's hard to see a real difference between 720p and 1080p on screens smaller than 50 inches."


Those are blind experts. I see a HUGE difference on a 46 inch TV set. If someone cannot see it on a high quality monitor with a certain angular resolution (proper distance) then he/she must have problems with vision. I would suggest a visit to an optician.

Yes, I can see the difference even on a 20 inch monitor. As long as the monitor's resolution is equal to or higher than the given HDTV standard, with proper ambient light and good color reproduction, one should see the difference in detail between 720p and 1080p. Otherwise it is like saying that it is hard to hear the difference between surround and stereo systems.
post #7 of 48
Quote:
Originally Posted by acrobratt View Post

So...

I like Apple products...a lot. But I've never really understood the value of an Apple display. I like they they seem to be made from better components than other monitors, but it seems like there is a whole helluva lot of stuff that they don't do.

At first, I thought, well, the trade off is that they are made from such nice materials and offer such a great display. Lately, it seems like the displays aren't even that great.

Can someone please just tell me what most people use an Apple display for? To replace a tv? As a second monitor for design stuff (as if you needed more space if you have a 27" monitor)? This is one of those Apple products that I simply can't get my mind around.

Thanks in advance.

There is a huge price jump when you go to displays that are higher rez than 1080p, because they require a lot of extra scaling hardware to work at different resolutions. They also require dual link DVI. That's why most consumer 27 inch displays only do 1080p. The iMac uses a very high quality IPS panel at 2560x1440.

The advantage of the iMac is that it can scale its output with the GPU, so it can operate without external scaling hardware, which makes it more compact and energy efficient.

Things get a little more complicated when adding an external input. They can't use the GPU for scaling because you're not allowed to pass HDCP content through the computer's components.

Basically, they would have to add a whole separate (and somewhat powerful) scaling processor to upscale to the display's native resolution with any decent level of quality. The reason 720p works is that it's exactly half of the iMac's native display resolution in each dimension, so you can scale simply by doubling the pixels.

720p x 2 in each direction = 2560x1440, the native resolution of the iMac's display.

You can't do that with 1080p, so it doesn't work. Apple wants to keep the cost down and wants the computer to be compact. Adding 1080p would require a lot of extra hardware.

Apple added external input so that when the computer is out of date, you can still use the display. It's not really meant for use as a TV, but hey, for $300 you can get HDMI 1080p, and for $150 you can add HDMI 720p.
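A minimal sketch of the pixel doubling described above, using numpy for brevity (a frame is assumed to be a height x width x 3 array): each 720p pixel becomes a 2x2 block, while 1080p has no such integer relationship to the panel.

```python
import numpy as np

frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)      # dummy 720p frame
frame_2x = frame_720p.repeat(2, axis=0).repeat(2, axis=1)  # each pixel -> 2x2 block
assert frame_2x.shape[:2] == (1440, 2560)                  # exactly the panel size

print(2560 / 1920, 1440 / 1080)  # 1.333... both ways: 1080p needs real scaling
```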
post #8 of 48
I find this article fairly misleading. My MacBook Pro can drive my 27" iMac at any resolution, not just 720p. If my MacBook can do it, then an adapter can do it.
post #9 of 48
"Users might not notice a major difference, as experts say it's hard to see a real difference between 720p and 1080p on screens smaller than 50 inches."

To those who have brought up the issue about 720P v 1080P, thank you. I'm really sick of this "you can't even see the difference in most cases" crap being perpetuated without adequate explanation or context. It is ridiculous to talk about comparing resolutions without even mentioning viewing distance, which is the most important factor along with dot pitch.

Regarding this issue in particular, there is more to it than just not getting to see 1080P sourced material.

Devices that cannot output 720P end up outputting 1080P, which then gets downscaled by the converter to 720P, and then upscaled to 2560x1440. This added conversion surely has to reduce the quality of the image. It would be far better to go straight from 1080P source material all the way to the monitor with no scaling.
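To illustrate the point, here is a rough sketch of the two paths using Pillow on a synthetic high-frequency test pattern (a stand-in for real 1080p content; the standard deviation is only a crude proxy for how much fine detail survives):

```python
from PIL import Image
import numpy as np

# 1080p test pattern of alternating one-pixel stripes (worst case for scalers).
pattern = ((np.indices((1080, 1920)).sum(axis=0) % 2) * 255).astype(np.uint8)
src = Image.fromarray(pattern)

# Path 1: 1080p -> 720p in the converter, then pixel-doubled by the iMac.
via_720p = src.resize((1280, 720), Image.BILINEAR).resize((2560, 1440), Image.NEAREST)

# Path 2: 1080p scaled once, straight to 2560x1440.
direct = src.resize((2560, 1440), Image.BILINEAR)

# The twice-converted image retains less of the original detail.
print(np.asarray(via_720p).std(), np.asarray(direct).std())
```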
post #10 of 48
Quote:
Originally Posted by ronparr View Post

I find this article fairly misleading. My MacBook Pro can drive my 27" iMac at any resolution, not just 720p. If my MacBook can do it, then an adapter can do it.

Actually, your MacBook is driving the 27 inch iMac only at its native resolution. The GPU in your MacBook is simply upscaling the lower resolution image before sending it out via DisplayPort. Adapters have another problem: they must convert with HDCP. Your MacBook doesn't.
post #11 of 48
Quote:
Originally Posted by AppleInsider View Post


The problem is that Apple's EDID (Extended Display Identification Data) on the iMac and LED Cinema Display doesn't advertise 1080p as an option. EDID is a simple data structure a display sends to output devices that outlines what video formats and settings it knows how to support. Both devices appear capable of 1080p but simply don't advertise that capability in a way that external devices like the AV360 and Kanex HD can take advantage of.



Nobody really needs 1080. With that size display, you can't see the difference. You can't even get 1080p at the iTunes store.

I really don't care one bit about 1080p; it is all marketing hype. 720 IS HD.
post #12 of 48
Quote:
Originally Posted by BDBLACK View Post

720p x 2 in each direction = 2560x1440, the native resolution of the iMac's display.

You can't do that with 1080p, so it doesn't work.


Apple was smart to do it that way instead of having 1080p as their native resolution.

I don't think that anybody can really see the supposed difference anyways. It's like those golden ears who used to say they could hear all kinds of differences in stereos when really they all pretty much sound the same.
post #13 of 48
...why would you imagine that since you can't hear it, that it is untrue...are you that silly?
post #14 of 48
Quote:
Originally Posted by BDBLACK View Post

There is a huge price jump when you go to displays that are higher rez than 1080p, because they require a lot of extra scaling hardware to work at different resolutions. They also require dual link DVI. That's why most consumer 27 inch displays only do 1080p. The iMac uses a very high quality IPS panel at 2560x1440.

The advantage of the iMac is that it can scale its output with the GPU, so it can operate without external scaling hardware, which makes it more compact and energy efficient.

Things get a little more complicated when adding an external input. They can't use the GPU for scaling because you're not allowed to pass HDCP content through the computer's components.

Basically, they would have to add a whole separate (and somewhat powerful) scaling processor to upscale to the display's native resolution with any decent level of quality. The reason 720p works is that it's exactly half of the iMac's native display resolution in each dimension, so you can scale simply by doubling the pixels.

720p x 2 in each direction = 2560x1440, the native resolution of the iMac's display.

You can't do that with 1080p, so it doesn't work. Apple wants to keep the cost down and wants the computer to be compact. Adding 1080p would require a lot of extra hardware.

Apple added external input so that when the computer is out of date, you can still use the display. It's not really meant for use as a TV, but hey, for $300 you can get HDMI 1080p, and for $150 you can add HDMI 720p.

Thanks. That helps. I never really used my computer for more than...well, a computer. This makes more sense to me now.
post #15 of 48
Quote:
Originally Posted by AppleInsider View Post

Apple's revised 27" iMac, introduced this spring, debuted a new feature: a Mini DisplayPort video port that works in both directions, enabling external video sources to output video to the screen. That feature is largely limited to 720p in HDTV applications, however...

Thank you Apple. Another reason for me to stay a little bit happier with my late 2008 aluminium iMac.... which does not accept any video input at all.

:-(
post #16 of 48
Quote:
Originally Posted by winterspan View Post

"Users might not notice a major difference, as experts say it's hard to see a real difference between 720p and 1080p on screens smaller than 50 inches."

To those who have brought up the issue about 720P v 1080P, thank you. I'm really sick of this "you can't even see the difference in most cases" crap being perpetuated without adequate explanation or context. It is ridiculous to talk about comparing resolutions without even mentioning viewing distance, which is the most important factor along with dot pitch.

Regarding this issue in particular, there is more to it than just not getting to see 1080P sourced material.

Devices that cannot output 720P end up outputting 1080P, which then gets downscaled by the converter to 720P, and then upscaled to 2560x1440. This added conversion surely has to reduce the quality of the image. It would be far better to go straight from 1080P source material all the way to the monitor with no scaling.

The converter I posted a link to converts 1080p directly. Nothing gets downscaled. The 720p adapters don't upscale either. They output 720p and the iMac simply doubles the pixels.
post #17 of 48
Quote:
Originally Posted by DaHarder View Post

I discovered this after receiving my iMac27 a couple of months ago... Not a deal breaker for me personally, but the 'game loving' kids weren't so forgiving.

I am confused here. How does 1080p affect games at all?

Do we use the Xbox as the console and the Mac as the screen?



Help me please, my kids are killing me.


whats in a name ? 
beatles
post #18 of 48
Quote:
Originally Posted by SendMe View Post

Nobody really needs 1080. With that size display, you can't see the difference. You can't even get 1080p at the iTunes store.

I really don't care one bit about 1080p; it is all marketing hype. 720 IS HD.

Just because you are one happy Apple fan (and apparently 50+ wearing strong glasses, because you can't see the difference) who smiles whenever they sell you old hardware at a high price doesn't mean there aren't other people who can think. 720p is HD Ready.
post #19 of 48
Quote:
Originally Posted by gotApple View Post

Just because you are one happy Apple fan (and apparently 50+ wearing strong glasses, because you can't see the difference) who smiles whenever they sell you old hardware at a high price doesn't mean there aren't other people who can think. 720p is HD Ready.

For one thing, the consoles typically only render 720p and then upscale the content. Also, the iMac screen is smaller than a typical HDTV, so when you sit at the same distance you would from a TV, you certainly shouldn't notice a difference.

Quote:
Originally Posted by brucep

I am confused here. How does 1080p affect games at all?

Do we use the Xbox as the console and the Mac as the screen?

Yeah, you use the Xbox as a console and the Mac as the screen. You can do the same thing with an Elgato adaptor:

http://www.youtube.com/watch?v=XDV7SB-pJ4o

Direct input would have less lag though.

Quote:
Originally Posted by brucep

Help me please, my kids are killing me.

The police are on their way.
post #20 of 48
Were the iMacs revised this Spring? I thought they were introduced last October and there has yet to be a refresh. Did I miss something?
post #21 of 48
Quote:
Originally Posted by cy_starkman View Post

I have a small issue with the claim that experts say you can't notice the difference between 720 and 1080 on screens under 50".

The claim misses the other half of the equation, which is viewing distance from the screen. The 50"+ requirement is based on normal lounge room viewing distances, not sitting at a desk with a screen no more than 1.5 feet from your eyes.

The formula is known as the Lechner Distance. A calc table is available at http://hdguru.com/lechner-distance-t...ng-an-hdtv/21/

The table lists (for 1080) 37.66" as the optimal viewing distance for a 24" screen and 42.37" for a 27" screen. Both are beyond the 1.5-2' viewing distance of an iMac or Display on a desk, which means you can easily see all the detail. At 27" you can be a metre back and still see 1080p.

That line in this article is incorrect.


That stinks, as I got 1080i and of course 720p on my 24-27" from HP with HDMI.

I will say this though: for sitting fairly close at a desk, when I had HD cable hooked up, some channels were 720p, others 1080i, and nearly every time the 720p looked better. It was like looking through glass. 1080i seemed grainy. Keep in mind I work in audio and video professionally and was even offered a job once by Paul, the owner of Groove Tubes, as I had one of his Solo 150 watt amp heads that had a warm, round, glassy sound when used with a 1950s/60s American coke bottle power section. He came in on a Saturday and opened his warehouse just for me; they got another head, sounded the same, got another, sounded the same, one more, yup, sounded the same, and I explained that it had lost this shimmer so convincingly that he got one more and sure enough, there it was. All the others had a tiny blown capacitor, and that's when he (jokingly?) offered me the job, as I hear and see things others may miss. So I guess 720 is cool. It's just too bad there is no HDMI yet. Booo. LOL. PEACE
post #22 of 48
Quote:
Originally Posted by SendMe View Post

Nobody really needs 1080. With that size display, you can't see the difference. You can't even get 1080p at the iTunes store.

I really don't care one bit about 1080p; it is all marketing hype. 720 IS HD.

Actually, based on the original definition of HD, which was >1000 vertical lines, 720 is NOT HD. The only reason they get away with calling it HD is that, since it's progressive scan, they double the 720 to 1440 and say, "well, that's more than 1000, so it's HD."

Quote:
Originally Posted by SendMe View Post

Apple was smart to do it that way instead of having 1080p as their native resolution.

I don't think that anybody can really see the supposed difference anyways. Its like those golden ears who used to say they could hear all kinds of differences in stereos when really they all pretty much sound the same.

Only if you're deaf. There are substantial differences in audio based on the quality of the equipment that you have. If you're only listening in the background, it doesn't matter. But for traditional foreground listening at moderate to loud levels, it makes a big difference. I'm not claiming that the $50,000 systems are 50x better than a $1000 system, but there are substantial differences in audio systems based on a large number of factors. One of the most important factors in digital reproduction is the quality of the filters that limit the frequency response to 22KHz (just under half of the CD sampling rate of 44.1KHz). Phasing is also a big issue - if audio from the woofer gets to you slower than the audio from the tweeter, for example.

And I'm very surprised that people can't tell the difference because I can spot whether something is playing back at 720p as opposed to 1080p in just a second or two on a large monitor. It might not matter on a laptop, but it matters everywhere else.
post #23 of 48
Quote:
Originally Posted by acrobratt View Post

Can someone please just tell me what most people use an Apple display for? To replace a tv? As a second monitor for design stuff (as if you needed more space if you have a 27" monitor)? This is one of those Apple products that I simply can't get my mind around.

In this case, the display is built into the iMac; you seem to be talking about the separate LED Cinema Display. The HDMI input offers a way to simplify tight quarters (think dorm or small apartment), so the iMac can stand in for a wider range of uses rather than needing to also have a TV.

It is also an LED backlit IPS or PVA display, which offers better colors and much better viewing angles than most consumer displays. You don't say what features you wanted that aren't offered in Apple's displays.

Quote:
Originally Posted by SendMe View Post

Nobody really needs 1080. With that size display, you can't see the difference. You can't even get 1080p at the iTunes store.

I really don't care one bit about 1080p; it is all marketing hype. 720 IS HD.

Few people are saying 720p isn't HD. Nobody "needs" 1080p, but just because you can't see the difference doesn't mean others can't. 720p is currently the most practical for internet use, but cable, satellite and Blu-ray offer 1080 because they have sufficient bandwidth to make it work. Incidentally, the iMac has a higher than 1080p screen as well, so somebody must be able to see higher than 1080p for it to be useful that way.

Quote:
Originally Posted by Avidfcp View Post

I will say this though, for sitting fairly close on a desk, when I had HD cable hooked up, some channels were 720p, others 1080i and nearly evetytime, the 720p looked better. It was like looking through glass. 1080i seemed grainy.

You're right, 1080i can be bad. 1080i is a difficult beast: it can look better than 720p, but for movies it requires a good deinterlacer to line up the fields. Less complicated equipment might just use the "bob" deinterlacing method, which is roughly equivalent to displaying the video as 540p.
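A minimal sketch of the "bob" method Marvin describes, with frames as numpy arrays: each 540-line field is simply line-doubled back to full height, which is why the result carries only 540 lines of real vertical detail.

```python
import numpy as np

def bob_deinterlace(frame_1080i: np.ndarray):
    """Split an interlaced frame into two line-doubled progressive images."""
    top = frame_1080i[0::2]       # even scan lines: the 540-line top field
    bottom = frame_1080i[1::2]    # odd scan lines: the 540-line bottom field
    return top.repeat(2, axis=0), bottom.repeat(2, axis=0)

frame = np.zeros((1080, 1920), dtype=np.uint8)
for image in bob_deinterlace(frame):
    assert image.shape == (1080, 1920)   # full size, but only 540p of detail
```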
post #24 of 48
Quote:
Originally Posted by wdw1234 View Post

...why would you imagine that since you can't hear it, that it is untrue...are you that silly?

Quote:
Originally Posted by zoetmb View Post

Only if you're deaf. There are substantial differences in audio based on the quality of the equipment that you have. If you're only listening in the background, it doesn't matter. But for traditional foreground listening at moderate to loud levels, it makes a big difference. I'm not claiming that the $50,000 systems are 50x better than a $1000 system, but there are substantial differences in audio systems based on a large number of factors. One of the most important factors in digital reproduction is the quality of the filters that limit the frequency response to 22KHz (just under half of the CD sampling rate of 44.1KHz). Phasing is also a big issue - if audio from the woofer gets to you slower than the audio from the tweeter, for example.

And I'm very surprised that people can't tell the difference because I can spot whether something is playing back at 720p as opposed to 1080p in just a second or two on a large monitor. It might not matter on a laptop, but it matters everywhere else.

Quote:
Originally Posted by maciekskontakt View Post

Those are blind experts. I see a HUGE difference on a 46 inch TV set. If someone cannot see it on a high quality monitor with a certain angular resolution (proper distance) then he/she must have problems with vision. I would suggest a visit to an optician.

Yes, I can see the difference even on a 20 inch monitor. As long as the monitor's resolution is equal to or higher than the given HDTV standard, with proper ambient light and good color reproduction, one should see the difference in detail between 720p and 1080p. Otherwise it is like saying that it is hard to hear the difference between surround and stereo systems.

You're confusing a tiny percentage of people with 'most people'. Apple isn't going to make a product that only 0.01% of people can benefit from. For the majority of people, there's little or no difference between 720p and 1080p on a 27" screen. With the emphasis on "NO difference".

Heck, lots of people have a hard time telling the difference between DVD and HD even on a 50" screen. It's just not the overwhelming difference that you're pretending. For the majority of people, they can enjoy a movie equally well either way. It's not like the transition from B/W to color or over the air broadcasts to cable or VHS to DVD. It's a very subtle difference.

I love the way people bring in the $50K audio systems. That's a great example: I'm willing to bet that 99% of the population can't even tell the difference between a $50K system and a good $2K system, and most of the rest would have to struggle to identify the differences. You will notice, for example, that when someone does a double blind comparison of the different systems, it almost always comes out 'no difference'. These are the same nuts who brought us green magic markers around the edges of CDs.

Admittedly, there may be an insignificant number of people who really do have the hearing (or visual) acuity to distinguish the kind of minuscule differences you're talking about, but why should Apple build a system that only a few hundred people can see? Go ahead and find someone who's willing to make a system that meets your quality standards on a 27" screen for $20,000. Perhaps they'll sell a few dozen systems.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #25 of 48
Quote:
Originally Posted by bdblack View Post

old news...

...also slightly inaccurate. Apple didn't build a scaler into the iMac capable of converting 1080p.

But you can have 1080p support with this device...

http://www.atlona.com/atlona-hdmi-mi...-switcher.html

Basically you're adding the scaler Apple left out.

@ ONLY $279.00
Amazing, isn't it?
sent from my... internet browser of choice.
post #26 of 48
Quote:
Originally Posted by 2992 View Post

@ ONLY $279.00
Amazing, isn't it?

Really, it's not worth it. The only real application where anyone would need 1080p on the 27 inch iMac is with Blu-ray players. Almost all Xbox and PlayStation games are limited to 720p, and broadcast television is generally 720p or 1080i; 720p will look much better on that display because it scales evenly (1 pixel is displayed as 4 pixels).

Making 1080p fit a 1440p display really isn't all that easy. LCDs only work well at their native resolutions. Even when you use a high quality $300 scaler to convert 1080 to 1440, you're going to lose sharpness in the image because it doesn't scale evenly.

Most consumer LCDs I've seen usually look really ugly when trying to upscale lower resolutions. Basically they double a percentage of the scan lines to fill the gaps. This makes some pixels bigger than others and looks really ugly. I think it's really a good thing that the iMac's display can upscale 720p so nicely. The only other option would have been to make the display 1080p natively, which would lower the DPI significantly and compromise its usability as a computer display.
post #27 of 48
Quote:
Originally Posted by zoetmb View Post

Actually, based on the original definition of HD, which was >1000 vertical lines, 720 is NOT HD. The only reason they get away with calling it HD is that, since it's progressive scan, they double the 720 to 1440 and say, "well, that's more than 1000, so it's HD."

...

Completely and totally wrong. HDTV in most of North America is defined by the ATSC. The ATSC defines 1080i and 1080p with a ratio of 16:9 as HD. It also defines 720p as HD, with specifications that I won't get into. In other countries, other standards bodies have different but similar definitions. There is no such thing as Full HD except as a marketing term.

As to the merit of 720p, it is great for rapid motion, making it the choice for broadcasters that concentrate on sports. This is why ABC/ESPN and FOX chose it. CBS and NBC chose 1080i because it places more pixels on the screen each second. Because it is interlaced, however, 1080i may suffer dot-crawl like NTSC.

Now to the issue of scalers: HDTV is TV. Whether 1080i or 720p, all HDTV assumes that the image may be scaled between the camera and display. You simply have no guarantee that video that was recorded in 1080i will be displayed in 1080i. It may very well be scaled to 720p and rescaled to 1080p and rescaled again. You don't even have a guarantee that the display monitor will be limited to 720p or 1080i or some integral multiple or division thereof. For example, many 720p displays are actually 1366 x 768 and require scaling to display 720p video.
post #28 of 48
Quote:
Originally Posted by jragosta View Post

I love the way people bring in the $50K audio systems. That's a great example: I'm willing to bet that 99% of the population can't even tell the difference between a $50K system and a good $2K system, and most of the rest would have to struggle to identify the differences. You will notice, for example, that when someone does a double blind comparison of the different systems, it almost always comes out 'no difference'. These are the same nuts who brought us green magic markers around the edges of CDs.

Ok, the green magic marker thing was pretty ridiculous as were a number of those other audiophile "tweaks", but I seriously doubt that the results from a "double-blind" listening test between a $50k system and a $2k system would be "no difference". Back in the day when I worked for a high-end A/V outfit, many of my customers would tell me "I can't hear the difference between different speakers" before even listening to them. After I played a demo for them, every single one (literally) said that they heard a difference and most of the time preferred the more expensive speaker. Not saying that the more expensive one always sounded better, but just that they always heard a difference.

What I found was that this difference was most pronounced between the "cheap" or low-end speakers and the mid-line stuff. Typically, this was between the $500-$1,000 speakers and the $2,000 and up speakers. As you went up in price, you typically got proportionally less and less improvement for the extra dollars spent. I would have a hard time telling the difference between $5,000 and $10,000 speakers. However, the difference between a $2,000 system and a $50,000 system would still be noticeable and yes, the majority would agree that the $50,000 system sounded better. It wouldn't be night and day, but there would be a difference and I'd bet most people, whether they admit it or not, could hear the difference.

I'm not saying you are wrong, just that I'd shift the example to one between a $7,500-$10,000 system and a $50,000 system. The other thing to consider is that, though I think most people could hear the difference, most wouldn't care or be willing to justify the price. This is where we agree: Apple just can't be bothered catering to a minuscule percentage of buyers who want the extra bit of performance that the majority of buyers either don't care about or can't appreciate/take advantage of.
post #29 of 48
Quote:
Originally Posted by Mr. Me View Post

CBS and NBC chose 1080i because it places more pixels on the screen each second. Because it is interlaced, however, 1080i may suffer dot-crawl like NTSC.

1080i video may suffer from a number of artifacts, but dot-crawl is not one of them. Dot-crawl is the result of crosstalk between the luminance and chrominance parts of a composite video signal, where the two are mixed together. Since the signal for 1080i HD is either in component form (if analog) or digital (HDMI), this is simply not an issue. Typically, the artifacts that are most noticeable with 1080i are motion related, due to improper de-interlacing of the signal.
post #30 of 48
I'd like to offer my theory to explain the "I can't see the difference" crowd (of which I am a member).

1. Everybody has their own "comfortable viewing distance" for a given screen size. I'm going to say that given 1 screen and 1 chair in an empty room, you will move that chair so that the screen occupies a certain percentage of your field of view. You'll move closer to a small screen and further from a large screen, but you'll choose a distance that occupies roughly the same portion of your retina (and brain). So let's call that the "comfortable viewing size". Just a theory.

2. We all know some people like to sit closer to the same screen than others. If you like to sit closer, your "comfortable viewing size" will allow you to appreciate pixel definition better than I can.

3. This seems to me to explain why some people MUST have Blu-ray and others CAN'T see the point.

4. "Don't sit so close - You'll ruin your eyes!"
post #31 of 48
Quote:
Originally Posted by zoetmb View Post

Actually, based on the original definition of HD, which was >1000 vertical lines, 720 is NOT HD. The only reason they get away with calling it HD is that, since it's progressive scan, they double the 720 to 1440 and say, "well, that's more than 1000, so it's HD."

???
720p has 720 lines, not 1440.
1280x720 and up is HD.
post #32 of 48
Quote:
Originally Posted by SendMe View Post

Apple was smart to do it that way instead of having 1080p as their native resolution.

I don't think that anybody can really see the supposed difference anyways. It's like those golden ears who used to say they could hear all kinds of differences in stereos when really they all pretty much sound the same.

I find the claim that 'only experts can tell the difference' between 720p and 1080p either total crap or... well, I guess I do graphics stuff a lot, so maybe I'm in that expert category and maybe it's true. But on the audio side, to give an example, I switched between a 'new' Denon receiver and driving the same pair of speakers through an older Aragon amp (everything else the same, Denon as pre-amp) and my wife commented that the old Aragon sounded much better. If you actually A/B compare, you'll find cases where it is obvious. On its own, AAC through whatever cheap setup sounds satisfactory; it's when you switch between that and a good source on decent speakers/amp that your jaw drops. Whether that's worth spending extra is then your personal choice.

And I'm not talking about cryo-frozen electrical outlets or odd things for which nobody can provide a measurement (the voodoo stuff - which you may be referring to... ), but different amps, speakers, D/A stages which make huge differences... try checking out the difference from AAC audio to CD to SACD sometime and even you'll agree.
post #33 of 48
Quote:
Originally Posted by Rob55 View Post

Ok, the green magic marker thing <snip>.... I'm not saying you are wrong, just that I'd shift the example to one between a $7,500-$10,000 system and a $50,000 system. The other thing to consider is that, though I think most people could hear the difference, most wouldn't care or be willing to justify the price. This is where we agree: Apple just can't be bothered catering to a minuscule percentage of buyers who want the extra bit of performance that the majority of buyers either don't care about or can't appreciate/take advantage of.

Exactly. Better components and design will make a difference, definitely with diminishing returns, and at some point it crosses a threshold where it's not worth the difference.

People immediately saying 'well, anyone who claims to hear a difference is lying'... well, I think some cases (green marker, frozen outlets, etc.) are placebo effect, but on actual components that actively take part in making the sound, I think it's crazy to write it off. And for people who do, please actually go to a high-end shop sometime and listen for yourself. You may find that it ruins your current setup for you, or you may find you wouldn't spend that much for the difference, but unless you damaged your hearing, you should notice a difference from a $200 to $2000 pair or from a $2000 to $50,000 pair.

If you try this (or if you have done so in the past) and don't hear a difference and have verified that your hearing is still OK, then please feel free and post about how all that audio stuff is a lie as far as you're concerned, but until then it's just some urban legend thing being repeated.
post #34 of 48
...Wilson Audio Sophia 3...made in the USA and is one of the Wilson family of speakers which would convince even the most crusty flat earther about the potential for exceptionally good sound at higher prices...revered the world over.

...everyone brings out the old chestnuts about cables or green markers but the fundamental fact remains:

From MP3 to CD to 96/24 audio, the bit rate is increasingly higher, offering increasing levels of resolution, each one placing greater demands on the playback system.
Blu-ray is far superior to DVD, but since we've grown up with low resolution systems we have not had the opportunity to see what we've been missing. I feel sorry for the kiddywinks who have only known 128/256 or whatever lossy bit rate audio...no idea what music can sound like and so have no real passion to know.
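The arithmetic behind that bit rate claim, for anyone who wants the numbers (stereo, uncompressed PCM assumed for CD and 96/24; the MP3/AAC figures are the familiar encoder rates):

```python
def pcm_kbps(sample_rate_hz: int, bit_depth: int, channels: int = 2) -> float:
    """Raw PCM bit rate in kilobits per second."""
    return sample_rate_hz * bit_depth * channels / 1000

print(pcm_kbps(44_100, 16))   # CD audio: 1411.2 kbps
print(pcm_kbps(96_000, 24))   # 96/24:    4608.0 kbps
# versus the 128 or 256 kbps of typical lossy MP3/AAC downloads
```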
post #35 of 48
Quote:
Originally Posted by mrstep View Post

...I think it's crazy to write it off. And for people who do, please actually go to a high-end shop sometime and listen for yourself. You may find that it ruins your current setup for you, or you may find you wouldn't spend that much for the difference, but unless you damaged your hearing, you should notice a difference from a $200 to $2000 pair or from a $2000 to $50,000 pair.

Ah yes, the "ignorance is bliss" effect. It's what makes Bose so successful. If you cater to the masses who want "good enough", you can just watch the $$$ roll in.

Quote:
Originally Posted by wdw1234 View Post

...I feel sorry for the kiddywinks who have only known 128/256 or whatever lossy bit rate audio...no idea what music can sound like and so have no real passion to know.

So true. A whole generation of listeners constantly being conditioned to settle for less, usually for the sake of convenience.
post #36 of 48
Oh, come on! When I was a kid, 90% of the music we listened to was on absolutely HORRID cassette tapes. 128k MP3 is head and shoulders above that. At least there's no wow and flutter and degraded magnetic tape etc...
post #37 of 48
Quote:
Originally Posted by tonton View Post

Oh, come on! When I was a kid, 90% of the music we listened to was on absolutely HORRID cassette tapes. 128k MP3 is head and shoulders above that. At least there's no wow and flutter and degraded magnetic tape etc...

Yes, but the thing you are forgetting is that when cassettes were popular, our frame of reference was LPs or even 8-tracks, not 128k MP3/AAC files. In many ways, they were superior to their predecessors. By your logic, 78 RPM records were just fine because they were head and shoulders above the quality of the beeswax cylinders used on the original Edison phonographs, inasmuch as they were less crappy. At the time they were fine, but by today's standards they are unlistenable (except from a historical perspective). Today, the 128k MP3/AAC file is the "horrid cassette tape".
post #38 of 48
So my 42" Panasonic plasma is just 720p/1080i and the TV shows no sign of breaking.

I was tempted to get a Blu-ray player, but I heard how slow the players are at reading and playing both DVDs and Blu-ray discs. I also figured I might not get that much value from watching Blu-ray content (as opposed to watching upscaled DVDs), since I don't have 1080p.

thanks
Jim
post #39 of 48
Quote:
Originally Posted by elasticmedia View Post

So my 42" Panasonic plasma is just 720p/1080i and the TV shows no sign of breaking.

I was tempted to get a Blu-ray player, but I heard how slow the players are at reading and playing both DVDs and Blu-ray discs. I also figured I might not get that much value from watching Blu-ray content (as opposed to watching upscaled DVDs), since I don't have 1080p.

thanks
Jim

OK. You are conflating a lot of issues. First off, all flat panel displays are progressive-scan irrespective of the source material. Only CRT-based HDTVs--the few that are left--interlace half-frames in real time. All flat panels buffer the first half-frame, interlace it with the second half-frame in memory, and then display each complete frame progressively.

To your larger point: Blu-ray players do not play DVDs or Blu-ray discs slowly. Whoever said that they do is full of it. Blu-ray discs take longer to boot up for a number of reasons. One of the reasons is that the discs contain substantially more data and the data may be more highly compressed. Blu-ray players and DVD players show no noticeable difference in the speed of booting DVDs. Once booted, however, Blu-ray and DVD players play content at the same speed.
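A minimal sketch of the field-buffering Mr. Me describes, with fields as numpy arrays: the panel holds the first 540-line field, interleaves it with the second in memory, and presents the complete frame progressively.

```python
import numpy as np

def weave_fields(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    """Interleave two buffered 540-line fields into one 1080-line frame."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=top.dtype)
    frame[0::2] = top        # field 1 fills the even scan lines
    frame[1::2] = bottom     # field 2 fills the odd scan lines
    return frame             # displayed progressively, all at once

top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.full((540, 1920), 255, dtype=np.uint8)
assert weave_fields(top, bottom).shape == (1080, 1920)
```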
post #40 of 48
Thanks, Mr. Me, for correcting me. What I meant to say was that I heard the players are slow to start playing Blu-ray and that they are also slow to start playing DVDs (maybe I am wrong about this last point).

I don't understand the first point MM makes: with my current 1080i TV, a DVD is upscaled to 1080i. With a Blu-ray player, my 1080p Blu-ray disc would only be displayed as 1080i. My whole question was "how much worse would this be than true 1080p, and is it smart to wait until I have an actual 1080p TV?"

Quote:
Originally Posted by Mr. Me View Post

OK. You are conflating a lot of issues. First off, all flat panel displays are progressive-scan irrespective of the source material. Only CRT-based HDTVs--the few that are left--interlace half-frames in real time. All flat panels buffer the first half-frame, interlace it with the second half-frame in memory, and then display each complete frame progressively.

To your larger point: Blu-ray players do not play DVDs or Blu-ray discs slowly. Whoever said that they do is full of it. Blu-ray discs take longer to boot up for a number of reasons. One of the reasons is that the discs contain substantially more data and the data may be more highly compressed. Blu-ray players and DVD players show no noticeable difference in the speed of booting DVDs. Once booted, however, Blu-ray and DVD players play content at the same speed.