Intel's Ivy Bridge support for 4K resolution could pave way for "Retina" Macs
Intel quietly revealed last week that its next-generation Ivy Bridge processors will support the 4K display resolution, with up to 4096 x 4096 pixels per monitor, potentially paving the way for Apple to introduce high-resolution "Retina Display" Macs.
The world's largest chipmaker announced the news during a technical session at its Intel Developer Forum in San Francisco last week, as noted by VR-Zone. Ivy Bridge chips will rival competing discrete GPUs by including support for the 4K resolution when they arrive next year.
The company also highlighted a Multi-Format Codec (MFX) engine capable of playing multiple 4K videos at once. The engine can also handle video processing for 4K QuadHD video, a standard that YouTube began supporting last year.
A set of performance enhancements, with special attention to graphics, should give Ivy Bridge as much as a 60 percent performance boost over the current generation of Sandy Bridge chips, according to Intel.
Intel also revealed last week that Ivy Bridge chips will include support for Apple's OpenCL standard, which should give a performance boost to next-generation MacBook Air and 13-inch MacBook Pro models when they arrive in 2012.
If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."
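The arithmetic behind that claim can be checked with a few lines of Python. The sketch below assumes the standard rule underpinning Apple's "Retina" marketing: a 20/20 eye resolves detail down to about one arcminute of visual angle, so a display qualifies once its pixels subtend less than that at the stated viewing distance.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixel density of a display: diagonal pixels over diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

def retina_threshold_ppi(distance_in):
    """Minimum density at which a 20/20 eye can no longer resolve
    individual pixels: one pixel per arcminute of visual angle."""
    return 1 / (distance_in * math.tan(math.radians(1 / 60)))

density = ppi(4096, 2304, 27)          # ~174 ppi for a 27-inch 4K panel
threshold = retina_threshold_ppi(24)   # ~143 ppi needed at 24 inches
print(density, threshold, density >= threshold)
```

At 24 inches the threshold works out to roughly 143 ppi, so the 174 ppi of a 27-inch 4K panel clears it comfortably, matching the figures in the paragraph above.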
Apple first began using the "Retina Display" marketing term with the iPhone 4 last year. Then-CEO Steve Jobs touted the 326 ppi display as being beyond the capabilities of the human retina when used at a distance of 12 or more inches from the eyes.
In September 2010, the company released a Retina Display iPod touch. Rumors have also swirled that Apple will follow suit with a high-resolution version of the third-generation iPad, doubling the tablet's resolution in each dimension to 2048 x 1536.
Of course, Macs that take full advantage of the 4K resolution capabilities built into future generations of Intel's chips would take some time to arrive, as Apple will need to resolve price and production constraints before releasing a Retina Display desktop or notebook. But 3200 x 2000 desktop wallpapers were discovered in a Developer Preview of Mac OS X Lion earlier this year and appear to telegraph a future resolution bump for Apple's line of Mac computers.
Also of note, Apple added 4K support to its Final Cut Pro video editing program when it released version X in June. However, Final Cut Pro X has caused a controversy, as some users have complained that the application is no longer "pro" software.
Comments
2015 can't come soon enough.
You're probably close; it doesn't look like 2012 will be the year of Retina Macs, but maybe.
Not a chance. Until OS X has resolution independence, it would do more harm than good. As things are today, certain parts of the OS and software are already becoming microscopic on larger monitors.
You can always double the resolution (like on the iPhone). Then that's not a problem...
But it's funny how the article keeps talking about 4K video, like that's the use case! Most movies currently can't even afford to shoot in 4K! This is all about text rendering...
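The pixel-doubling idea the commenter mentions is simple to illustrate. In the sketch below, the iPhone figures are the shipping resolutions; the doubled 27-inch iMac number is a hypothetical, and notably it would exceed the 4096-pixel-per-side limit Intel quoted for Ivy Bridge.

```python
# "Pixel doubling," as Apple did on the iPhone 4: each UI point maps to a
# 2x2 block of physical pixels, so elements keep their physical size on
# screen but are drawn with four times the detail.

def doubled(width_px, height_px):
    return width_px * 2, height_px * 2

# iPhone 3GS (480 x 320) -> iPhone 4 Retina display
print(doubled(480, 320))     # (960, 640)

# Hypothetical doubled 27-inch iMac (2560 x 1440 today): 5120 x 2880,
# which would exceed Ivy Bridge's stated 4096 x 4096 maximum.
print(doubled(2560, 1440))   # (5120, 2880)
```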
2015 can't come soon enough.
I was gonna say, maybe 2013 or so?
I'll be in the market for a new computer then! I'm drooling over the possibilities
MacBook Air (2013/2014?)
1 TB SSD
13" Retina Display (4K)
Low voltage, QuadCore Processor
16 GB RAM
10 hour battery life (?!)
I wonder if this is realistic or just childish embellishing! Computer specs seem to have plateaued in recent years, but there seems to be lots of progress in low-voltage processors, SSDs, and high-resolution displays...
I'm so sick of Intel's BS.
They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.
The first GPUs to pump out 4k won't come from Intel.
Intel is to GPUs what Nvidia is to ARM chips: a whole lot of hot air, big promises long before the product ships, but consistently under-delivering compared to the competition.
Most likely Intel has its Ivy Bridge GPUs in the labs and they "rival discrete GPUs" from the current generation, i.e. about as fast as AMD Fusion. I predict that by the time they get released to the market they'll still be a generation behind in performance...
Intel makes terrific CPUs, but it should stop trying to do GPUs.
I'll be in the market for a new computer then! I'm drooling over the possibilities
MacBook Air (2013/2014?)
1 TB SSD
I think 256GB, or 512GB maximum
13" Retina Display (4K)
you wish?
Low voltage, QuadCore Processor
or maybe 8 cores?
16 GB RAM
4GB or max 8GB
10 hour battery life (?!)
Isn't Intel talking about 24-hour battery life with the Haswell architecture, which is due in 2013?
Once monitors reach retina resolution, is there any point in going any higher? Perhaps someone will invent an improvement to our eyes and we will be back at square one.
Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.
I'm so sick of Intel's BS.
They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.
The first GPUs to pump out 4k won't come from Intel.
Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.
I am guessing HiDPI will be available in the OS X release subsequent to Lion.
Functionally, HiDPI should allow lower resolution on Retina screens without loss of viewing quality.
Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.
Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?
Sorry, but AMD's APUs run graphical circles around anything Intel will ever produce, now and in the future.