Intel's Ivy Bridge support for 4K resolution could pave way for "Retina" Macs

Intel quietly revealed last week that its next-generation Ivy Bridge processors will support the 4K display resolution, with up to 4096 x 4096 pixels per monitor, potentially paving the way for Apple to introduce high-resolution "Retina Display" Macs.



The world's largest chipmaker announced the news during a technical session at its Intel Developer Forum in San Francisco last week, as noted by VR-Zone. Ivy Bridge chips will rival discrete GPUs by including support for the 4K resolution when they arrive next year.



The company also highlighted a Multi Format Codec (MFX) engine capable of playing multiple 4K videos at once, as well as handling video processing for 4K QuadHD video, a standard that YouTube began supporting last year.



A set of enhancements, with special attention to graphics, should give Ivy Bridge as much as a 60 percent performance boost over the current generation of Sandy Bridge chips, according to Intel.



Intel also revealed last week that Ivy Bridge chips will include support for the Apple-developed OpenCL standard, which should give a performance boost to next-generation MacBook Air and 13-inch MacBook Pro models when they arrive in 2012.











If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."
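For anyone who wants to check that arithmetic, here is a minimal sketch in Python. The panel size, resolution and 24-inch viewing distance are the figures stated above; the one-arcminute angle is the usual rule of thumb for 20/20 acuity, not anything Intel or Apple has published.

```python
from math import hypot, radians, tan

# Figures from the paragraph above: a 27-inch 16:9 panel at 4K,
# viewed from 24 inches by someone with 20/20 vision.
width_px, height_px = 4096, 2304
diagonal_in = 27.0
viewing_distance_in = 24.0

# Pixel density: pixels along the diagonal divided by the diagonal size.
ppi = hypot(width_px, height_px) / diagonal_in  # ~174 ppi

# 20/20 vision resolves roughly one arcminute, so individual pixels blend
# together once their pitch is smaller than the arc one arcminute subtends
# at the viewing distance (distance * tan(1 arcminute), in inches).
retina_threshold_ppi = 1 / (viewing_distance_in * tan(radians(1 / 60)))  # ~143 ppi

print(f"{ppi:.0f} ppi vs. a ~{retina_threshold_ppi:.0f} ppi threshold at 24 in")
# 174 ppi comfortably exceeds ~143 ppi, hence the "Retina Display" claim above.
```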







Apple first began using the "Retina Display" marketing term with the iPhone 4 last year. Then-CEO Steve Jobs touted the 326 ppi display as being beyond the capabilities of the human retina when viewed at a distance of 12 or more inches from the eyes.



In September 2010, the company released a Retina Display iPod touch. Rumors have also swirled that Apple will follow suit with a high-resolution version of the third-generation iPad, doubling the tablet's resolution in each dimension to 2048 x 1536.



Of course, Macs that take full advantage of the 4K capabilities built into future generations of Intel's chips would take some time to arrive, as Apple will need to resolve price and production constraints before releasing a Retina Display desktop or notebook. But 3200 x 2000 desktop wallpapers were discovered in a Developer Preview of Mac OS X Lion earlier this year and appear to telegraph a future resolution bump for Apple's line of Mac computers.



Also of note, Apple added 4K support to its Final Cut Pro video editing program when it released version X in June. However, Final Cut Pro X has proved controversial, with some users complaining that the application is no longer "pro" software.

Comments

  • Reply 1 of 54
    I am looking forward to the day when my 27" Cinema ("Thunderbolt") or whatever display is as incredibly sharp as the iPhone 4. For a 3.5" screen to have 326 ppi, along with an LED-backlit IPS panel, was something to behold. The iPad 3 will arguably be next to have an extremely high ppi rating at the almost 10" mark, but imagine "Retina"-like numbers on screens over 20" or 25"...
  • Reply 2 of 54
    Yes oh yes, 4K displays with resolution independence in the OS -- it is coming!!! My 16-megapixel photos from my D7000 would look great on a 4K display! WOW
  • Reply 3 of 54
    2015 can't come soon enough.
  • Reply 4 of 54
    hmm Posts: 3,405 member
    Resolution is just one factor but I'd really like to see this. Of course GPU technology has a bit of a way to go, and I wish Apple would consider full DisplayPort connectors rather than this Mini DisplayPort crap (better bandwidth, tighter connection). Thunderbolt is fully compatible with DisplayPort protocols anyway.
  • Reply 5 of 54
    Quote:
    Originally Posted by september11th View Post


    2015 can't come soon enough.



    You're probably close, as it doesn't look like 2012 is the year of Retina Macs, but maybe.
  • Reply 6 of 54
    2oh1 Posts: 503 member
    Quote:
    Originally Posted by BUSHMAN4 View Post


    You're probably close, as it doesn't look like 2012 is the year of Retina Macs, but maybe.



    Not a chance. Until OS X has resolution independence, it would do more harm than good. As things are today, certain parts of the OS and software are already becoming microscopic on larger monitors.
  • Reply 7 of 54
    Quote:
    Originally Posted by 2oh1 View Post


    Not a chance. Until OS X has resolution independence, it would do more harm than good. As things are today, certain parts of the OS and software are already becoming microscopic on larger monitors.



    You can always double the resolution (like on the iPhone). Then that's not a problem...



    But it's funny how the article keeps talking about 4K video. Like that's the use case! Most movies currently can't even afford to shoot in 4K! This is all about text rendering...
  • Reply 8 of 54
    I'm so sick of Intel's BS.



    They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.



    The first GPUs to pump out 4k won't come from Intel.
  • Reply 9 of 54
    Quote:
    Originally Posted by september11th View Post


    2015 can't come soon enough.



    I was gonna say, maybe 2013 or so?



    I'll be in the market for a new computer then! I'm drooling over the possibilities



    MacBook Air (2013/2014?)

    1 TB SSD

    13" Retina Display (4K)

    Low voltage, QuadCore Processor

    16 GB RAM

    10 hour battery life (?!)



    I wonder if this is realistic or just childish embellishing! Computer specs seem to have plateaued in recent years, but there seems to be lots of progress in low-voltage processors, SSDs, and high-resolution displays...
  • Reply 10 of 54
    pooch Posts: 768 member
    i'm "Imaging a wall of 4K videos!" right now. yeesh.
  • Reply 11 of 54
    ascii Posts: 5,936 member
    Once monitors reach retina resolution, is there any point going any higher? Perhaps someone will invent an improvement to our eyes and we will be back at square one.
  • Reply 12 of 54
    Quote:
    Originally Posted by mdriftmeyer View Post


    I'm so sick of Intel's BS.



    They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.



    The first GPUs to pump out 4k won't come from Intel.



    Intel is to GPUs like Nvidia is to ARM chips: a whole lot of hot air, big promises long before the product ships, but consistently under-delivering compared to the competition.



    Most likely Intel has their Ivy Bridge GPUs in the labs and they "rival discrete GPUs" from the current generation, i.e. about as fast as AMD Fusion. I predict that by the time they get released to the market they'll still be a generation behind in performance....



    Intel makes terrific CPUs, but they should stop trying to do GPUs.
  • Reply 13 of 54
    Quote:
    Originally Posted by acslater017 View Post


    I was gonna say, maybe 2013 or so?

    I'll be in the market for a new computer then! I'm drooling over the possibilities

    MacBook Air (2013/2014?)

    1 TB SSD

    13" Retina Display (4K)

    Low voltage, QuadCore Processor

    16 GB RAM

    10 hour battery life (?!)



    1 TB SSD? I think 256GB or maximum 512GB. A 13" Retina Display (4K)? You wish. A low-voltage quad-core processor? Or maybe 8 cores. 16 GB RAM? 4GB or max 8GB. And isn't Intel talking about 24 hours of battery life with the Haswell architecture, which is due in 2013?
  • Reply 14 of 54
    Quote:
    Originally Posted by cutykamu View Post


    I think 256GB or maximum 512GB. A 4K screen? You wish. 4GB or max 8GB RAM. And isn't Intel talking about 24 hours of battery life with the Haswell architecture, which is due in 2013?



    Quote:
    Originally Posted by d-range View Post


    Intel is to GPUs like Nvidia is to ARM chips: a whole lot of hot air, big promises long before the product ships, but consistently under-delivering compared to the competition.

    Most likely Intel has their Ivy Bridge GPUs in the labs and they "rival discrete GPUs" from the current generation, i.e. about as fast as AMD Fusion. I predict that by the time they get released to the market they'll still be a generation behind in performance....

    Intel makes terrific CPUs, but they should stop trying to do GPUs.
  • Reply 15 of 54
    jragosta Posts: 10,473 member
    Quote:
    Originally Posted by ascii View Post


    Once monitors reach retina resolution, is there any point going any higher? Perhaps someone will invent an improvement to our eyes and we will be back at square one



    Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.



    Quote:
    Originally Posted by mdriftmeyer View Post


    I'm so sick of Intel's BS.



    They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.



    The first GPUs to pump out 4k won't come from Intel.



    Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.
  • Reply 16 of 54
    bertp Posts: 274 member
    I can't speak to the hardware issues. But I have tried out HiDPI, which is available through Xcode 4.1. It seems that Apple is pretty far along with the development of this feature for OS X. It worked well for me, although people like John Siracusa say it still retains flaws.



    I am guessing HiDPI will be available in the OS X release after Lion.



    Functionally, HiDPI should allow a lower effective resolution on Retina screens without loss of viewing quality.
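    To make that idea concrete, here is a minimal conceptual sketch in Python (the function name and figures are illustrative, not Apple's actual API): in a HiDPI mode the interface is laid out in points and only multiplied by a scale factor when it is rasterized to pixels, so doubling the pixel grid adds detail without shrinking anything on screen.

    ```python
    # Conceptual sketch only -- names and numbers here are illustrative, not Apple's API.
    def points_to_pixels(width_pt: int, height_pt: int, scale: float = 2.0) -> tuple[int, int]:
        """Map a point-based layout size to device pixels at a given scale factor."""
        return int(width_pt * scale), int(height_pt * scale)

    # A desktop laid out as 1280 x 800 points, drawn at 2x, fills a 2560 x 1600
    # panel: every window keeps its apparent size, but text and artwork are
    # rendered with four times as many pixels.
    print(points_to_pixels(1280, 800))        # (2560, 1600)
    print(points_to_pixels(1280, 800, 1.0))   # (1280, 800) -- the non-HiDPI case
    ```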
  • Reply 17 of 54
    ireland Posts: 17,798 member
    Stop with the quietly already.
  • Reply 18 of 54
    Quote:
    Originally Posted by jragosta View Post


    Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.







    Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.



    Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?



    Sorry, but AMD's APUs, now and in the future, graphically run circles around anything Intel will ever produce.
  • Reply 19 of 54
    tipoo Posts: 1,141 member
    That would be awesome, and Apple does have a history of putting well-above-average-cost displays in their products, but even with a high-end GPU for anything the IGP couldn't handle, you'd be running lots of games in interlaced mode, down-sampling full-screen 1080p, etc. Plus, OS X still doesn't have full resolution independence. No, just because the IGP supports it doesn't mean Apple will follow suit with a display. AMD's Eyefinity can support huge resolutions too, after all.
  • Reply 20 of 54
    You need to look at RED. Shooting 4K is not only affordable but available now!



    Quote:
    Originally Posted by foobar View Post


    You can always double the resolution (like on the iPhone). Then that's not a problem...



    But it's funny how the article keeps talking about 4K video. Like that's the use case! Most movies currently can't even afford to shoot in 4K! This is all about text rendering...


