
Intel's Ivy Bridge support for 4K resolution could pave way for "Retina" Macs

post #1 of 55
Thread Starter 
Intel quietly revealed last week that its next-generation Ivy Bridge processors will support the 4K display resolution, with up to 4096 x 4096 pixels per monitor, potentially paving the way for Apple to introduce high-resolution "Retina Display" Macs.

The world's largest chipmaker announced the news during a technical session at its Intel Developer Forum in San Francisco last week, as noted by VR-Zone. Ivy Bridge chips will rival competing discrete GPUs by including support for the 4K resolution when they arrive next year.

The company also highlighted a Multi-Format Codec (MFX) engine capable of playing multiple 4K videos at once. The engine can also handle video processing for 4K Quad HD video, a standard that YouTube began supporting last year.

A set of performance enhancements, with special attention to graphics, should give Ivy Bridge as much as a 60 percent performance boost over the current generation of Sandy Bridge chips, according to Intel.

Intel also revealed last week that Ivy Bridge chips will include support for OpenCL, the parallel computing standard originally developed by Apple, which should give a performance boost to next-generation MacBook Air and 13-inch MacBook Pro models when they arrive in 2012.





If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."
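A back-of-the-envelope sketch of that arithmetic (ours, not Intel's or Apple's, using the common one-arcminute visual-acuity criterion; the function names are illustrative):

import math

def ppi(h_px, v_px, diagonal_in):
    """Pixel density from a display's resolution and diagonal size."""
    return math.hypot(h_px, v_px) / diagonal_in

def retina_threshold_ppi(distance_in, acuity_arcmin=1.0):
    """Minimum PPI at which one pixel subtends less than acuity_arcmin
    arcminutes at the given viewing distance -- the rough criterion
    behind the "Retina" label."""
    pixel_pitch_in = distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_pitch_in

density = ppi(4096, 2304, 27)         # ~174 PPI, matching the figure above
threshold = retina_threshold_ppi(24)  # ~143 PPI needed at 24 inches
print(f"{density:.0f} vs {threshold:.0f} PPI -> retina: {density > threshold}")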



Apple first began using the "Retina Display" marketing term with the iPhone 4 last year. Then-CEO Steve Jobs touted the 326ppi display as being beyond the resolving power of the human retina when viewed from 12 or more inches away.

In September 2010, the company released a Retina Display iPod touch. Rumors have also swirled that Apple will follow suit with a high-resolution version of the third-generation iPad, doubling the resolution of the tablet to 2048 x 1536.

Of course, Macs that take full advantage of the 4K capabilities built into future generations of Intel's chips will take some time to arrive, as Apple will need to resolve price and production constraints before releasing a Retina Display desktop or notebook. But 3200 x 2000 desktop wallpapers were discovered in a Developer Preview of Mac OS X Lion earlier this year and appear to telegraph a future resolution bump for Apple's Mac line.

Also of note, Apple added 4K support to its Final Cut Pro video editing program when it released version X in June. However, Final Cut Pro X has caused controversy, as some users have complained that the application is no longer "pro" software.
post #2 of 55
I am looking forward to the day when my 27" Cinema ("ThunderBolt") or whatever display is as incredibly sharp as the iPhone 4. For a 3.5" screen to have 326ppi, along with an LED-backlit IPS panel, was something to behold. The iPad 3 will arguably be next to have an extremely high ppi rating at the almost 10" mark, but imagine "Retina"-like numbers on screens over 20" or 25"...

(Mid-2012) 15.4" MacBook Pro w/ IPS Retina Display | Quad Core i7-3720QM 2.6GHz / 3.6GHz Max. Turbo | 16GB DDR3-1600MHz RAM | 256GB Samsung 830 SSD-based NAND Flash ETA 9/5

post #3 of 55
Yes, oh yes, 4k displays with resolution independence in the OS -- it is coming!!! My 16-megapixel photos from my D7000 would look great on a 4K display! WOW
post #4 of 55
2015 can't come soon enough.
post #5 of 55
Resolution is just one factor, but I'd really like to see this. Of course GPU technology has a bit of a way to go, and I wish Apple would consider full DisplayPort connectors rather than this Mini DisplayPort crap (better bandwidth, tighter connection). Thunderbolt is fully compatible with DisplayPort protocols anyway.
post #6 of 55
Quote:
Originally Posted by september11th View Post

2015 can't come soon enough.

You're probably close, as it doesn't look like 2012 is the year of Retina Macs. But maybe.
post #7 of 55
Quote:
Originally Posted by BUSHMAN4 View Post

You're probably close, as it doesn't look like 2012 is the year of Retina Macs. But maybe.

Not a chance. Until OSX has resolution independence, it would do more harm than good. As things are today, certain parts of the OS and software are already becoming microscopic on larger monitors.
post #8 of 55
Quote:
Originally Posted by 2oh1 View Post

Not a chance. Until OSX has resolution independence, it would do more harm than good. As things are today, certain parts of the OS and software are already becoming microscopic on larger monitors.

You can always double the resolution (like on the iPhone). Then that's not a problem...

But it's funny how the article keeps talking about 4k video. Like that's the use case! Most movies currently can't even afford to shoot in 4k! This is all about text rendering...
post #9 of 55
I'm so sick of Intel's BS.

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.
post #10 of 55
Quote:
Originally Posted by september11th View Post

2015 can't come soon enough.

I was gonna say, maybe 2013 or so?

I'll be in the market for a new computer then! I'm drooling over the possibilities

MacBook Air (2013/2014?)
1 TB SSD
13" Retina Display (4K)
Low voltage, QuadCore Processor
16 GB RAM
10 hour battery life (?!)

I wonder if this is realistic or just childish embellishing! Computer specs seem to have plateaued in recent years, but there seems to be lots of progress in low-voltage processors, SSDs, and high-resolution displays...
post #11 of 55
i'm "Imaging a wall of 4K videos!" right now. yeesh.
"Personally, I would like nothing more than to thoroughly proof each and every word of my articles before posting. But I can't."

appleinsider's mike campbell, august 15, 2013
post #12 of 55
Once monitors reach retina resolution, is there any point going any higher? Perhaps someone will invent an improvement to our eyes and we will be back at square one
post #13 of 55
Quote:
Originally Posted by mdriftmeyer View Post

I'm so sick of Intel's BS.

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.

Intel is to GPUs what Nvidia is to ARM chips: a whole lot of hot air, big promises long before the product ships, but consistently under-delivering compared to the competition.

Most likely Intel has its Ivy Bridge GPUs in the lab and they "rival discrete GPUs" from the current generation, i.e. about as fast as AMD Fusion. I predict that by the time they're released to the market they'll still be a generation behind in performance.

Intel makes terrific CPUs, but it should stop trying to do GPUs.
post #14 of 55
Quote:
Originally Posted by acslater017 View Post

I was gonna say, maybe 2013 or so?

I'll be in the market for a new computer then! I'm drooling over the possibilities

MacBook Air (2013/2014?)
1 TB SSD
13" Retina Display (4K)
Low voltage, QuadCore Processor
16 GB RAM
10 hour battery life (?!)

1 TB SSD: I think 256GB, or 512GB maximum.
13" Retina Display (4K): you wish.
Quad core processor: or maybe 8 cores?
16 GB RAM: 4GB, or 8GB max.
10 hour battery life: isn't Intel talking about 24-hour battery life with the Haswell architecture, which is due in 2013?

my way or the highway...

Macbook Pro i7 13" with intel SSD 320 series and 8GB RAM, iPhone 5, iPad 3 (Retina)
post #15 of 55
Quote:
Originally Posted by acslater017 View Post

I was gonna say, maybe 2013 or so?

I'll be in the market for a new computer then! I'm drooling over the possibilities

MacBook Air (2013/2014?)
1 TB SSD
13" Retina Display (4K)
Low voltage, QuadCore Processor (or maybe 8 cores?)
16 GB RAM
10 hour battery life (?!)

Quote:
Originally Posted by d-range View Post

Intel is to GPUs what Nvidia is to ARM chips: a whole lot of hot air, big promises long before the product ships, but consistently under-delivering compared to the competition.

Most likely Intel has its Ivy Bridge GPUs in the lab and they "rival discrete GPUs" from the current generation, i.e. about as fast as AMD Fusion. I predict that by the time they're released to the market they'll still be a generation behind in performance.

Intel makes terrific CPUs, but it should stop trying to do GPUs.

I think 256GB, or 512GB maximum. A 4k screen: you wish. 4GB, or 8GB max, of RAM. And isn't Intel talking about 24-hour battery life with the Haswell architecture, which is due in 2013?

my way or the highway...

Macbook Pro i7 13" with intel SSD 320 series and 8GB RAM, iPhone 5, iPad 3 (Retina)
post #16 of 55
Quote:
Originally Posted by ascii View Post

Once monitors reach retina resolution, is there any point going any higher? Perhaps someone will invent an improvement to our eyes and we will be back at square one

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.

Quote:
Originally Posted by mdriftmeyer View Post

I'm so sick of Intel's BS.

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.

Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #17 of 55
I can't speak to the hardware issues, but I have tried out HiDPI, which is available through Xcode 4.1. It seems Apple is pretty far along with developing this feature for OS X. It worked well for me, although people like John Siracusa say it still has flaws.

I am guessing HiDPI will be available in the OS X release after Lion.

Functionally, HiDPI should allow lower resolution on Retina screens without loss of viewing quality.

Nullius in verba -- "on the word of no one"
post #18 of 55
Stop with the quietly already.
Citing unnamed sources with limited but direct knowledge of the rumoured device - Comedy Insider (Feb 2014)
post #19 of 55
Quote:
Originally Posted by jragosta View Post

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.



Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.

Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?

Sorry, but AMD's APUs graphically now and in the future run circles around anything Intel will ever produce.
post #20 of 55
That would be awesome, and Apple does have a history of putting well-above-average displays in its products, but even with a high-end GPU for anything the IGP couldn't handle, you'd be running lots of games in interlaced mode, down-sampling full-screen 1080p, and so on. Plus, OS X still doesn't have full resolution independence. No, just because the IGP supports it doesn't mean Apple will follow suit with a display. AMD's Eyefinity can support huge resolutions too, after all.
post #21 of 55
You need to look at RED. It is not only affordable but available now!

Quote:
Originally Posted by foobar View Post

You can always double the resolution (like on the iPhone). Then that's not a problem...

But it's funny how the article keeps talking about 4k video. Like that's the use case! Most movies currently can't even afford to shoot in 4k! This is all about text rendering...
post #22 of 55
Quote:
Originally Posted by IronHeadSlim View Post

You need to look at RED. It is not only affordable but available now!

Affordable for anyone with an extra $100K. The camera body starts at $25K, but you can't shoot a single frame until you buy 50+ pricey accessories.

Life is too short to drink bad coffee.

post #23 of 55
Quote:
Originally Posted by mdriftmeyer View Post

I'm so sick of Intel's BS.

They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.

The first GPUs to pump out 4k won't come from Intel.

Aren't there already GPUs that can handle 4k? 4k has been available for computers for several years now; it was just a pricey proposition when the monitors cost $6,000 and required two dual-link DVI ports.
post #24 of 55
Why?

I mean, why would anyone want to tax the entire process with such outrageously over-the-top specs?

I just can't imagine, for example, that Hollywood would be at all pleased with having to deliver 4K versions of their movies. Meanwhile Apple, making a big push to promote downloading video over physical media, would have a hard time delivering these massive files to consumers. Even if they could put in place the equipment to send the files out, service providers are already giving consumers grief over how much bandwidth they're using, at least here in Canada.
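Some rough arithmetic behind that worry (our own assumed bitrates, purely illustrative; not iTunes figures):

# Back-of-the-envelope download sizes for a 2-hour movie.
hours = 2
mbps_1080p = 8                                 # assumed H.264 bitrate for 1080p
pixel_ratio = (4096 * 2304) / (1920 * 1080)    # ~4.55x the pixels of 1080p
for label, mbps in [("1080p", mbps_1080p), ("4K", mbps_1080p * pixel_ratio)]:
    gigabytes = mbps * 3600 * hours / 8 / 1000  # Mbit/s -> gigabytes
    print(f"{label}: ~{gigabytes:.0f} GB")
# -> 1080p: ~7 GB, 4K: ~33 GB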

So movies are simply not a use for such a display. Books? Even with ordinary resolution on the current iPad the transition from the printed page has kicked into high gear. Besides, going retina-like on the iPad would not require 4K resolutions considering a 10-inch display is likely the biggest a tablet should go. Any bigger and all the advantages to the form factor evaporate.

Quite simply, just because you could have more resolution does not mean you should.

By the way, while I don't doubt that the iPad 3 could have a display with higher resolution, a retina display is at best a long shot. There is simply no need for it to go there and hence, why bother. Fact is, as far as the tablet market is concerned, the average consumer's response would be, "You had me at iPad 1."
post #25 of 55
I wonder what the "wow factor" of the next iPad is going to be? I don't think the resolution + speed bump from the iPad 2 will be enough for me to pick one up, but I'm sure Apple has something that will make buying the next iPad irresistible. I really wonder what it will be?
post #26 of 55
The original quote was that most movies can't afford it. I assume he meant the large studios that use cameras priced in the $100,000s. RED makes professional equipment, but keep your eye on their next camera, called Scarlet. The large companies are following along with the 4k standard. Canon will announce something at the beginning of November.

4k is the new 1080p!

Quote:
Originally Posted by mstone View Post

Affordable for anyone with an extra $100K. The camera body starts at $25K, but you can't shoot a single frame until you buy 50+ pricey accessories.
post #27 of 55
Quote:
Originally Posted by acslater017 View Post

I'll be in the market for a new computer then! I'm drooling over the possibilities

MacBook Air (2013/2014?)
1 TB SSD
13" Retina Display (4K)
Low voltage, QuadCore Processor
16 GB RAM
10 hour battery life (?!)

I wonder if this is realistic or just childish embellishing!

Given the way the technology is implemented, Apple will double either the 1280x800 or 1440x900 resolution currently used on the 13" products, so we can expect a 13" Retina Display to be either 2560x1600 or 2880x1800. My guess is the former because yields will be better and even 2560x1600 will be stunning.
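A quick check of those numbers (our arithmetic, assuming the 13.3-inch diagonal of the current 13" panels):

import math

# PPI of the two doubled candidates on a 13.3" diagonal.
for w, h in [(2560, 1600), (2880, 1800)]:
    print(w, "x", h, "->", round(math.hypot(w, h) / 13.3), "PPI")
# -> 2560 x 1600 -> 227 PPI; 2880 x 1800 -> 255 PPI
# Either comfortably clears the ~172 PPI "retina" threshold at a
# typical 20-inch notebook viewing distance.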
Mac user since August 1983.
post #28 of 55
Once you go 4k you can't go back. Trust me, your eyes will see a difference. Movies in the theatre will be projected in 4k quicker than you think. There is a lot going on. I believe RED was the pioneer in making it the standard. No matter, though, because it will be the standard, and it will happen soon.
post #29 of 55
The 4k and OpenCL combo is gonna rock. Can't wait!
post #30 of 55
Quote:
Originally Posted by mdriftmeyer View Post

Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?

Sorry, but AMD's APUs graphically now and in the future run circles around anything Intel will ever produce.

No one ever said that they didn't.

Intel's i-series chips are the overwhelming market choice, and a lot of those are going into inexpensive systems that do not have a dedicated GPU. Intel is simply improving the iGPU on its chips. Why are you jumping all over them for improving their product?
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #31 of 55
I can see it now....tons of imacs with cracked screens being brought into the Apple store by people with bloody foreheads because the resolution was so good, they tried to climb into their computers. :-)
post #32 of 55
Great new screen! Thanks to making our screen over 300 ppi with 4K rez, we have had to raise the price to $5,000.

We would like you to buy a screen with each computer you buy from us, thank you.

____________________________________________________________

If you cannot tell, I think this is semi-worthless, due to the price it would cost to make a screen that large at that resolution.

Though I think it's a great thing for the future.

also, nvidia/amd discrete graphics= 1 (work well)

intel integrated graphics = 0 (does not work)

0 times .6 (60%) = 0. (yay?)

PC means personal computer.  

i have processing issues, mostly trying to get my ideas into speech and text.

if i say something confusing please tell me!

post #33 of 55
Quote:
Originally Posted by AppleInsider View Post





If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."



By that logic, an old SD TV could "count as a 'Retina Display'" if you sat far enough away.
post #34 of 55
Quote:
Originally Posted by IronHeadSlim View Post

4k is the new 1080p!

Don't forget that the 4k bit refers to the horizontal resolution, whereas 1080p refers to the vertical resolution... Technically we already have 2k (1920 x 1080) with Blu-ray...

Having said that, I was at a RED event at Pinewood Studios the other week, where we were treated to native 4k RED footage in 2D and 3D on a 4k projector - now that is sharp!!
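For reference, a small cheat sheet of those naming conventions (standard published figures; the snippet just prints them):

# Format names vs. actual pixel grids: "2k"/"4k" count horizontal
# pixels, while "1080p" counts vertical lines.
formats = {
    "1080p / '2k' (Blu-ray)": (1920, 1080),
    "DCI 2K (cinema)":        (2048, 1080),
    "QFHD / 4K UHD":          (3840, 2160),
    "DCI 4K (cinema)":        (4096, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w} x {h}")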
post #35 of 55
Do you really expect to see better performance from these chips driving 4K displays? An Intel chip trying to do that will fall flat on its face.

Quote:
Originally Posted by jragosta View Post

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.



Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.

Even their performance claims are misleading. The 60% figure comes from one benchmark; most of the rest of the testing is around 30%. So where does this leave us on 4K displays?

I'm not denying that Intel GPUs are good enough for some people, just as some drivers can get by with four-cylinder compacts. The only difference here is that computers have a wider array of uses than economical transportation, so coming up short on GPU performance is a wider issue.
post #36 of 55
I'll take a 30" monitor that does 4k! Bring it on!
post #37 of 55
Quote:
Originally Posted by Apple ][ View Post

I'll take a 30" monitor that does 4k! Bring it on!

I watched a few of the presentations from the last Developers Conference, using my free membership. It appears Apple is now emphasizing a 2X-type image increase over the older resolution-independence methods that allowed you to set a large variety of ratios. So it would work much like the iPhone: new programs could take advantage of the new tech, old programs would just pixel-double everything. I think Apple was having trouble getting some of the major players to adapt their programs for RI. The old RI system works pretty well in most of Apple's programs but not in most third-party apps.

Is this a .X update to Lion once the monitors are ready, or is it going to require a full system upgrade? That I do not know. I think the portables will go 2X long before the 27-inch screens.
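A minimal sketch of how that 2X mode maps logical points to physical pixels (our illustration of the idea, not Apple's API; the names are made up):

from dataclasses import dataclass

@dataclass
class DisplayMode:
    """Toy model of a HiDPI ("2X") mode: apps lay out in logical
    points, and the window server rasterizes at scale x the detail."""
    pixels_w: int
    pixels_h: int
    scale: int  # 1 for legacy mode, 2 for HiDPI pixel doubling

    @property
    def points(self):
        return (self.pixels_w // self.scale, self.pixels_h // self.scale)

# A hypothetical 4K-class panel in 2X mode: UI geometry behaves like a
# 2048 x 1152 screen while text and images draw with 4x the pixels.
mode = DisplayMode(4096, 2304, scale=2)
print(mode.points)  # (2048, 1152)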
post #38 of 55
Quote:
Originally Posted by wizard69 View Post

Do you really expect to see better performance from these chips driving 4K displays? An Intel chip trying to do that will fall flat on its face.

Yes, I'm sure you know more about the performance of Intel's next generation of chips than they do.

Quote:
Originally Posted by wizard69 View Post

Even their performance claims are misleading. The 60% figure comes from one benchmark; most of the rest of the testing is around 30%. So where does this leave us on 4K displays?

I'm not denying that Intel GPUs are good enough for some people, just as some drivers can get by with four-cylinder compacts. The only difference here is that computers have a wider array of uses than economical transportation, so coming up short on GPU performance is a wider issue.

That might be a valid argument - if Intel only sold one type of chip and if Intel chips were never used in systems with dedicated GPUs.

Intel, OTOH, has differentiated the market and offers some chips with integrated graphics for low end systems and high end chips for more demanding needs. It's just really hard to see how improving their low end chip is a negative - just because it hasn't become a high end chip.
"I'm way over my head when it comes to technical issues like this"
Gatorguy 5/31/13
post #39 of 55
Quote:
Originally Posted by jragosta View Post

Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.

True, many specs are used for marketing purposes, and are treated as more = better.

However, specifications are ultimately an engineering issue, where optimum = better.

Televisions with 120 Hz have a very specific purpose, which is beneficial to the viewer. Video is commonly available in formats that support 30, 60, and 24 frames per second. A common 60 Hz display has to play games (like 3:2 pulldown) in order to play 24fps video. While this is a very effective technique, it isn't quite correct. Using 120 Hz (24*5) allows 24fps video to be played as it was intended.

Most people are very accustomed to watching movies with 3:2 pulldown (used forever to transfer movies to broadcast TV, VHS, and DVD) and either don't notice or don't care. It's a feature that mostly videophiles care about. But when watching a 24fps Blu-ray movie, it will make the picture appear slightly more "film-like" for a more authentic theater experience at home.
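A small sketch of the cadence arithmetic (ours, purely illustrative):

import math

def cadence(film_fps, refresh_hz, n=4):
    """Refresh cycles spent on each of the first n film frames when a
    fixed-rate display shows film-rate content; an uneven pattern is
    the judder videophiles notice."""
    r = refresh_hz / film_fps
    return [math.floor((i + 1) * r) - math.floor(i * r) for i in range(n)]

print(cadence(24, 60))   # [2, 3, 2, 3] -- the uneven 3:2 pulldown cadence
print(cadence(24, 120))  # [5, 5, 5, 5] -- 120 is a multiple of 24: no judder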

There are 240Hz televisions. While I am not aware of any downsides (other than perhaps cost), I am also not aware of any benefits.

It's good to be skeptical about the specs in marketing, but they aren't always bogus, either. Research and evaluate for yourself.

On-Topic: As someone who spends 8 hours a day staring at PC screens, Retina-class displays can't come soon enough... Sadly, I know I won't see one at work for many years...
post #40 of 55
I'd hope to see the implementation of a popular HD disc-based format before seeing super-high-res screens.
Mega-res vids take up mega room. I don't see why Apple is so against Blu-ray.

Just allow us to connect a USB (or Thunderbolt) BD-ROM drive and buy a bit of software. All you have to do is update OS X to allow it to happen. It's not like Macs don't have the processing power.

Come to think of it, why is there no Blu-ray player software from 3rd parties?