Intel's Ivy Bridge support for 4K resolution could pave way for "Retina" Macs

Comments

  • Reply 21 of 54
    mstonemstone Posts: 11,510member
    Quote:
    Originally Posted by IronHeadSlim View Post


    You need to look at RED. It is not only affordable but available now!



    Affordable for anyone with an extra $100K. The camera body starts at 25K but you can't shoot a single frame until you buy 50+ pricey accessories.
  • Reply 22 of 54
    jeffdmjeffdm Posts: 12,951member
    Quote:
    Originally Posted by mdriftmeyer View Post


    I'm so sick of Intel's BS.



    They are leagues behind AMD and Nvidia in GPGPU performance for accelerated OpenGL environments and OpenCL scalability.



    The first GPUs to pump out 4k won't come from Intel.



    Aren't there already GPUs that can handle 4k? 4k has been available for computers for several years now; it was just a pricey proposition when the monitors cost $6,000 and required two dual-link DVI ports.
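
    For a rough sense of why those early 4K monitors needed two dual-link DVI connections, here is a back-of-the-envelope sketch in Python. It assumes the 165 MHz per-link TMDS pixel clock and ignores blanking overhead, so real modes need even more:

        # Rough sanity check (a sketch, not exact video timings): compare the
        # active pixel rate of a few display modes against the ~330 Mpixel/s
        # ceiling of a dual-link DVI connection (2 TMDS links x 165 MHz).
        DUAL_LINK_DVI_MPIX_S = 2 * 165  # two 165 MHz links, in megapixels per second

        modes = [
            ("2560x1600 @ 60 Hz", 2560, 1600, 60),
            ("3840x2160 @ 60 Hz", 3840, 2160, 60),
            ("4096x2304 @ 60 Hz", 4096, 2304, 60),
        ]

        for name, w, h, hz in modes:
            mpix_s = w * h * hz / 1e6
            enough = mpix_s <= DUAL_LINK_DVI_MPIX_S
            print(f"{name}: {mpix_s:.0f} Mpix/s active -> "
                  f"{'fits one dual-link DVI' if enough else 'needs more than one dual-link DVI'}")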
  • Reply 23 of 54
    Why?



    I mean, why would anyone want to tax the entire process with such outrageously over-the-top specs?



    I just can't imagine, for example, that Hollywood would be at all pleased with having to deliver 4K versions of their movies. Meanwhile Apple, which is making a big push to promote downloaded video over physical media, would have a hard time delivering these massive files to consumers. Even if they could put the equipment in place to send the files out, service providers are already giving consumers grief over how much bandwidth they're using, at least here in Canada.
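
    To put rough numbers on the bandwidth worry, here is a hedged back-of-the-envelope calculation in Python; the bitrates are assumptions for illustration only, since roughly four times the pixels implies several times the bitrate at comparable quality:

        # Illustrative only: download size of a two-hour film at a few assumed
        # average bitrates. The bitrates are hypothetical, not anyone's actual
        # encoding targets.
        def download_gb(bitrate_mbps, hours=2):
            """Total size in gigabytes for a stream at the given average bitrate."""
            return bitrate_mbps * 1e6 * hours * 3600 / 8 / 1e9

        for label, mbps in [("1080p-class", 8), ("4K at ~4x the pixels", 32)]:
            print(f"{label} ({mbps} Mbit/s average): ~{download_gb(mbps):.0f} GB per movie")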



    So movies are simply not a use for such a display. Books? Even with ordinary resolution on the current iPad, the transition from the printed page has kicked into high gear. Besides, going Retina-like on the iPad would not require 4K resolution, considering a 10-inch display is likely the biggest a tablet should go. Any bigger and all the advantages of the form factor evaporate.



    Quite simply, just because you could have more resolution does not mean you should.



    By the way, while I don't doubt that the iPad 3 could have a display with higher resolution, a Retina display is at best a long shot. There is simply no need for it to go there, so why bother? Fact is, as far as the tablet market is concerned, the average consumer's response would be, "You had me at iPad 1."
  • Reply 24 of 54
    I wonder what the "wow factor" of the next iPad is going to be. I don't think the resolution and speed bump from the iPad 2 will be enough for me to pick one up, but I'm sure Apple has something that will simply make buying the next iPad irresistible. I really wonder what it will be.
  • Reply 25 of 54
    The original quote was that most movies can't afford it. I assume he meant the large studios that use cameras priced in the $100,000s. RED makes professional equipment, but keep an eye on their next camera, the Scarlet. The large companies are following along with the 4k standard; Canon will announce something at the beginning of November.



    4k is the new 1080p!



    Quote:
    Originally Posted by mstone View Post


    Affordable for anyone with an extra $100K. The camera body starts at 25K but you can't shoot a single frame until you buy 50+ pricey accessories.



  • Reply 26 of 54
    mcarlingmcarling Posts: 1,106member
    Quote:
    Originally Posted by acslater017 View Post


    I'll be in the market for a new computer then! I'm drooling over the possibilities



    MacBook Air (2013/2014?)

    1 TB SSD

    13" Retina Display (4K)

    Low voltage, QuadCore Processor

    16 GB RAM

    10 hour battery life (?!)



    I wonder if this is realistic or just childish embellishing!



    Given the way the technology is implemented, Apple will double either the 1280x800 or 1440x900 resolution currently used on the 13" products, so we can expect a 13" Retina Display to be either 2560x1600 or 2880x1800. My guess is the former because yields will be better and even 2560x1600 will be stunning.
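
    As a quick sketch of that doubling argument in Python (the 13.3-inch diagonal is an assumption for illustration):

        # Double the current 13" logical resolutions and estimate pixel density.
        import math

        DIAGONAL_IN = 13.3  # assumed panel diagonal in inches

        for w, h in [(1280, 800), (1440, 900)]:
            w2, h2 = 2 * w, 2 * h
            ppi = math.hypot(w2, h2) / DIAGONAL_IN
            print(f"{w}x{h} doubled -> {w2}x{h2}, about {ppi:.0f} ppi on a {DIAGONAL_IN}-inch panel")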
  • Reply 27 of 54
    Once you go 4k you can't go back. Trust me, your eyes will see a difference. Movies in the theatre will be projected in 4k sooner than you think. There is a lot going on. I believe RED was the pioneer in making it the standard. No matter, though, because it will be the standard, and it will happen soon.
  • Reply 28 of 54
    The 4k and OpenCL combo is gonna rock. Can't wait!
  • Reply 29 of 54
    jragostajragosta Posts: 10,473member
    Quote:
    Originally Posted by mdriftmeyer View Post


    Did I touch a nerve? Wanna whip out our degrees and discuss the difference between integrated GPUs by AMD and the junk by Intel?



    Sorry, but AMD's APUs, now and in the future, graphically run circles around anything Intel will ever produce.



    No one ever said that they didn't.



    Intel's iSeries chips are the overwhelming market choice, and a lot of those are going into inexpensive systems that do not have a dedicated GPU. Intel is simply improving the iGPU on their chips. Why are you jumping all over them for improving their product?
  • Reply 30 of 54
    reganregan Posts: 474member
    I can see it now....tons of iMacs with cracked screens being brought into the Apple Store by people with bloody foreheads, because the resolution was so good they tried to climb into their computers. :-)
  • Reply 31 of 54
    Our great new screen: thanks to making our screen have over 300 ppi and 4K rez, we have had to raise the price to $5,000.



    We would like you to buy a screen with each computer you buy from us, thank you.



    ____________________________________________________________________________



    If you cannot tell, I think this is semi-worthless, due to the price it would cost to make a screen that large at that resolution.



    Though I think it's a great thing for the future.



    Also: Nvidia/AMD discrete graphics = 1 (works well)



    Intel integrated graphics = 0 (does not work)



    0 times .6 (60%) = 0. (yay?)
  • Reply 32 of 54
    Quote:
    Originally Posted by AppleInsider View Post


    If Apple were to introduce a 4K resolution display with the 16:9 ratio currently used in its Thunderbolt Display, iMac and MacBook Air products, the resulting resolution would be 4096 x 2304. A 27-inch display with 4K resolution would sport a pixel density of 174 pixels per inch. Assuming a working distance of 24 inches and 20/20 vision for the calculations, a 4K 27-inch iMac or Thunderbolt display would count as a "Retina Display."



    By that logic, an old SD TV could "count as a 'Retina Display'" if you sat far enough away.
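
    For the curious, here is the distance math both the article and this reply lean on, as a small Python sketch. The 1-arcminute threshold is the usual 20/20 assumption, and the SD TV geometry is hypothetical:

        # A display "counts as Retina" once you sit far enough back that one
        # pixel subtends less than about one arcminute for 20/20 vision.
        import math

        ARCMIN = math.radians(1 / 60)

        def ppi_of(width_px, height_px, diagonal_in):
            return math.hypot(width_px, height_px) / diagonal_in

        def retina_distance_in(ppi):
            """Minimum viewing distance (inches) at which 1 pixel <= 1 arcminute."""
            return (1 / ppi) / math.tan(ARCMIN)

        displays = [
            ('27" 4K (4096x2304)', ppi_of(4096, 2304, 27)),
            ('hypothetical 27" SD TV (640x480)', ppi_of(640, 480, 27)),
        ]

        for name, ppi in displays:
            print(f"{name}: {ppi:.0f} ppi, 'Retina' beyond ~{retina_distance_in(ppi):.0f} inches")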
  • Reply 33 of 54
    Quote:
    Originally Posted by IronHeadSlim View Post


    4k is the new 1080p!



    Don't forget that the 4k bit relates to the horizontal resolution, whereas 1080p refers to the vertical resolution ... Technically we already have 2k (1920 x 1080) with Blu-ray ...
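
    A quick Python comparison of total pixel counts makes the naming difference concrete (using the DCI 4K container size of 4096 x 2160):

        # "4K" counts horizontal pixels, "1080p" counts vertical lines;
        # comparing total pixels shows the actual size of the jump.
        formats = {
            "1080p / '2K' (1920x1080)": (1920, 1080),
            "DCI 4K (4096x2160)": (4096, 2160),
        }

        base = 1920 * 1080
        for name, (w, h) in formats.items():
            print(f"{name}: {w * h / 1e6:.2f} Mpixels, {w * h / base:.1f}x the pixels of 1080p")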



    Having said that, I was at a RED event at Pinewood film studios the other week where we were treated to native 4k RED footage in 2D and 3D on a 4k projector. Now that is sharp!
  • Reply 34 of 54
    wizard69wizard69 Posts: 13,377member
    Really, do you expect to actually see better performance from these chips driving 4K displays? An Intel chip trying to do that will fall flat on its face.



    Quote:
    Originally Posted by jragosta View Post


    Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.







    Intel never claimed that they'd have the first 4K GPUs. Nor did they claim to be ahead of anyone. What they claimed is that the integrated GPU on future chips would have greater performance than the current generation - which is true. You apparently don't understand the different applications for an integrated GPU vs a dedicated one.



    Even their performance claims are misleading. The 60% figure comes from one benchmark; most of the balance of the testing is around 30%. So where does this leave us on 4K displays?



    I'm not disputing that Intel GPUs are good enough for some, but some people can also get by with four-cylinder compacts. The only difference here is that computers have a wider array of uses than economical transportation, so coming up short on GPU performance is a wider issue.
  • Reply 35 of 54
    I'll take a 30" monitor that does 4k! Bring it on!
  • Reply 36 of 54
    Quote:
    Originally Posted by Apple ][ View Post


    I'll take a 30" monitor that does 4k! Bring it on!



    I watched a few of the presentations from the last Developers Conference, using my free membership. It appears Apple is now emphasizing a 2X-type image increase over the older resolution-independence methods that allowed you to set a large variety of ratios. So it would work much like the iPhone: new programs could take advantage of the new tech, and old programs would just pixel-double everything. I think Apple was having trouble getting some of the major players to adapt their programs for RI. The old RI system works pretty well in most of Apple's programs but not in most third-party apps.
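
    Here is a conceptual sketch of why the clean 2X route is simpler than the old resolution-independence ratios (Python, not Apple's actual API; the 1.4 ratio is just an arbitrary stand-in):

        # With an integer scale factor, every point boundary lands exactly on a
        # pixel boundary; arbitrary ratios produce fractional pixels that need
        # rounding or filtering.
        def point_rect_to_pixels(x, y, w, h, scale):
            """Map a rectangle in logical points to device pixels at a given scale."""
            return (x * scale, y * scale, w * scale, h * scale)

        for scale in (1.0, 2.0, 1.4):
            px = point_rect_to_pixels(10, 10, 101, 23, scale)
            aligned = all(v == int(v) for v in px)
            print(f"scale {scale}: {px} -> "
                  f"{'pixel-aligned' if aligned else 'fractional pixels (rounding/blur)'}")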



    Whether this will be a .X update to Lion once the monitors are ready, or something that requires a full system upgrade, I do not know. I think the portables will go 2X long before the 27-inch screens.
  • Reply 37 of 54
    jragostajragosta Posts: 10,473member
    Quote:
    Originally Posted by wizard69 View Post


    Really, do you expect to actually see better performance from these chips driving 4K displays? An Intel chip trying to do that will fall flat on its face.



    Yes, I'm sure you know more about the performance of Intel's next generation of chips than they do.



    Quote:
    Originally Posted by wizard69 View Post


    Even their performance claims are misleading. The 60% figure comes from one benchmark; most of the balance of the testing is around 30%. So where does this leave us on 4K displays?



    I'm not disputing that Intel GPUs are good enough for some, but some people can also get by with four-cylinder compacts. The only difference here is that computers have a wider array of uses than economical transportation, so coming up short on GPU performance is a wider issue.



    That might be a valid argument - if Intel only sold one type of chip and if Intel chips were never used in systems with dedicated GPUs.



    Intel, OTOH, has differentiated the market and offers some chips with integrated graphics for low end systems and high end chips for more demanding needs. It's just really hard to see how improving their low end chip is a negative - just because it hasn't become a high end chip.
  • Reply 38 of 54
    Quote:
    Originally Posted by jragosta View Post


    Not at all. People are always impressed by specs. Look at the TVs claiming 120 Hz or even higher refresh rates - irrelevant for viewers.



    True, many specs are used for marketing purposes, and are treated as more = better.



    However, specifications are ultimately an engineering issue, where optimum = better.



    Televisions with 120 Hz have a very specific purpose, which is beneficial to the viewer. Video is commonly available in formats that support 30, 60, and 24 frames per second. A common 60 Hz display has to play games (like 3:2 pulldown) in order to play 24fps video. While this is a very effective technique, it isn't quite correct. Using 120 Hz (24*5) allows 24fps video to be played as it was intended.
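
    A small Python sketch of that cadence point, showing how many panel refreshes each 24fps film frame gets at 60 Hz versus 120 Hz:

        # 3:2 pulldown: at 60 Hz, film frames alternate between 3 and 2 refreshes,
        # so their on-screen durations are uneven; at 120 Hz each frame gets
        # exactly 5 refreshes and the cadence stays even.
        def refreshes_per_frame(refresh_hz, film_fps=24, frames=6):
            """Refreshes occupied by each film frame (half-up rounding of the running total)."""
            counts, shown = [], 0
            for i in range(1, frames + 1):
                target = int(i * refresh_hz / film_fps + 0.5)
                counts.append(target - shown)
                shown = target
            return counts

        print("60 Hz :", refreshes_per_frame(60))    # [3, 2, 3, 2, 3, 2]
        print("120 Hz:", refreshes_per_frame(120))   # [5, 5, 5, 5, 5, 5]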



    Most people are very accustomed to watching movies with 3:2 pulldown (used since forever ago to transfer movies to broadcast TV, VHS, and DVD), and either don't notice or don't care. It's a feature that mostly videophiles care about. But, when watching a 24fps Blu-Ray movie, it will make it appear slightly more "film-like" for a more authentic theater experience at home.



    There are 240Hz televisions. While I am not aware of any downsides (other than perhaps cost), I am also not aware of any benefits.



    It's good to be skeptical about the specs in marketing, but they aren't always bogus, either. Research and evaluate for yourself.



    On-Topic: As someone who spends 8 hours a day staring at PC screens, I think Retina-class displays can't come soon enough... Sadly, I know I won't see one at work for many years...
  • Reply 39 of 54
    I'd hope to see the implementation of a popular HD disc-based format before seeing super-high-res screens.

    Mega res vids take up mega room. I don't see why Apple are so against Blu-ray.



    Just allow us to connect a USB (or Thunderbolt) BD-ROM drive and buy a bit of software. All you have to do is update OS X to allow it to happen. It's not like the Macs don't have the processing power.



    Come to think of it, why is there no Blu-ray player software from 3rd parties?
  • Reply 40 of 54
    mcarlingmcarling Posts: 1,106member
    Quote:
    Originally Posted by Evilution View Post


    I'd hope to see the implementation of a popular HD disc-based format before seeing super-high-res screens.



    Don't hold your breath.



    Quote:
    Originally Posted by Evilution View Post


    Mega res vids take up mega room. I don't see why Apple are so against Blu-ray.



    It's simple. Apple make a lot of money from iTunes and they will make far more money from iCloud.