Rumor: Apple could launch new 27-inch iMac with 5K Retina display this fall


  • Reply 61 of 104
    melgross Posts: 33,510, member
    Sure does. It proves that Apple couldn’t care less what panels are available when they want to make a product. They’ll make whatever they want.

    The 17” PowerBook, then. Or the retina MacBook Pro. The 30” Cinema Display. Whatever you like; when they set their mind on a panel they’ll get it.

    Except that all of those panels were available. Apple always uses panels that are available. If they used a custom 5K panel for this, the monitor would easily cost over $10,000, because they wouldn't sell that many of them. It's why Dell uses two panels, and the monitor still costs $2,500. And Dell isn't known for making expensive monitors; just the opposite. So how much would an Apple version cost? $3,500? More?
  • Reply 62 of 104
    Quote:

    Originally Posted by digitalclips View Post

    Why not get a new Mac Pro?

    It has more GPU power than I need. I really just need one GPU to drive the display. Nothing I do takes advantage of GPGPU stuff. 

  • Reply 63 of 104
    melgross Posts: 33,510, member
    boltsfan17 wrote: »
    I'm not joking. I want a higher resolution monitor. It doesn't necessarily need to be 5k though. 

    Sure, real 4K (4096 wide) would be great. So far, though, 4K monitors and many TVs are really 38xx (3840 wide). Apple might be waiting for the panels to drop in price, as they often do, or for a proper CPU/GPU upgrade to drive them properly. No point in having a high-res screen that runs slow.
  • Reply 64 of 104
    melgross Posts: 33,510, member
    ibeam wrote: »
    That would make sense. Up until recently you could not see realtime filters applied so rendering and watching on a separate TV was the way it was done. With realtime filters and effects it would definitely be nice to see the full resolution on the editing screen, especially in After Effects or when doing color and luminance adjustments.

    Except what does this resolution match up to? 4K is the new standard; 5K is not. There is experimental work being done on 6K, which may not see the light of day, and on 8K, the purpose of which is debatable even for full-size theaters. I personally prefer 16:10 monitors, but that's just me. I would rather use another, smaller monitor for the editor than have a 5K display for both.
  • Reply 65 of 104
    melgross Posts: 33,510, member
    Display be built in

    Are you talking about an iMac? You know that the Dell monitor costs $2,500, and that's not as much as an Apple monitor would cost. It's also the cost of a high end iMac. How much would people pay for a 5k iMac? $3,500? $4,000? $4,500?

    How about you?
  • Reply 66 of 104
    I'd much rather see Apple move to the head of the display tech industry instead of always following so far behind. Ultra-wide, IMO, is much more impressive and versatile than 4k. Look at the LG 34UM95, now THAT is an impressive monitor. If Apple could bring us a 4K Ultra-wide??? Now that would make Apple a leader!
  • Reply 67 of 104
    DisplayPort 1.3 is due to be finalized before fall and supports up to 8K displays.
  • Reply 68 of 104
    Originally Posted by Ted13 View Post

    If they come out with a 5K Apple Cinema Display, then they have to start worrying about Thunderbolt bandwidth

    Question: Thunderbolt 2 doesn’t? I don’t remember offhand about Thunderbolt 3, but then again Intel hasn’t published much on that.

    Originally Posted by melgross View Post

    Except that all of those panels were available.

    How? The 30” Cinema Display was the first of its kind. The retina MacBook Pro was the first of its kind. I don’t remember about the panel in the first PowerBook, but the 1920x1200 17” wasn’t used anywhere.

    So how much would an Apple version cost? $3,500? More?

    The 30” Cinema Display debuted at $3,299, remember. Your guess sounds about right.

    Originally Posted by mdriftmeyer View Post

    DisplayPort 1.3 is due to be finalized before fall and supports up to 8K displays.

    See, there you go. Problem solved.

  • Reply 69 of 104
    Quote:

    Originally Posted by Apple ][ View Post

    Is yours the curved one?

    Besides the extra wide space for additional workspace, it must be nice to see a real movie on a 21:9 display.

    16:9 is basically TV format, and virtually all real movies ever made are 21:9 or even wider.

    No, it's dead flat but it doesn't seem to be an issue.

    21:9 TVs have died a death, so it's a bit of a gimmick, and I still haven't watched a film that fills the whole screen. But the ratio makes a lot more sense on a computer: you're essentially putting two 20" 4:3 monitors side by side on one screen, so it works well for multiple windows.

  • Reply 70 of 104
    marvfox Posts: 2,275, member

    Forget the Mac mini; I think that's a dead item now.

  • Reply 71 of 104
    Quote:

    Originally Posted by melgross View Post

    That's not a good reason. The monitor is too wide, too many pixels. I don't know of anyone who would put screen elements on the sides of the video. I've done a lot of commercial video editing over the years. I would rather go to another, smaller monitor for that purpose.

    I understand what you're saying, but I don't understand WHY you would want to work that way.

    Why would you want to have tools on one screen and video on another? There's less visual hunting involved if they're both on the same screen.

  • Reply 72 of 104
    ascii Posts: 5,936, member
    Quote:

    Originally Posted by marvfox View Post

    Forget the Mac mini; I think that's a dead item now.

    I think it's just waiting for the Broadwell chips. There was a leak on the Apple website a few months ago when information about a mid-2014 Mac mini appeared and then disappeared. They must have been planning to release it but the Intel delays stopped them.

  • Reply 73 of 104
    Quote:

    Originally Posted by GadgetCanadaV2 View Post

    Quote:
    Originally Posted by Tallest Skil View Post

    So just keep using the display after the hardware is obsolete.

    That's a good point. I keep forgetting you can do that with the iMac.

    Forget ye not.

  • Reply 74 of 104
    Quote:

    Originally Posted by melgross View Post

    Quote:

    Originally Posted by brlawyer View Post

    Now that's news worth following; not that butt-**** ugly AppleWatch.

    Quote:

    Originally Posted by brlawyer View Post

    Worry not; now that SJ is gone, "clumsy" is the new black for Apple.

    Disagree totally on both points.

    Okay; "cackhanded" is the new white.

  • Reply 75 of 104

    Why doesn't the Apple Watch have 5K?

    Get to it, Apple.

  • Reply 76 of 104
    Marvin Posts: 15,322, moderator
    ascii wrote: »
    I think it's just waiting for the Broadwell chips. There was a leak on the Apple website a few months ago when information about a mid-2014 Mac mini appeared and then disappeared. They must have been planning to release it but the Intel delays stopped them.

    It mentioned a 27" iMac too, though. I'm not sure when the cutoff point is for describing a model as "mid". October seems to be "late", so it does seem as though there was an unexpected delay, but it might not have been Intel: they went with the Haswell refresh for the Air and MBP. I think the Retina display in the iMac will cause problems, as it's the highest-resolution display they've ever produced and has to be laminated to the glass with an anti-glare coating. Getting hardware and software to support higher than 4K at 60Hz could come with some difficulty. The iMac doesn't have Thunderbolt 2 yet, so that will have to be added in order to support Thunderbolt 2 displays with the same resolution.

    Dell has launched some 5K displays due by Christmas (5120×2880):

    http://www.extremetech.com/computing/189342-dell-unveils-5k-desktop-monitor-with-almost-2x-the-pixels-of-your-puny-4k-display

    What that article says is that some display manufacturers have been putting two separate panels together like tiles and using dual DisplayPort 1.2 inputs to drive them. It's possible for Apple to use two panel tiles together and drive both with a single Thunderbolt 2 input, but they might try to get a single panel tile to work, which would be more difficult to support in hardware and software; then again, they control all of that, unlike Dell.

    The Mac mini will be dragged along behind the iMac whenever that update is ready, because they sell the Thunderbolt displays for use with the mini too. It'll need Thunderbolt 2 and a GPU capable of driving it. I don't think even Iris Pro is capable of that; it seems to top out at 3840x2160 at 60Hz. They should be able to drive 5K with 1GB of video memory, so that's fine, but if they need something higher than Iris Pro, they'll either scrap the entry mini or say the entry one isn't capable of driving the Thunderbolt display in Retina mode and just run it at half-res, which gives an upsell to the quad-i7 models. They might be able to get even the entry models to drive that resolution; it would just struggle with heavy amounts of content. It would be a good idea to get all models to support Retina displays. The MacBook Air is their best-selling model, and if it goes passively cooled and Retina, adding a 5K Retina display is a lot to support on such low power.
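    As a sanity check on the video-memory point above, the arithmetic is easy to run. This is a rough sketch only: 4 bytes per pixel and triple buffering are illustrative assumptions, not Apple's actual driver configuration.

    ```python
    # Can a GPU with 1 GB of VRAM hold 5K framebuffers? Back-of-envelope check.

    def framebuffer_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
        """Size of one uncompressed framebuffer in bytes."""
        return width * height * bytes_per_pixel

    one_5k = framebuffer_bytes(5120, 2880)   # 58,982,400 bytes = 56.25 MiB
    triple = 3 * one_5k                      # 168.75 MiB with triple buffering

    print(one_5k / 2**20)   # MiB per buffer -> 56.25
    print(triple / 2**20)   # MiB for three buffers -> 168.75, well under 1024 MiB
    ```

    So raw framebuffer storage is not the constraint at 5K; fill rate and compositing load are what would make a low-end GPU struggle.
    
    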

    What I'm hoping for from an October event is updates to the iPad line, of course (A8 etc.; tactile touch would be great, but it could just be minor, e.g. a barometer added and iOS 8, maybe a design shift to match the iPhone with curved glass), but also Retina across the whole Mac line: a Retina iMac (either 4K or 5K) with Thunderbolt 2, and a mini also with TB2. A new Mac Pro with up to 14 cores, DDR4 memory up to 128GB (likely not offered by Apple, though), and new Radeon GPUs with double the double-precision performance and 16GB of video memory. An all-new passively cooled Retina MacBook Air announced, though it may not ship until later because that's the only one with Broadwell Core M parts. Other Broadwell parts for the rest of the lineup will come later in 2015.
  • Reply 77 of 104
    Quote:

    Originally Posted by Tallest Skil View Post

    Question: Thunderbolt 2 doesn’t? I don’t remember offhand about Thunderbolt 3, but then again Intel hasn’t published much on that.

    No, Thunderbolt 2 does not. It can support up to 4K, not 5K.

  • Reply 78 of 104
    Marvin Posts: 15,322, moderator
    ted13 wrote: »
    Question: Thunderbolt 2 doesn’t? I don’t remember offhand about Thunderbolt 3, but then again Intel hasn’t published much on that.
    No, Thunderbolt 2 does not. It can support up to 4K, not 5K.

    It might have enough bandwidth for 5K, but it's pushing it to the limit.

    5120 × 2880 × 24 bpp × 60 Hz ≈ 21.2 Gbps

    Thunderbolt 2 has ~20Gbps. The DisplayPort 1.3 standard supports up to 8K. Some Macs had a passthrough mode on the port to support higher resolutions, but this means the port can't be used for anything else.

    Some parts of the display industry want to push forward to 8K (7680x4320 = 48Gbps).
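    The arithmetic above can be checked with a quick script. Note this computes the raw pixel data rate only; real DisplayPort links add blanking intervals and line-encoding overhead, which this ignores, so actual requirements are somewhat higher.

    ```python
    def raw_bandwidth_gbps(width: int, height: int, bpp: int = 24, hz: int = 60) -> float:
        """Uncompressed pixel data rate in Gbps (no blanking/encoding overhead)."""
        return width * height * bpp * hz / 1e9

    TB2_GBPS = 20.0  # approximate Thunderbolt 2 bandwidth

    for name, (w, h) in {"4K UHD": (3840, 2160),
                         "5K":     (5120, 2880),
                         "8K":     (7680, 4320)}.items():
        gbps = raw_bandwidth_gbps(w, h)
        verdict = "fits" if gbps <= TB2_GBPS else "exceeds"
        print(f"{name}: {gbps:.1f} Gbps -> {verdict} ~{TB2_GBPS:.0f} Gbps")
    ```

    This matches the thread's numbers: 4K UHD needs ~11.9 Gbps, 5K ~21.2 Gbps (just over the limit), and 8K ~47.8 Gbps.
    
    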

    The problem is sending uncompressed data over the wire. It can be compressed, much as Apple does HDMI over Lightning. There's a standard coming for this:

    http://www.pcworld.com/article/2146462/prepare-for-fake-4k-new-standard-will-silently-compress-pc-tablet-video.html
    http://www.vesa.org/news/vesa-and-mipi-alliance-announce-the-adoption-of-vesas-new-display-stream-compression-standard/

    It promises visually lossless (but not mathematically lossless) compression that saves up to 66% of the data. The ratio will depend on what's being displayed. It can't use inter-frame compression, as that would introduce too much lag; it has to compress each individual frame, which limits how much can be saved. They'd have to make sure the worst-case scenario could still fit over the available hardware, but they can ensure that by increasing the visual loss. A 66% saving, though, would allow 8K to go over a 20Gbps connection.
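    Taking the quoted best-case 66% saving at face value (it's an upper bound, not a guaranteed ratio), the 8K claim works out like this:

    ```python
    def compressed_gbps(width: int, height: int, saving: float = 0.66,
                        bpp: int = 24, hz: int = 60) -> float:
        """Data rate after stream compression at the given best-case saving ratio."""
        raw = width * height * bpp * hz / 1e9
        return raw * (1 - saving)

    # 8K at 60Hz: ~47.8 Gbps raw, ~16.2 Gbps after a 66% saving,
    # which would indeed fit inside Thunderbolt 2's ~20 Gbps.
    print(compressed_gbps(7680, 4320))
    ```
    
    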
  • Reply 79 of 104
    ascii Posts: 5,936, member
    Quote:

    Originally Posted by Marvin View Post

    Dell has launched some 5K displays due by Christmas (5120×2880):

    http://www.extremetech.com/computing/189342-dell-unveils-5k-desktop-monitor-with-almost-2x-the-pixels-of-your-puny-4k-display

    What that article says is that some display manufacturers have been putting two separate panels together like tiles and using dual Displayport 1.2 inputs to drive them. It's possible for Apple to use two panel tiles together and drive both with a single Thunderbolt 2 input but they might try to get a single panel tile to work, which would be more difficult to support in hardware and software but they control all that anyway, unlike Dell.

    I don't have any problem with using 2 tiles myself, as long as the driver hides that fact from any higher layers. What we see in Windows with dual tiles is that they appear to the system as two monitors, which I think is the wrong approach. But then how is the OS to know that it isn't two monitors, e.g. that you haven't daisy-chained one Thunderbolt display to the back of another? Perhaps that's why Apple only supports specific 4K displays with the new Mac Pro: it has to look at the vendor/model ID string in the driver in order to know whether it is two tiles on one display or two genuinely separate monitors. But that wouldn't be an issue with a 5K iMac anyway; they could easily recognise their own hardware.
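    The grouping problem described above can be sketched in a few lines. This is purely hypothetical: the field names, the sample vendor/model strings, and the matching rule (tiles of one panel sharing a serial number) are illustrative assumptions, not how actual EDID/DisplayID tile parsing works.

    ```python
    # Toy sketch: the OS sees N incoming display streams and must decide which
    # ones are tiles of a single physical panel vs. genuinely separate monitors.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Stream:
        vendor: str
        model: str
        serial: str   # assumption: tiles of one physical panel share a serial
        tile: int     # tile index within the panel, 0 if untiled

    def group_tiles(streams):
        """Group streams into logical displays by (vendor, model, serial)."""
        displays = {}
        for s in streams:
            displays.setdefault((s.vendor, s.model, s.serial), []).append(s)
        return list(displays.values())

    # Two tiles of one 5K panel plus one daisy-chained separate monitor:
    streams = [Stream("DEL", "UP2715K", "S1", 0),
               Stream("DEL", "UP2715K", "S1", 1),
               Stream("APP", "TBDisp", "S9", 0)]
    print(len(group_tiles(streams)))  # 2 logical displays, not 3
    ```

    The point is the one ascii makes: without some identifying metadata in the stream, two tiles and two daisy-chained monitors look identical to the OS.
    
    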

  • Reply 80 of 104
    Marvin Posts: 15,322, moderator
    Marvin wrote: »
    It can't use inter-frame compression as this would introduce too much lag, it has to be each individual frame, which limits how much they can save.

    Thinking more about this, they could just send the difference between successive frames. Say they send a desktop image over: the display buffers the frame, and the OS takes the difference between the frame it knows the display has buffered and the new one, compresses that difference, and sends it over. If an image doesn't change more than half the pixels between frames, they can save at least 50% of the bandwidth. Instantaneous large changes in the display would have to flush most of the buffer, but those would be rare. Typing text would use almost no bandwidth at all. They could store the difference buffers for more than one frame, too: if you type for 10 minutes, it can use a single reference frame buffer for the entire 10 minutes. Scrolling pages kind of messes it up, but the pixel differences should still be minimal.
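    The frame-differencing idea above, in toy form (illustrative only; real link compression such as VESA's DSC works per-frame and quite differently):

    ```python
    # Instead of sending every pixel each refresh, send only (index, new_value)
    # pairs for pixels that changed since the frame the display already holds.

    def frame_delta(prev, curr):
        """Pixels that differ between two frames, as (index, new_value) pairs."""
        return [(i, c) for i, (p, c) in enumerate(zip(prev, curr)) if p != c]

    def apply_delta(buffer, delta):
        """Update the display's buffered frame in place with a received delta."""
        for i, value in delta:
            buffer[i] = value
        return buffer

    # A 12-"pixel" frame where typing changed just two pixels:
    prev = [0] * 12
    curr = list(prev)
    curr[3], curr[4] = 255, 255

    delta = frame_delta(prev, curr)
    print(len(delta), "of", len(prev), "pixels sent")  # 2 of 12 pixels sent
    assert apply_delta(list(prev), delta) == curr
    ```

    This shows why typing would use almost no bandwidth while a full-screen scene change would send nearly everything.
    
    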
    ascii wrote:
    I don't have any problem with using 2 tiles myself, as long as the driver hides that fact from any higher layers.

    If they keep the panels synced, it should be OK. You might get a tear line right down where the panel tiles join in some scenarios, like games, if they aren't set up properly.