Rumor: Apple could launch new 27-inch iMac with 5K Retina display this fall

Comments

  • Reply 81 of 104
    Quote:

    Originally Posted by Marvin View Post



    Thinking more about this, they could just send the difference between successive frames. So say they send a desktop image over, the display can buffer the frame and the OS takes the difference between the frame it knows the display has in a buffer and the new one, compresses the difference and sends that over.

     

    Then the movie you're watching changes scenes and the entire screen turns into a pixelated mess for a couple of seconds until the compressor catches up. Or you switch from Safari to Photoshop and the same thing happens. Or you minimize an app with a large window to expose the desktop and... well, you get the idea.

     

    We already know what happens to inter-frame compressed video when there isn't enough bandwidth to support several frames at full pop. It's a bad idea.

  • Reply 82 of 104
    melgrossmelgross Posts: 33,510member
    Question: Thunderbolt 2 doesn’t? I don’t remember offhand about Thunderbolt 3, but then again Intel hasn’t published much on that.


    How? The 30” Cinema Display was the first of its kind. The retina MacBook Pro was the first of its kind. I don’t remember about the panel in the first PowerBook, but the 1920x1200 17” wasn’t used anywhere.

    The 30” Cinema Display debuted at $3,299, remember. Your guess sounds about right.


    See, there you go. Problem solved.

    Ah, none of those problems are solved.

    The 30" screen was, as I remember a cut sown 32" screen. Don't remember about the 17" offhand.

    The $3,299 was a long time ago. In today's dollars it would be close to $4,000.

    Thunderbolt 3 hasn't even been discussed. Maybe in a year or so. DisplayPort 1.3 hardware is a ways off. Sometime next year is my guess.
  • Reply 83 of 104
    melgrossmelgross Posts: 33,510member
    I understand what you're saying, but I don't understand WHY you would want to work that way.

    Why would you want to have tools on one screen and video on another? There's less visual hunting involved if they're both on the same screen.

    Because that's the way most high-end video, photo and print editing is done. That way you have a full, unobstructed screen for both the image and the tools.
  • Reply 84 of 104
    melgrossmelgross Posts: 33,510member
    Marvin wrote: »
    It might have enough bandwidth for 5K but it's pushing it to the limit.

    5120x2880 x 24bpp x 60Hz = 21.2Gbps

    Thunderbolt 2 has ~20Gbps. The DisplayPort 1.3 standard supports up to 8K. Some Macs had a passthrough mode on the port to support higher resolutions, but this means the port can't be used for anything else.

    Some parts of the display industry want to push forward to 8K (7680x4320 = 48Gbps).

    The problem is sending uncompressed data over the wire. It can be compressed like how Apple does HDMI over Lightning. There's a standard coming for this:

    http://www.pcworld.com/article/2146462/prepare-for-fake-4k-new-standard-will-silently-compress-pc-tablet-video.html
    http://www.vesa.org/news/vesa-and-mipi-alliance-announce-the-adoption-of-vesas-new-display-stream-compression-standard/

    It specifies visually lossless (but not mathematically lossless) compression that saves up to 66% of the data. The ratio will depend on what's being displayed. It can't use inter-frame compression, as this would introduce too much lag; it has to compress each frame individually, which limits how much it can save. They'd have to make sure the worst-case scenario could still fit over the available hardware, but they can ensure that by increasing the visual loss. A 66% saving, though, would allow 8K to go over a 20Gbps connection.

    The problem is that editing on that would be tricky. Global adjustments would work, but any pixel-level work would be dicey. We would need some clever software to substitute an uncompressed area for a compressed one.
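
    For reference, the uncompressed figures quoted above are easy to sanity-check. Here is a quick sketch in Python; it assumes 24 bits per pixel, 60Hz, roughly 20Gbps of usable Thunderbolt 2 bandwidth and the best-case 66% saving quoted for the VESA compression standard:

    ```python
    # Sanity check of the uncompressed figures quoted above: 24 bits per pixel,
    # 60 Hz, ~20 Gbps of usable Thunderbolt 2 bandwidth, and the best-case 66%
    # saving quoted for VESA's "visually lossless" Display Stream Compression.

    def raw_gbps(width, height, hz=60, bpp=24):
        """Uncompressed video bandwidth in gigabits per second."""
        return width * height * bpp * hz / 1e9

    TB2_GBPS = 20.0

    for name, (w, h) in {"5K": (5120, 2880), "8K": (7680, 4320)}.items():
        raw = raw_gbps(w, h)
        dsc = raw * (1 - 0.66)          # best-case 66% saving
        print(f"{name}: {raw:5.1f} Gbps raw "
              f"({'fits' if raw <= TB2_GBPS else 'too big for'} TB2), "
              f"{dsc:5.1f} Gbps compressed "
              f"({'fits' if dsc <= TB2_GBPS else 'too big for'} TB2)")
    ```

    It reproduces the ~21.2Gbps and ~48Gbps raw figures above; neither fits a 20Gbps link uncompressed, while both would with the best-case 66% saving.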
  • Reply 85 of 104
    Quote:

    Originally Posted by mdriftmeyer View Post



    DisplayPort 1.3 is to be finalized before fall and supports up to 8K displays.

     

    The future is today.

  • Reply 86 of 104
    Originally Posted by melgross View Post

    Ah, none of those problems are solved.

     

    Of course it’s solved. We have the spec ready to accept the display resolution Apple is rumored to want. Manufacture of said display is nothing compared to that.

  • Reply 87 of 104
    melgross wrote: »
    Except what does this resolution match up to? 4K is the new standard. 5K is not. There is experimental work being done on 6K, which may not see the light of day, and on 8K, the purpose of which is debatable even for full-size theaters. I personally prefer 16:10 monitors, but that's just me. But I would rather use another, smaller monitor for the editor than have a 5K for both.

    It should be noted that the Red Epic Dragon digital cinema-quality camera shoots at 6K and is used now on movie productions (the Hobbit movies, for example).
  • Reply 88 of 104

    While DisplayPort 1.3 is now official, it could take several months for parts to be manufactured.

     

    And it seems more Apple-like to bake DP 1.3 into Thunderbolt 3, which isn't scheduled to debut until Skylake hits (next summer or so?).

     

    Furthermore, I haven't seen anyone suggesting a graphics card that can drive 5K and be used in a relatively affordable iMac.

    27" iMacs are in widespread use in the graphics and publishing industries, so running the Adobe suites well is a must.

     

    I'd like to see a 5K iMac this fall, just don't see how it's possible.

  • Reply 89 of 104
    Quote:

    Originally Posted by melgross View Post



    Because that's the way most high-end video, photo and print editing is done.

     

    So? I mean no disrespect, but what difference does it make how anyone else does anything? By that reasoning I would still be cutting tape with a razor blade because "that's how it's done." At the time, it was. Then tools allowing for a better workflow came along and that wasn't the way it was done anymore. I know I'm overstating the comparison, but you get the idea. It's easy to let the comfort of status quo cloud our perception of alternatives.

     

    Quote:
    Originally Posted by melgross View Post



    That way you have a full, unobstructed screen for both the image and the tools.

     

    With a 4K screen you could have full, unobstructed views of both on the same display. I agree that having to move stuff around to see what you're working on is a pain in the ass, but with a big, hi-res screen you wouldn't have that problem. The video can be displayed at 100% dead centre with the menu above, tools and palettes on either side and timeline below. To me that beats the heck out of going back and forth between two displays, one to do things and another to see what I did.

     

    For the kind of stuff I do -- photo editing, print layout, video editing, audio mixing -- I'd welcome being able to get rid of another display taking up valuable space. Obviously YMMV.

  • Reply 90 of 104
    Quote:

    Originally Posted by SpamSandwich View Post



    It should be noted that the Red Epic Dragon digital cinema-quality camera shoots at 6K and is used now on movie productions (the Hobbit movies, for example).

     

    Of course, Red is notorious for always just doing whatever the hell they want and leaving it up to the user to figure out how to integrate the product into their workflow. In the early days there was an entire industry built around creating conversion utilities for Red media and metadata!

     

    If Sony or Panasonic (or Arri or Panavision) release 6K products it will mean something. Red doing it is just par for the course! :)

  • Reply 91 of 104
    pbpb Posts: 4,255member

    Excuse me if it was already answered and I missed it, but can anyone estimate what kind of GPU power such an iMac would need? And I mean not only to run general/everyday tasks with acceptable fluidity, but also to play games at least as well as the iMacs of the current generation.

     

    Is that possible in such a thin machine? Or should we get ready for a steep graphics performance decrease?

  • Reply 92 of 104
    MarvinMarvin Posts: 15,320moderator
    Then the movie you're watching changes scenes and the entire screen turns into a pixelated mess for a couple of seconds until the compressor catches up. Or you switch from Safari to Photoshop and the same thing happens. Or you minimize an app with a large window to expose the desktop and... well, you get the idea.

    We already know what happens to inter-frame compressed video when there isn't enough bandwidth to support several frames at full pop. It's a bad idea.

    The pixelated mess happens with video when you skip to the middle of a block it can't decode fully, but I'm only talking about two frames, not a large group of frames.

    The display would have a frame buffer (db), the device sending the data has the same buffer, call it the source buffer (sb).
    The first time, it fills sb with the image and sends it using the visually lossless compression or in two passes (50% each) to db.
    The next time, it compares the new frame with sb and stores the difference or a way to get from sb to the new frame.
    This difference gets sent over the wire to db, which uses the same process to get to the new frame and updates db.

    At all times, it's just two frames - the current one and the last one - and getting from one to the other. During a major change, it rebuilds most or all of the buffer, but it's no big deal. It would be impossible for it to break up unless the buffer on the display was flushed, but it would know this happened and request a new frame. These changes would be happening in 1/60th of a second (a rough sketch of the scheme in code is at the end of this post).
    frank777 wrote:
    I'd like to see a 5K iMac this fall, just don't see how it's possible.

    Dell should have their 5K display out by the end of the year. The following site says they put two panel tiles together and use two DisplayPort 1.2 ports to drive it:

    http://www.techradar.com/news/computing-components/peripherals/dell-s-new-27-inch-5k-monitor-packs-as-much-details-as-7-monitors-1264376

    There's no reason for Apple to be holding the iMac back; Haswell refresh chips are available.
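
    For illustration, here is a minimal sketch of the two-buffer difference scheme described above, in Python rather than display-controller hardware. The buffer names sb/db follow the post; zlib and NumPy stand in for whatever lossless codec and hardware buffers a real link would use, and this is not how the VESA standard itself works:

    ```python
    import zlib
    import numpy as np

    # Minimal sketch of the sb/db scheme described above: the source keeps a
    # copy (sb) of what the display's buffer (db) already holds, sends only the
    # compressed XOR difference each refresh, and the display applies it.

    H, W = 2880, 5120                          # 5K frame, 8 bits per channel

    def send_frame(sb, db, new_frame):
        """Send one refresh worth of data; return the bytes put 'on the wire'."""
        diff = np.bitwise_xor(sb, new_frame)     # a way to get from sb to the new frame
        payload = zlib.compress(diff.tobytes())  # lossless, PNG-style
        sb[:] = new_frame                        # source side now matches

        # --- display side ---
        received = np.frombuffer(zlib.decompress(payload), dtype=np.uint8)
        db[:] = np.bitwise_xor(db, received.reshape(db.shape))
        return len(payload)

    sb = np.zeros((H, W, 3), dtype=np.uint8)   # source buffer
    db = np.zeros((H, W, 3), dtype=np.uint8)   # display buffer

    desktop = np.random.randint(0, 256, (H, W, 3), dtype=np.uint8)
    first = send_frame(sb, db, desktop)        # worst case: every pixel changes

    next_frame = desktop.copy()
    next_frame[100:120, 200:220] = 255         # tiny change on the next refresh
    second = send_frame(sb, db, next_frame)

    print(f"full-frame update: {first / 1e6:.1f} MB on the wire")
    print(f"small update:      {second / 1e6:.3f} MB on the wire")
    assert np.array_equal(db, next_frame)      # display ends up pixel-identical
    ```

    Real scene changes compress far better than the random noise used for the first frame here, but even that worst case only ever involves the current and previous frames, which is the point made above.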
  • Reply 93 of 104
    Quote:

    Originally Posted by Marvin View Post



    The pixelated mess happens with video if you skip to the middle of a block it can't decode fully

     

    Picking nits, but one doesn't have to skip ahead to disrupt a real-time compressor. If the system doesn't have enough bandwidth to pass at least one block uncompressed, all it takes is for the number of pixels changed from one frame to the next to exceed the processor's ability to either recalculate the difference or start a new block. A scene change can be enough, as can a fireball or flash of lightning.

     

    Quote:
    Originally Posted by Marvin View Post



    The display would have a frame buffer (db), the device sending the data has the same buffer, call it the source buffer (sb).

    The first time, it fills sb with the image and sends it using the visually lossless compression or in two passes (50% each) to db.

    The next time, it compares the new frame with sb and stores the difference or a way to get from sb to the new frame.

    This difference gets sent over the wire to db, which uses the same process to get to the new frame and updates db.



    At all times, it's just two frames - the current and last one and getting from one to the other. During a major change, it rebuilds most or all of the buffer but it's no big deal. It would be impossible for it to break up unless the buffer on the display was flushed but it would know this happened and request a new frame. These changes would be happening in 1/60th of a second.

     

    I dunno, I'm out of my depth here, but it *seems* like the system would need a lot of "headroom" to ensure that a complete, uncompressed frame could be delivered whenever needed, without warning. If you can't guarantee that the pipe to and from the db can pass at least two uninterrupted uncompressed frames, you're gonna get glitches anytime the number of changed pixels exceeds the threshold, yes?

     

    Also, are current compression devices fast enough to work *literally* in real time? As in being able to spit out a finished frame within 1/60 of a second, every time, without fail?

  • Reply 94 of 104
    Quote:

    Originally Posted by Marvin View Post

    Dell should have their 5K display out by the end of the year. The following site says they put two panel tiles together and use two DisplayPort 1.2 ports to drive it:



    http://www.techradar.com/news/computing-components/peripherals/dell-s-new-27-inch-5k-monitor-packs-as-much-details-as-7-monitors-1264376



    There's no reason for Apple to be holding the iMac back, Haswell refresh chips are available.

     

    Yes, but does that approach really sound like Apple?

     

    Which is more likely for Apple: that they go with dual panels and devote engineering resources to macgyver a temporary solution, or that they wait for a single panel and devote their resources toward advancing DisplayPort 1.3 on the Mac?

  • Reply 95 of 104
    pbpb Posts: 4,255member
    Quote:

    Originally Posted by Marvin View Post



    Dell should have their 5K display out by the end of the year.

    Marvin, as I expressed previously, for me the big question is how Apple can offer such a machine without it melting down from GPU heat during intensive tasks. I really don't see how this is possible today in such a slim enclosure.

  • Reply 96 of 104
    MarvinMarvin Posts: 15,320moderator
    Picking nits, but one doesn't have to skip ahead to disrupt a real-time compressor. If the system doesn't have enough bandwidth to pass at least one block uncompressed, all it takes is for the number of pixels changed from one frame to the next to exceed the processor's ability to either recalculate the difference or start a new block. A scene change can be enough, as can a fireball or flash of lightning.

    I dunno, I'm out of my depth here, but it *seems* like the system would need a lot of "headroom" to ensure that a complete, uncompressed frame could be delivered whenever needed, without warning. If you can't guarantee that the pipe to and from the db can pass at least two uninterrupted uncompressed frames, you're gonna get glitches anytime the number of changed pixels exceeds the threshold, yes?

    Also, are current compression devices fast enough to work *literally* in real time? As in being able to spit out a finished frame within 1/60 of a second, every time, without fail?

    If there's not enough sustained bandwidth then it will break, but with the display nothing has to compete for bandwidth, and it would simply lower the refresh rather than show partial frames. Take Thunderbolt 2 for example: it has around 20Gbps. 5K @ 60Hz is around that too, but that's maxing it out. To get the first full frame buffer, they can send 50% on the first refresh cycle and the next 50% on the 2nd - this is like the display operating at 30Hz, but it only does this for 1/30th of a second when the display is first connected. On a major shift, almost no change will amount to a full frame, but even if it does, it's no big deal: they do the same 50/50 split for 1/30th of a second. Every other change would be a fraction of the size. Add in some actually lossless compression and it's totally feasible to run 5K over TB2 at 60Hz (a quick back-of-the-envelope check is at the end of this post).

    The compression speed or difference comparison is an issue to consider, but hardware is fast enough now to do this. They can compress FaceTime streams in real time with H.264/H.265, and the kind used for the display would be nothing as intense as that, because it doesn't have to compress 10:1 with inter-frame compression; it just has to save a small amount. With the difference comparison plus the lossless (e.g. PNG-style) compression I described, there would be very little overhead doing it in hardware. Driving 8K over TB2 would be trickier but might not be necessary just now - once you go Retina, that's all you need.
    frank777 wrote:
    Which is more likely for Apple: that they go with dual panels and devote engineering resources to macgyver a temporary solution, or that they wait for a single panel and devote their resources toward advancing Displayport 1.3 on the Mac?

    I think it would be OK if it works well enough. Dual panels might be the better option, and not just for bandwidth reasons. There's yield to consider, and the ability to reliably control the LCD panel electrically. I would expect Apple to go whichever route is best. If the end user has no perception that there are multiple tiles, it's no big deal. I suggested this as a possible route to making an affordable 4K TV: 24" 1080p IPS displays are under $300, so joining four together would give you a ~50" 4K panel for under $1200.

    If there's ever any hint that it's multiple tiles such as graphics tearing down the panel seam or something messed up in software then the single panel route would be better but if it's the only way to make affordable, high quality, large Retina displays, that's the route to go.
    pb wrote:
    for me the big question is how Apple can offer such a machine without it melting down from GPU heat during intensive tasks. I really don't see how this is possible today in such a slim enclosure.

    It depends on the intensive task. When gaming, you simply wouldn't run the game at 5K, just 1080p or 1440p as normal, and it will stretch the image out. In something like Maya, which has a viewport, the refresh of the viewport isn't that high, nor would it be that big, but it's still 4x the pixels, so on lower-end hardware there could be some slow-down. If there is slowdown, the solution would be to just run it at half-res. It's still valuable having a higher-resolution display even if a lower res is scaled up to it. 1080p has been tested upscaled to a 4K TV and it looks almost as good as native 4K content, and there's no extra overhead.
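
    To put rough numbers on the 50/50 split argument above, here is a small back-of-the-envelope check; it assumes a flat 20Gbps of usable Thunderbolt 2 bandwidth with no protocol overhead:

    ```python
    # Back-of-the-envelope check of the 50/50 split described above: can one
    # full, uncompressed 5K frame cross a ~20 Gbps link within two 1/60 s
    # refresh slots (the display momentarily behaving like a 30 Hz panel)?

    LINK_GBPS = 20.0                    # assumed usable Thunderbolt 2 bandwidth
    W, H, BPP, HZ = 5120, 2880, 24, 60

    frame_bits = W * H * BPP            # one uncompressed 5K frame
    slot_bits = LINK_GBPS * 1e9 / HZ    # link capacity per 1/60 s refresh slot

    print(f"frame size:        {frame_bits / 1e9:.3f} Gbit")
    print(f"one refresh slot:  {slot_bits / 1e9:.3f} Gbit")
    print(f"fits in one slot:  {frame_bits <= slot_bits}")
    print(f"fits in two slots: {frame_bits <= 2 * slot_bits}")
    ```

    A full 5K frame (about 0.35Gbit) just overflows a single 1/60s slot on a 20Gbps link, which is the 21.2Gbps vs ~20Gbps squeeze mentioned earlier, but it fits comfortably across two slots; every subsequent difference frame should be far smaller.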
  • Reply 97 of 104
    Quote:

    Originally Posted by Marvin View Post



    1080p has been tested upscaled to a 4K TV and it looks almost as good as native 4K content and there's no extra overhead.

     

    And that, right there, is why I believe 4K as a consumer delivery format is completely unnecessary.

     

    Note that I'm not saying that present video standards are "good enough" or that we shouldn't be trying to improve them, just that pixel resolution isn't the area in greatest need of improvement.

     

    And, of course, my objection has nothing to do with the value of high-resolution computer displays. In that application there's an actual benefit to having more pixels.

  • Reply 98 of 104
    melgrossmelgross Posts: 33,510member
    Of course it’s solved. We have the spec ready to accept the display resolution Apple is rumored to want. Manufacture of said display is nothing compared to that.

    No, it's not solved. Your saying it doesn't make it so. And we won't see new-spec DisplayPort until sometime in 2015. The chips aren't even designed yet.
  • Reply 99 of 104
    melgrossmelgross Posts: 33,510member
    It should be noted that the Red Epic Dragon digital cinema-quality camera shoots at 6K and is used now on movie productions (the Hobbit movies, for example).

    That's right, 6K. We don't know how long that will be around, as the industry is pushing for 8K.
  • Reply 100 of 104
    melgrossmelgross Posts: 33,510member
    So? I mean no disrespect, but what difference does it make how anyone else does anything? By that reasoning I would still be cutting tape with a razor blade because "that's how it's done." At the time, it was. Then tools allowing for a better workflow came along and that wasn't the way it was done anymore. I know I'm overstating the comparison, but you get the idea. It's easy to let the comfort of status quo cloud our perception of alternatives.


    With a 4K screen you could have full, unobstructed views of both on the same display. I agree that having to move stuff around to see what you're working on is a pain in the ass, but with a big, hi-res screen you wouldn't have that problem. The video can be displayed at 100% dead centre with the menu above, tools and palettes on either side and timeline below. To me that beats the heck out of going back and forth between two displays, one to do things and another to see what I did.

    For the kind of stuff I do -- photo editing, print layout, video editing, audio mixing -- I'd welcome being able to get rid of another display taking up valuable space. Obviously YMMV.

    This was part of my company's business for years. I know others. It's a general way of doing things. Don't knock it unless you're doing it. You can do it with one screen. But with so many screens going 16:9, there's no room for anything other than the video. Bigger screens won't help if they're also 16:9. You realize that, right? And no one is going to want to split the tools into two portions on either side of the screen.

    And what about the timelines? Where do they go?