Video: Putting the iMac Pro thermals to the test


Comments

  • Reply 41 of 51
    dysamoria Posts: 3,430 member
    macxpress said:
    Speaking of thermals, the new Chicago Apple Store was designed with delusion in mind. It’s not able to withstand US Midwestern cold, and the glass is cracking. Never mind that they apparently didn’t think about icicles falling on people going down the stairs.
    And as noted by others who live in the area... a lot of buildings around there have the same issues and have signs around them warning people about the possibility of falling icicles. This doesn't appear to be anything new for that area, but because Apple has these signs outside their store, all of a sudden it's a major issue and Apple has yet again failed.

    https://forums.appleinsider.com/discussion/comment/3020249/#Comment_3020249

    Folks in Canada are also having the same issues with glass cracking because of the extreme cold temperatures and wind. 

    http://mashable.com/2017/12/28/arctic-blast-record-cold-us-canada-frigid-new-year/#iinuM..lDPqb
    But the others are older buildings, no? Lessons could've been learned. 
  • Reply 42 of 51
    dysamoria Posts: 3,430 member
    g-news said:
    Throttling or not, it's pretty obvious, even without having looked at the numbers, that the iMac Pro is another design compromise in a thin enclosure. Running the chip at 90-94°C over extended periods of time is going to cause problems, I can guarantee you that. In a few months to a year, the first heavy users will be sending their units in for repair on either shot CPUs or GPUs.

    Thermals have been an Apple pitfall for years; they just don't want to get it. Meanwhile my cheap-ass HP Omen maxes out around 60°C and is still quiet enough to not be annoying. Of course it looks like a black shoebox, but hey, physics is physics.
    My client-issued laptop is a Dell POS and its noisy fans are constantly turning on and off, even while idle. No, I think Apple gets thermals very well.
    Depends on your focus. I hear my friends' PC laptops doing the same, and the "hard work" these machines are asked to do amounts to web browsers, backups, and email. They sound ridiculously loud, and then I sit there doing music on my MacBook and it only ramps up the fan when a Reaktor ensemble pegs the CPU at 100% for a bit... which was an out-of-hand situation that I ended ASAP, because it was a badly optimized ensemble running at 140 kHz (typically the audio processing is done at 44 kHz). I don't know how my friends' laptops' web browsers, mail, and backup programs could be demanding so much CPU. They probably aren't; it's just that the thermals suck and these laptops have only two speeds: fast and slow.

    If I were doing 3D rendering, I'd be leaving the room (because that's time I don't need to spend sitting there watching the computer) and wouldn't be annoyed by a loud machine. I WOULD be annoyed if there were a loss of CPU performance just to keep the computer within its limited tolerances for heat and noise. That's time lost on rendering.
  • Reply 43 of 51
    cgWerks Posts: 2,952 member
    dysamoria said:
    ... I don't know how my friends' laptops' web browsers, mail, and backup programs could be demanding so much CPU. They probably aren't; it's just that the thermals suck and these laptops have only two speeds: fast and slow.

    If I were doing 3D rendering, I'd be leaving the room (because that's time I don't need to spend sitting there watching the computer) and wouldn't be annoyed by a loud machine. I WOULD be annoyed if there were a loss of CPU performance just to keep the computer within its limited tolerances for heat and noise. That's time lost on rendering.
    It's probably a combo of crummy software and poorly designed thermals, sensors, etc. I've seen Windows machines jump up to 100% for just about anything... sometimes seemingly just to update the mouse cursor position and such, heh.

    Well, if you know you're just going to max a machine out for hours... maybe you'd just leave it alone and not care about noise. But part of the attraction of these kinds of machines (at least for me) is that maybe I could run Folding@home on a few cores, have a render going on another few cores, while I use yet another few cores to actually work on my project... while another core or two are running a virtual machine, etc. etc.

    Yes, if I were *just* going to full-out use all the cores to crunch something, that might be better offloaded to a dedicated machine (if available), or even set up these days on Amazon or Google cloud computing (i.e., a bunch of computing resources on demand where you just pay for what you use, rather than having expensive local equipment that sits idle much of the time).

    And some will care about noise more than others. If you're in a relatively noisy office environment anyway, maybe it's not a huge deal. If you're in your home office listening to some music while you work, or recording YouTube videos, maybe the noise wouldn't be welcome. But it sounds like for most applications the noise wouldn't be an issue unless you're really pushing it... which is good news.

    My only concern now (aside from the iMac not being the best form-factor for me) is any impact of the relatively high heat on surrounding components. I suppose we won't know that for years, though.
  • Reply 44 of 51
    welshdog Posts: 1,928 member
    dws-2 said:
    the internal core temperature just gets too hot after a while, and no reasonable cooling system would help.
    This seems likely.
  • Reply 45 of 51
    welshdog Posts: 1,928 member
    mike54 said:
    These tests were done at an ambient room temperature of 70°F (21°C). That's quite cool.
    Performance is going to drop off, and the fans are definitely going to be more than audible, when it's >90°F (32°C).
    I think the iMac Pro is designed to be used in a well air-conditioned room.
    When I managed the equipment at a big video post facility, we kept ALL Macs doing profit-center work in a big cooled machine room. We used fiber-optic extenders to get keyboard, mouse, Wacom, and dual DVI monitors to the various edit suites - some of which were 200 feet from the machine room. If you want your "pro" machine to work reliably and keep working, you put it in a cool room. Operating a high-strung computer at 90°F is asking for trouble. While our computers were all old-style Mac Pros, the same would be true for a high-power iMac. Keep it in a cool room and it (and you) will be happy.
  • Reply 46 of 51
    Marvin Posts: 15,578 moderator
    cgWerks said:
    Marvin said:
    It's pretty rare that any task being run on a highly multi-core machine will use exactly 100% of the cores at all times; potentially dropping 10% is meaningless.
    I'm not sure I'd go quite that far. I could run Folding@home and use them all up 24x7, or fire up some 3D rendering engines and easily put them all at 100% for days too. I suppose it could be argued that this isn't what this machine is for. But, being a 'pro' machine, it should be able to handle it... which it seems it can, just with some performance degradation. (If it doesn't hurt surrounding components.)

    That's fine, so long as it is understood. But if you are a pro comparing two machines and one gives 100% and the other 90%, that's over a month out of a year in potential difference. Most people just aren't going to actually experience it because their work won't push it there enough.
    The main performance drop was noted when both CPU and GPU were maxed out; 3D rendering and Folding@home typically use one or the other. If you force the machine to use both at once for a long time, then you are creating a problem situation and then complaining about it; same with maxing it out in a warm environment. The video states that this is not a test that reflects realistic scenarios; it's testing the limits of the machine. A more real-world test would be to see the performance drop on the CPU and GPU separately over a period of an hour or two. With only the CPU or GPU at max load, you are cutting the overall load nearly in half. Also important to note: if performance drops a certain amount after a task is run, that doesn't matter at all if the task is finished.

    As for it being a month out of the year difference: if you are planning to buy a $5k+ machine to have it sitting unused while it churns away with both CPU and GPU maxed out for an entire year, then you have probably bought the wrong machine for the task. If you want a rendering node, then buy a rendering node or rent a server; you don't need a 5K display, and you can buy more than one GPU if you want GPU computing.

    This machine will be used in workflows where people render out sequences for a few hours at a time maximum, using either CPU or GPU, and for fast real-time processing like 8K frames, and it will handle this very well.
    edited December 2017
  • Reply 47 of 51
    dysamoria Posts: 3,430 member
    So this is going to be a bad choice for gaming. Just like my MacBook Pro 3,1 (which killed itself from thermal issues, mostly the notable GPU issue in that model). This was never going to be a proper gaming system, and that surprises nobody. Still, I would like to be able to dual-boot to Windows for moderate gaming without worrying that it'll kill my computer (I won't be buying a gaming PC, but I do want to be able to play some stuff that's come out over the last five years).

    I really do appreciate the quiet aspect of my Macs. I mostly do music and photography, and neither is a task I want to listen to noise during. I hate noise, especially when it's not the result of intentionally pushing the system (like my previously mentioned friends' PC laptops, and my room-heating PC, which heats the room even when it's idle). My use case fits the "quiet machine" model, but I'm not confident about the longevity of the machine given the heating and cooling stress. I'm not comfortable with such narrow tolerances. I don't appreciate being victim to a computer with one component that ends up becoming notorious for failure due to thermals. I've been down that road before with a Mac and don't want it again, especially since the next new Mac I buy may be the last computer I ever buy, having spent the last of my money on it. (No one cares about the arts anymore, so the machine likely won't be paying my bills.)

    I want something engineered to last, like the old Mac Pro models (which I was tempted to buy used, but they're just too far behind from a computer-evolution standpoint). Even better: something that's repairable for when something does die... repairable within the budget of a normal person, without replacing more than 50% of the whole system to fix one bad component... affordable to someone who doesn't make his living owning a render farm, with the money to dispose of three-year-old machines because they've been superseded by something 5% faster. If companies can afford to do that, well, great for them, but it's still awful to make disposable products, regardless.

    I won't be that kind of "Pro" and I don't see why that's the only kind of "Pro" Apple should serve. If they allow Wall Street mentality to drive them to sell only intentionally wasteful products, then that's a bad company. Even a company replacing a fleet of computers can hold on to their existing displays. A well-made display can last the life of multiple computers, especially for companies that frequently replace their computers for processing power increases, not marginal display improvements. Even for artists, displays aren't changing enough to justify spending another $1200 every three years. Until and unless we get a retina-resolution micro LED display, there just won't be a reason for artists to toss a working 27" Retina display. After we get such a display (retina, micro LED), display improvements will again return to "infrequent".

    I'm still waiting for the proper machine that is a mix of elegance and long-lasting power. Apple has done it before, and they can do it again. The iMac Pro isn't it. It's a needlessly expensive all-in-one, with clear downsides for people without disposable income, and also for people with a business need to utilize every drop of power (a similarly priced/specced PC workstation will perform better on high-power tasks because it won't be throttled). Due to an intentionally cramped design and disposability, this is a luxury product that can impress at some tasks but not at others. Anyone who buys one of these and is well served by it, I'm happy for them. I wouldn't refuse one donated to me, but it's definitely not a remotely sensible investment for me (and no one is donating one to me, so I still need to wait for Apple to present the right product to me).
  • Reply 48 of 51
    cgWerks Posts: 2,952 member
    welshdog said:
    While our computers were all old-style Mac Pros, the same would be true for a high-power iMac. Keep it in a cool room and it (and you) will be happy.
    Easier said than done, though. I've worked for Fortune 100 companies that couldn't keep consistent temps in our office areas. A server room isn't the same as typical office space. But, yeah, 90°F seems a bit excessive unless you have no access to any kind of climate control... 75-80°F wouldn't be uncommon, though.

    Marvin said:
    The main performance drop was noted when both CPU and GPU were maxed out; 3D rendering and Folding@home typically use one or the other. If you force the machine to use both at once for a long time, then you are creating a problem situation and then complaining about it; same with maxing it out in a warm environment. The video states that this is not a test that reflects realistic scenarios; it's testing the limits of the machine. A more real-world test would be to see the performance drop on the CPU and GPU separately over a period of an hour or two. With only the CPU or GPU at max load, you are cutting the overall load nearly in half. Also important to note: if performance drops a certain amount after a task is run, that doesn't matter at all if the task is finished.

    As for it being a month out of the year difference: if you are planning to buy a $5k+ machine to have it sitting unused while it churns away with both CPU and GPU maxed out for an entire year, then you have probably bought the wrong machine for the task. If you want a rendering node, then buy a rendering node or rent a server; you don't need a 5K display, and you can buy more than one GPU if you want GPU computing.

    This machine will be used in workflows where people render out sequences for a few hours at a time maximum, using either CPU or GPU, and for fast real-time processing like 8K frames, and it will handle this very well.
    Fair enough, and I agree that, real-world, you're seldom going to max both for extended periods of time. Though maxing one, with partial engagement of the other, might not be uncommon at all... and maybe the cooling is totally adequate for that without impact, even long-term.

    re: month - I was just noting that a 10% performance difference is bigger than people realize when you start putting it into those kinds of terms (I've seen people switch hardware for less than 10%), but I agree that it would be hard to actually experience that 10% difference here. Maybe more like a percent or two in the most demanding environments, would be my guess.
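
    For rough numbers (assuming, hypothetically, a box churning flat-out around the clock all year):
    0.10 × 365 days ≈ 36.5 days ≈ 1.2 months of lost throughput, versus only about 4-7 days for a 1-2% difference.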

    That said, Apple doesn't really fill that spot well until the new Mac Pro comes out, so it's a legitimate complaint in trying to stay with the platform. The rendering software I use can run on several platforms or be put into cloud computing, but that isn't the case for some things.

    I'm also playing devil's advocate a bit here, as this machine would be ideal-to-overkill for me and is currently above my budget anyway. My involvement in this thread comes more from my interest in pro hardware and my background with it... and a bit of dreaming, until Apple provides something between entry-level and this. Aside from there being no alternate-source video inputs, this is a dream machine for me.

    dysamoria said:
    So this is going to be a bad choice for gaming. Just like my MacBook Pro 3,1 (which killed itself from thermal issues, mostly the notable GPU issue in that model). This was never going to be a proper gaming system, and that surprises nobody. Still, I would like to be able to dual-boot to Windows for moderate gaming without worrying that it'll kill my computer (I won't be buying a gaming PC, but I do want to be able to play some stuff that's come out over the last five years).

    I really do appreciate the quiet aspect of my Macs. I mostly do music and photography, and neither is a task I want to listen to noise during. I hate noise, especially when it's not the result of intentionally pushing the system (like my previously mentioned friends' PC laptops, and my room-heating PC, which heats the room even when it's idle). My use case fits the "quiet machine" model, but I'm not confident about the longevity of the machine given the heating and cooling stress. I'm not comfortable with such narrow tolerances. I don't appreciate being victim to a computer with one component that ends up becoming notorious for failure due to thermals. I've been down that road before with a Mac and don't want it again, especially since the next new Mac I buy may be the last computer I ever buy, having spent the last of my money on it. (No one cares about the arts anymore, so the machine likely won't be paying my bills.) ...
    I agree with much of what you said, and I'm in a similar situation right now. And I think the industry is moving increasingly in that direction (i.e., independent and small creative agencies instead of big corporate operations). This machine is more clearly aimed at the latter.

    You're also correct that, while no one in their right mind is going to buy one of these for gaming, someone buying one for their day job might also like to do a bit of gaming at night without having to also invest in a gaming rig, etc.

    And a lot of home offices and small-business applications do care about noise, so I'm glad to see this thing doesn't get too noisy unless it is really pushed (and in a way that would be a bit unnatural for normal workflows). They seem to have done a good job there, so long as there isn't thermal damage occurring anyway... which I'm not sure we'll know for a long time.

    I've also been burned by that (pun intended), so that's why I'm questioning so much in that area. I've lost two MacBook Pros prematurely over the years, personally... and I've seen a few others that I had to deal with in my corporate IT role (responsible for a dozen or so developer machines), so it wasn't exactly an uncommon issue.
  • Reply 49 of 51
    kgelner01
    For me as a working professional, I don't mind slightly lower performance for much quieter fans. The quieter fans are a big reason I opted to get an iMac Pro, and I have not been disappointed... it's not like the performance drop is that much. I agree it will be nice when people can opt to maximize fans for increased performance but I probably would not take that route.
  • Reply 50 of 51
    cgWerks Posts: 2,952 member
    kgelner01 said:
    For me as a working professional, I don't mind slightly lower performance for much quieter fans. The quieter fans are a big reason I opted to get an iMac Pro, and I have not been disappointed... it's not like the performance drop is that much. I agree it will be nice when people can opt to maximize fans for increased performance but I probably would not take that route.
    If you are in a quiet office, is it 'virtually silent' in the sense that you really can't hear the fans? Or is it more that you know they are running, but at a pretty low level?