The new Mac Pro might get Intel's new 28-core 5 GHz Xeon processor


Comments

  • Reply 41 of 50
    cgWerks Posts: 2,952, member
    tht said:
    Intel simply lies about their TDP ratings now. Maybe we can continue to believe that Intel TDP is the minimum clock rate at which all cores will run sustained for hours, but I can’t see how we can use Intel’s advertised TDP as a gauge for anything now. ...

    Personally, my 2013 iMac 27 is aging. I would like a bigger display, maybe 32”, 8 TB storage, a better sound system, better port access (don’t like reaching around), and a quiet and cool system. I would like more performance than my Core i5 Haswell, but it shouldn’t come at the cost of a noisier and hotter system. So, I would like to take the base 6-core model if Apple lets me load up the storage with that model. Unfortunately, 4 TB storage is the likely max.
    re: Intel and TDP - Yeah, that's something I've learned over the last six months to a year as well, and it has been adjusting my perspective.

    I guess my question, though, is why doesn't Apple design for reality? While I might not fully understand the shift in Intel's specs, the people at Apple designing, building, and testing these systems certainly realize the shift that has taken place. They didn't just read Intel's TDP spec, design the system, and then go, 'uh oh, this thing runs way too hot.'

    While higher-power systems require special design, they could address anything under 200 W by designing the system properly to handle it, I'd think. My gosh, didn't Sony PS3s use something like 180 W and somehow live in entertainment cabinets and such? It doesn't seem like an unsolvable problem. The problem seems to be that Apple doesn't want to face that reality, and wants to stuff them in even smaller enclosures with some percentage of the previous design's cooling capacity.

    tht said:
    ...
    These are desktop systems. A 500 W system, when running maxed out, is transferring 500 W of heat from the computer to the area around your desk. It’s like running a hairdryer, a countertop microwave, or five 100 W incandescent light bulbs for an hour right on top of your desk. That has consequences in terms of making your room hot, and noisy if the fans aren’t designed right. Apple typically chooses to build systems that use as little power and generate as little noise as possible.

    For the iMac Pro, Apple isn’t budging from 500 W imo. For the Mac Pro, I think it would be a miracle if they design it for 1000 W. Somewhere around 1500 W is the upper limit. Higher than that, you’ll need the 220 V or 30 amp circuits that electric dryers and ovens use. They are not going to design a system that requires you to install a new circuit in your home. (A 4-card crypto mining rig with 250 W GPUs is pretty close to maxing out the 110 V, 15 amp circuits typically used for power outlets in the USA.)
    Well, careful with the hairdryer or microwave... those things are more like 1200 to 1500+ watts. But that's a great point about special wiring. Once you get over 1500 W (or even less if you have other devices on the same circuit), you don't just sell them to typical consumers or even pros. Suddenly, you're thinking about the environment (i.e.: A/C), airflow (outside the case, like in server rooms), power requirements and building design, etc.
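    To put rough numbers on the circuit limit (a back-of-the-envelope sketch; the rig wattages are illustrative guesses, not measurements):

    ```python
    # Rough headroom check for a standard North American 120 V / 15 A
    # branch circuit (the nominal "110 V" outlet mentioned above). The
    # 0.8 factor is the usual derating for continuous loads (NEC rule
    # of thumb).
    VOLTS = 120
    AMPS = 15
    CONTINUOUS = 0.8

    circuit_watts = VOLTS * AMPS              # 1800 W absolute ceiling
    safe_watts = circuit_watts * CONTINUOUS   # ~1440 W sustained

    # Hypothetical 4-GPU rig like the crypto-miner example above.
    gpu_watts = 4 * 250     # four 250 W cards
    rest_of_system = 300    # assumed: CPU, board, drives, PSU losses
    rig_watts = gpu_watts + rest_of_system

    print(f"Ceiling: {circuit_watts} W, safe sustained: {safe_watts:.0f} W")
    print(f"Rig draw: {rig_watts} W, headroom: {safe_watts - rig_watts:.0f} W")
    ```

    At roughly 1300 W sustained, one rig alone eats nearly the whole circuit before you plug in a display or anything else.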

    Even a high-end graphics artist or scientist usually doesn't have an office with specially designed power delivery and cooling.
    edited October 2018
  • Reply 42 of 50
    tht Posts: 5,441, member
    cgWerks said:
    I guess my question, though, is why doesn't Apple design for reality? ... The problem seems to be that Apple doesn't want to face that reality, and wants to stuff them in even smaller enclosures with some percentage of the previous design's cooling capacity.
    I think they do. The iMac 5K basically uses the top-of-the-line 95 W CPUs, and the Radeon Pro 580 is basically 0.9x of a regular desktop 580 (I think). It's just that there is a big divergence between what type of Macs they want to produce and what type of Macs people want them to produce. Well, at least in the 2012 to 2017 time frame, when Apple seemed only interested in selling iMacs, MBP and MB devices.

    They left the Mac Pro, Mac mini and MacBook Air to wither away for whatever reasons. It finally looks like they are understanding the PC market better, if the rumors of a more “professional” mini and Mac Pro are true. Then, the low-end laptop will hopefully be unified and a new iMac introduced in a couple of weeks. I’m just hoping someone writes a book about Mac development in the 2010s, because such monumental mistakes deserve to be written down and read about.

    Well, careful with the hairdryer or microwave... those things are more like 1200 to 1500+ watts. But that's a great point about special wiring. Once you get over 1500 W (or even less if you have other devices on the same circuit), you don't just sell them to typical consumers or even pros. Suddenly, you're thinking about the environment (i.e.: A/C), airflow (outside the case, like in server rooms), power requirements and building design, etc.

    Even a high-end graphics artist or scientist usually doesn't have an office with specially designed power delivery and cooling.
    I was mostly thinking of 0.8 ft^3 countertop microwaves. The 1.5 ft^3 microwaves are indeed around 1200 W. You’re right about hairdryers. It just seems wrong for those things to be about 1800 W. I thought the small ones would be the most common, but it appears most of them, small or large, require >1500 W.

    Cooling is indeed a concern. We had 128-core and 256-core server racks sitting in an office a decade or so back. After outgrowing the building's central cooling capacity, we had to plumb cooling lines something like 50 ft long to dump the heat outside. Now, 2000 cores are in a different building, and cooling is still a challenge. Never mind the stories of all these gaming boxes creating hot rooms...
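    For anyone sizing the cooling side: essentially every watt a machine draws ends up as heat the A/C has to remove, and cooling capacity is quoted in BTU/hr or "tons". A minimal conversion sketch (the example wattages are illustrative, not measurements from that office):

    ```python
    # Every watt a computer draws ends up as heat in the room.
    # Conversions: 1 W = 3.412 BTU/hr; 1 "ton" of cooling = 12,000 BTU/hr.
    def watts_to_btu_hr(watts: float) -> float:
        return watts * 3.412

    # Example loads: a 500 W desktop, a 1500 W tower, a small rack.
    for load_w in (500, 1500, 10_000):
        btu = watts_to_btu_hr(load_w)
        print(f"{load_w:>6} W -> {btu:>8.0f} BTU/hr "
              f"({btu / 12_000:.2f} tons of cooling)")
    ```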
  • Reply 43 of 50
    cgWerks Posts: 2,952, member
    tht said:
    cgWerks said:
    I guess my question, though, is why doesn't Apple design for reality? ... The problem seems to be that Apple doesn't want to face that reality, and wants to stuff them in even smaller enclosures with some percentage of the previous design's cooling capacity.
    I think they do. The iMac 5K basically uses the top-of-the-line 95 W CPUs, and the Radeon Pro 580 is basically 0.9x of a regular desktop 580 (I think). It's just that there is a big divergence between what type of Macs they want to produce and what type of Macs people want them to produce. Well, at least in the 2012 to 2017 time frame, when Apple seemed only interested in selling iMacs, MBP and MB devices.

    They left the Mac Pro, Mac mini and MacBook Air to wither away for whatever reasons. It finally looks like they are understanding the PC market better, if the rumors of a more “professional” mini and Mac Pro are true. Then, the low-end laptop will hopefully be unified and a new iMac introduced in a couple of weeks. I’m just hoping someone writes a book about Mac development in the 2010s, because such monumental mistakes deserve to be written down and read about.
    And yet... they run them too hot to be reasonably quiet, and/or too hot to run hard for long periods without damaging something. In other words, they seem to design them at something like 120% of capacity, hoping average use won't push them over 100% too often, rather than designing for 80% so there's some margin to operate OK even in less-than-ideal environments.

    re: rest of the lineup and more recent correction - Yeah, incompetence aside, the only story that makes sense to me is that they were so 'forward thinking' that they misjudged how far along some kind of macOS --> iOS transition plan would be by now, and have had to backtrack a bit. It would make sense (in that story) if the Mac lineup became limited to some laptops and an iMac Pro, maybe with some transitional iMacs for the non-pro desktop people who refuse the transition initially.
  • Reply 44 of 50
    The old Mac Pro only supported PCIe 2.0, and PCIe 3.0 graphics cards are widely available now, so if Apple only supported 3.0, most people would be fine with that; the newer 4.0 standard would still be ahead of the pack.
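    For reference, the generation-over-generation jump is easy to quantify from the published PCIe transfer rates and line-code overhead (a quick generic sketch, not specific to any Mac or card):

    ```python
    # Per-lane PCIe bandwidth from published transfer rates and encoding
    # overhead: 2.0 uses 8b/10b encoding, 3.0 and 4.0 use 128b/130b.
    GENS = {
        "2.0": (5.0, 8 / 10),       # (GT/s per lane, encoding efficiency)
        "3.0": (8.0, 128 / 130),
        "4.0": (16.0, 128 / 130),
    }
    LANES = 16  # a typical graphics-card slot

    for gen, (gt_s, eff) in GENS.items():
        gb_s_lane = gt_s * eff / 8  # GB/s per lane (8 bits per byte)
        print(f"PCIe {gen} x{LANES}: {gb_s_lane:.2f} GB/s per lane, "
              f"{gb_s_lane * LANES:.1f} GB/s total")
    ```

    So a 3.0 x16 slot carries roughly double a 2.0 x16 slot (~15.8 vs ~8 GB/s), and 4.0 doubles it again.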
  • Reply 45 of 50
    nht Posts: 4,522, member
    Kind of a necro thread, but it is amusing that much of the whining in October was answered by the $1099 6-core 3.2 GHz Core i7 mini.
  • Reply 46 of 50
    nht said:
    Kind of a necro thread, but it is amusing that much of the whining in October was answered by the $1099 6-core 3.2 GHz Core i7 mini.
    Actually, everyone whining has either bought a PC or is planning to.

    I'd be curious to see how much Greyscale Gorilla's switch to PC has shifted users over to PCs. His demo of his new machine is definitely an inspiration to make that switch.
  • Reply 47 of 50
    mike fix said:
    nht said:
    Kind of a necro thread, but it is amusing that much of the whining in October was answered by the $1099 6-core 3.2 GHz Core i7 mini.
    Actually, everyone whining has either bought a PC or is planning to.

    I'd be curious to see how much Greyscale Gorilla's switch to PC has shifted users over to PCs. His demo of his new machine is definitely an inspiration to make that switch.
    I didn't even know he was a Mac person to begin with. Funny. Most of the 3D people around me are on PC these days, but I feel like that's always kinda been the case. I bought a maxed out 15" MBP with Vega 20 to get as much mograph/3D work out of it as I can, with the idea that I can expand a bit with eGPU. I'm not going to Windows for any reason, ever, unless I was building like a small headless render farm or something.
  • Reply 48 of 50
    cgWerks Posts: 2,952, member
    fastasleep said:
    I didn't even know he was a Mac person to begin with. Funny. Most of the 3D people around me are on PC these days, but I feel like that's always kinda been the case. I bought a maxed out 15" MBP with Vega 20 to get as much mograph/3D work out of it as I can, with the idea that I can expand a bit with eGPU. I'm not going to Windows for any reason, ever, unless I was building like a small headless render farm or something.
    Yeah, much of that industry was always on PC, but not necessarily with the best software or even the fastest hardware. I'm an old Electric Image Animation System user, which was Mac-only and pretty much the 3D app used by the industry's top folks. And I also remember that just before Apple switched to Intel, the G5 was smoking even custom PC builds on render speed.

    Also, 'best software' can be kind of subjective. Sometimes best means superior workgroup capability, or customizability, or compatibility, etc., instead of superior features or performance. I'm currently learning Autodesk Revit... and it's an incredibly deep app in terms of BIM, but I was doing more advanced 3D solids modeling back in the late 90s, on a Mac, with a much better UI and toolset.

    So, a lot of it depends on specific needs. I can't run Revit on macOS, so I'm in Boot Camp. There's a lot of PC-only software in the 3D/CAD area, especially the more specialized stuff. And these days, custom PCs can run circles around our Macs. If you're going to run a lot of PC software and can get a better PC, it would seem kind of silly to stick with the Mac. BUT, these things are only part of my workflow, so I need to maximize along more than just one vector, which keeps me on the Mac.

    BTW, I'm pretty happy with my eGPU so far (Blackmagic). And after installing Turbo Boost Switcher, I've tamed the i7 in my Mac mini as well... at least when I'm going for quiet. I can just barely hear them running once in a while. Unfortunately, if I turn Turbo Boost back on and do much of anything, the mini gets noisy rather quickly (which defeats the purpose of a quiet eGPU like the Blackmagic). The only time I've really heard the Blackmagic is when the driver in Boot Camp wasn't installing correctly, and it would just randomly spin up to 1600 RPM until a reboot (normally it runs around 500 RPM). Then it sounded like I had a furnace or AC unit sitting on my desk. :smiley:

    re: render farm - I know some of the people I follow use Amazon or Google cloud computing now instead of building in-house render farms. If you haven't considered that, you might want to look into it. They don't have to have money tied up in hardware, and can scale their needs up or down rather quickly.
    edited February 2019
  • Reply 49 of 50
    cgWerks said:
    fastasleep said:
    I didn't even know he was a Mac person to begin with. Funny. Most of the 3D people around me are on PC these days, but I feel like that's always kinda been the case. I bought a maxed out 15" MBP with Vega 20 to get as much mograph/3D work out of it as I can, with the idea that I can expand a bit with eGPU. I'm not going to Windows for any reason, ever, unless I was building like a small headless render farm or something.
    Yeah, much of that industry was always on PC, but not necessarily with the best software or even the fastest hardware. I'm an old Electric Image Animation System user, which was Mac-only and pretty much the 3D app used by the industry's top folks. And I also remember that just before Apple switched to Intel, the G5 was smoking even custom PC builds on render speed.

    Also, 'best software' can be kind of subjective. Sometimes best means superior workgroup capability, or customizability, or compatibility, etc., instead of superior features or performance. I'm currently learning Autodesk Revit... and it's an incredibly deep app in terms of BIM, but I was doing more advanced 3D solids modeling back in the late 90s, on a Mac, with a much better UI and toolset.

    So, a lot of it depends on specific needs. I can't run Revit on macOS, so I'm in Boot Camp. There's a lot of PC-only software in the 3D/CAD area, especially the more specialized stuff. And these days, custom PCs can run circles around our Macs. If you're going to run a lot of PC software and can get a better PC, it would seem kind of silly to stick with the Mac. BUT, these things are only part of my workflow, so I need to maximize along more than just one vector, which keeps me on the Mac.

    BTW, I'm pretty happy with my eGPU so far (Blackmagic). And after installing Turbo Boost Switcher, I've tamed the i7 in my Mac mini as well... at least when I'm going for quiet. I can just barely hear them running once in a while. Unfortunately, if I turn Turbo Boost back on and do much of anything, the mini gets noisy rather quickly (which defeats the purpose of a quiet eGPU like the Blackmagic). The only time I've really heard the Blackmagic is when the driver in Boot Camp wasn't installing correctly, and it would just randomly spin up to 1600 RPM until a reboot (normally it runs around 500 RPM). Then it sounded like I had a furnace or AC unit sitting on my desk. :smiley:

    re: render farm - I know some of the people I follow use Amazon or Google cloud computing now instead of building in-house render farms. If you haven't considered that, you might want to look into it. They don't have to have money tied up in hardware, and can scale their needs up or down rather quickly.
    Yeah, there's nothing Windows-only per se that I'm doing, though if I wanted to start using something like Octane I'd be kind of screwed without Nvidia cards, which, I'll venture a guess, is at least one factor in why Greyscale Gorilla and others moved to PC.

    Also, you can use a local render farm to speed up Octane realtime rendering (not just for sending off jobs), or have a couple of headless renderers for After Effects or any number of other apps that can use distributed rendering without necessarily uploading to a cloud service. But yes, for heavier 3D render jobs, cloud rendering is obviously the way to go for anything larger or not realtime.
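    Not any particular renderer's API, but the frame-splitting idea behind a local farm looks roughly like this minimal sketch, where render_frame() and the node names are hypothetical stand-ins for a real network submission:

    ```python
    # Minimal sketch of distributed rendering: split a frame range across
    # a few render nodes. render_frame() is a hypothetical stand-in for
    # whatever a real farm's network submission actually does.
    from concurrent.futures import ThreadPoolExecutor

    NODES = ["node-a", "node-b", "node-c"]  # hypothetical headless boxes

    def render_frame(node: str, frame: int) -> str:
        # A real farm would ship the scene and frame number to `node`
        # and wait for the rendered image to come back.
        return f"{node} rendered frame {frame:04d}"

    frames = range(1, 25)
    with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
        # Round-robin frames across nodes; a real queue would load-balance.
        results = pool.map(
            lambda f: render_frame(NODES[f % len(NODES)], f), frames
        )
        for line in results:
            print(line)
    ```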
  • Reply 50 of 50
    cgWerks Posts: 2,952, member
    fastasleep said:
    Yeah, there's nothing Windows-only per se that I'm doing, though if I wanted to start using something like Octane I'd be kind of screwed without Nvidia cards, which, I'll venture a guess, is at least one factor in why Greyscale Gorilla and others moved to PC.

    Also, you can use a local render farm to speed up Octane realtime rendering (not just for sending off jobs), or have a couple of headless renderers for After Effects or any number of other apps that can use distributed rendering without necessarily uploading to a cloud service. But yes, for heavier 3D render jobs, cloud rendering is obviously the way to go for anything larger or not realtime.
    Oh, that's cool. I'm not doing anything that uses realtime render-farm stuff. Although, EIAS's Renderama (network render farm) will speed up non-realtime preview renders just like bigger render jobs... so I've used that in the past. In fact, before GPUs got so powerful, that was a really important part of the workflow.