The high-end 27-inch iMac is more 'pro' than the iMac Pro is

Comments

  • Reply 21 of 39
    elijahg Posts: 2,634 member

    Let me know when it’s as easy to replace iMac components as the Mac Pro. That’ll never happen with this current case design.
    There's no reason it can't be almost as easy as the Mac Pro, but Apple has intentionally designed it to be difficult. There's no need to glue the screen on; they could use magnets as they did with the pre-2012 models, hell, the bezel is big enough. Even Apple's own techs don't like working on them: I had an issue with my 2012 one about 3 months in (noisy PSU), and the Apple Store just replaced the entire machine rather than go to the hassle of taking it to bits. Ridiculous. 
  • Reply 22 of 39
    elijahg Posts: 2,634 member

    Let me know when it’s as easy to replace iMac components as the Mac Pro. That’ll never happen with this current case design.
    And that is not its use case. Different machines for different use cases. I'm a pro user (software devs are Apple's largest pro market, per Craig) and have little need for a tower or swappable components. It simply does not matter to me. 

    Great, and what about Macs that need service? There's a lot more to it than just your specific use case, whether it be "pro" or not. That doesn't mean Apple has to intentionally make it hard to service. 
    edited August 2020
  • Reply 23 of 39
    canukstorm Posts: 2,590 member
    elijahg said:

    Let me know when it’s as easy to replace iMac components as the Mac Pro. That’ll never happen with this current case design.
    And that is not its use case. Different machines for different use cases. I'm a pro user (software devs are Apple's largest pro market, per Craig) and have little need for a tower or swappable components. It simply does not matter to me. 

    Great, and what about Macs that need service? There's a lot more to it than just your specific use case, whether it be "pro" or not. That doesn't mean Apple has to intentionally make it hard to service. 
    Apple's solution is to take it in to Apple Retail or an Apple Authorized Service Provider
  • Reply 24 of 39
    saarek Posts: 1,397 member
    The worst thing about the "Pro" macs is not the price, it's that the money is not well spent. Dollar for dollar, the AMD Threadripper CPUs are much better value and offer much better performance than similar Intel Xeon CPUs. Similarly, a top end NVIDIA GPU blows the doors off the AMD GPUs in a Mac Pro for tasks like AI or ray tracing. It's like Apple went for pro sounding components rather than pro performance components.
    What you say is not far from the truth. The Threadripper architecture is bitch-slapping the Intel Xeons left, right and centre, and Nvidia arguably makes more powerful GPUs.

    But Apple has a long-standing position of not using Nvidia GPUs. It's almost childish, actually, that Apple refuses to use them, but if Apple decides someone is out in the cold, then that's that.

    As to Threadripper, well, it is superior for now, but they have a contract with Intel, they are also preparing to dump Intel and move to Apple Silicon, and I suppose for their roadmap it's not worth the effort. Plus, we don't know what deal they signed with Intel; there could be a clause in there to only use Intel CPUs for a better price or something.
    edited August 2020
  • Reply 25 of 39
    DuhSesame Posts: 1,258 member
    The worst thing about the "Pro" macs is not the price, it's that the money is not well spent. Dollar for dollar, the AMD Threadripper CPUs are much better value and offer much better performance than similar Intel Xeon CPUs. Similarly, a top end NVIDIA GPU blows the doors off the AMD GPUs in a Mac Pro for tasks like AI or ray tracing. It's like Apple went for pro sounding components rather than pro performance components.
    Like what? As if a 28-core were in the bottom 1% of all computers?
    Some of you never get it: an organization or studio, at their level of affordability, isn't going to care about one specific price-to-performance ratio; they'll earn the cost back faster than they spend it. What's also important is stability, and this is where AMD still lags behind; your favorite LTT covered this topic.
    Xeon workstations are still sold by many OEMs, and that's not without reason.
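    The "earn it back" reasoning in the post above can be made concrete with a rough payback calculation. All figures below are hypothetical, chosen purely to illustrate the shape of the argument:

    ```python
    def payback_days(price_premium, hours_saved_per_day, billable_rate):
        """Working days until a pricier workstation pays for itself
        through recovered billable time."""
        return price_premium / (hours_saved_per_day * billable_rate)

    # Hypothetical: a $3,000 premium that saves 30 min/day
    # for someone billing $100/hour.
    days = payback_days(3000, 0.5, 100)  # -> 60.0 working days
    ```

    On numbers like these, chasing the last few percent of price-to-performance matters far less to a studio than whether the machine stays up.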
  • Reply 26 of 39
    elijahg Posts: 2,634 member
    elijahg said:

    Let me know when it’s as easy to replace iMac components as the Mac Pro. That’ll never happen with this current case design.
    And that is not its use case. Different machines for different use cases. I'm a pro user (software devs are Apple's largest pro market, per Craig) and have little need for a tower or swappable components. It simply does not matter to me. 

    Great, and what about Macs that need service? There's a lot more to it than just your specific use case, whether it be "pro" or not. That doesn't mean Apple has to intentionally make it hard to service. 
    Apple's solution is to take it in to Apple Retail or an Apple Authorized Service Provider
    I was referring to the difficulty for Apple's own techs; my prior post said that Apple replaced my entire 2012 iMac rather than take it apart to fix the PSU. And what about when it's past its feeble 1-year warranty?
  • Reply 27 of 39
    canukstorm Posts: 2,590 member
    saarek said:
    The worst thing about the "Pro" macs is not the price, it's that the money is not well spent. Dollar for dollar, the AMD Threadripper CPUs are much better value and offer much better performance than similar Intel Xeon CPUs. Similarly, a top end NVIDIA GPU blows the doors off the AMD GPUs in a Mac Pro for tasks like AI or ray tracing. It's like Apple went for pro sounding components rather than pro performance components.
    What you say is not far from the truth. The Threadripper architecture is bitch-slapping the Intel Xeons left, right and centre, and Nvidia arguably makes more powerful GPUs.

    But Apple has a long-standing position of not using Nvidia GPUs. It's almost childish, actually, that Apple refuses to use them, but if Apple decides someone is out in the cold, then that's that.

    As to Threadripper, well, it is superior for now, but they have a contract with Intel, they are also preparing to dump Intel and move to Apple Silicon, and I suppose for their roadmap it's not worth the effort. Plus, we don't know what deal they signed with Intel; there could be a clause in there to only use Intel CPUs for a better price or something.
    Once upon a time, Apple did use Nvidia GPUs. They were overheating to the point that laptops were exploding, and Nvidia failed to compensate Apple for the faulty GPUs. I would have taken the same action that Apple did if one of my vendors acted in a shitty manner.
  • Reply 28 of 39
    fred cintra
    davgreg said:
    It is pretty obvious by now that Apple is more of a lifestyle company that offers a computer line.

    There is a market above the Mac Mini and below the Mac Pro that wants a conventional tower and not an all-in-one that is glued shut.
    I agree that there is a market, and it is a high percentage of the AppleInsider audience. Hell, I want one.

    What it is not, is a large portion of the overall computing market.
    I wish Apple would revive the G4 Cube in the form of a mid-range tower with maybe 2 expansion slots, but that is never going to happen. 
  • Reply 29 of 39
    cgWerks Posts: 2,843 member
    RAM aside, the key question is really going to be how that processor compares to the iMac Pro's in real-world use, and especially under continuous heavy load. It's not known yet whether the new iMac has adopted the cooling system of the iMac Pro, and that becomes more and more important for high-end pro use.
    ...
    The iMac Pro has that "Pro" branding. The 27-inch iMac does not. But, today, you can get roughly equivalent power at the low end of the iMac Pro line with the high end of the 27-inch iMac.
    That's the thing, though. If it doesn't have the cooling, for a lot of pro use it doesn't matter if it matches or even exceeds the iMac Pro in performance.

    rcfa said:
    Anyone who claims one doesn’t need ECC RAM is as stupid as someone who says one doesn’t need a backup drive.

    Frankly, it's disgusting that there are computers at all, other than those embedded in toddlers' toys, that ship without ECC as standard.

    The only reason for this is that in the PC market, competition has for way too long mainly been over price, and thus penny-pinching on everything that's not obvious to users at first sight has become the norm.
    I do agree it is a bit silly that everything doesn't have ECC RAM, but most of my machines over several decades haven't had it and I'm doing OK. I suppose an occasional (once-per-year?) crash or lock-up could have been avoided? Maybe? Or is this a bit more like the spare-tire thing, where there are dozens of other issues that could leave you stranded even with a spare?
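    Whether a rare crash could plausibly be a bit flip can be ballparked with a quick calculation. This is a rough Python sketch: the FIT rate is an assumed, illustrative figure (published field studies vary by orders of magnitude), and most flips land in unused or non-critical memory, so visible failures are far rarer than raw flips.

    ```python
    # Back-of-the-envelope: expected DRAM bit flips per year on a non-ECC machine.
    # The FIT rate used below is an ASSUMED illustrative value, not a measurement.

    HOURS_PER_YEAR = 24 * 365

    def expected_flips_per_year(ram_gib, fit_per_mbit):
        """FIT = failures in time, i.e. errors per 10^9 device-hours,
        here given per megabit of DRAM."""
        mbits = ram_gib * 1024 * 8  # GiB -> Mbit
        return mbits * fit_per_mbit * HOURS_PER_YEAR / 1e9

    # Hypothetical 32 GiB machine at an assumed 100 FIT/Mbit:
    flips = expected_flips_per_year(32, 100)  # ~230 raw flips/year
    ```

    Even at that assumed rate, only the small fraction of flips hitting live, critical data would ever surface as a crash or corruption, which is consistent with seeing at most an occasional unexplained lock-up.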

    Mike Wuerthele said:
    I agree that there is a market, and it is a high percentage of the AppleInsider audience. Hell, I want one.
    What it is not, is a large portion of the overall computing market.
    Neither is the Mac Pro, or iMac Pro, or a number of other products Apple makes. Does Apple just decide to leave a chunk of the market out for some reason? Or, do they actually think they've got it covered by existing models? That's the real question, IMO.
  • Reply 30 of 39
    cgWerks Posts: 2,843 member
    saarek said:
    What you say is not far from the truth. The Threadripper architecture is bitch-slapping the Intel Xeons left, right and centre, and Nvidia arguably makes more powerful GPUs.
    Isn't that only true in certain benchmarks, though, or when using certain technologies like CUDA? For example, I saw a real-world rendering test with C4D not long ago where the Intel beat the Threadripper. And I don't use CUDA, so is Nvidia really that far ahead aside from that? (And maybe it was an overall deal, but Microsoft and Sony picked AMD for the GPUs in their new-generation consoles.)

    DuhSesame said:
    What's also important is stability, and this is where AMD still lags behind; your favorite LTT covered this topic.
    Yeah, though it was years ago, that was my issue when I tried AMD. I've wondered whether that has improved.
    While I'm sure it isn't universal, most of the conversation I run across about Threadrippers seems to be among computer-tweaker, speed-racer hobbyists, not pros.

    canukstorm said:
    Once upon a time, Apple did use Nvidia GPUs. They were overheating to the point that laptops were exploding, and Nvidia failed to compensate Apple for the faulty GPUs. I would have taken the same action that Apple did if one of my vendors acted in a shitty manner.
    Yep, I had a couple of those. One Apple fixed; the other ended up as a word processor for a college student, sitting on top of fans to keep it from freezing. But they've had trouble with AMD as well... did AMD do a better job of covering them? Either way, I kind of blame Apple more in either case, as they always skimp on cooling.

    fred cintra said:
    I wish Apple would revive the G4 Cube in the form of a mid-range tower with maybe 2 expansion slots, but that is never going to happen. 
    There are definitely people who want their slots and serviceability. I just want a machine big enough that it can have the higher-end consumer components, with adequate cooling to actually be used heavily. One wouldn't think that should be too much to ask!
  • Reply 31 of 39
    saarek said:
    The worst thing about the "Pro" macs is not the price, it's that the money is not well spent. Dollar for dollar, the AMD Threadripper CPUs are much better value and offer much better performance than similar Intel Xeon CPUs. Similarly, a top end NVIDIA GPU blows the doors off the AMD GPUs in a Mac Pro for tasks like AI or ray tracing. It's like Apple went for pro sounding components rather than pro performance components.
    What you say is not far from the truth. The Threadripper architecture is bitch-slapping the Intel Xeons left, right and centre, and Nvidia arguably makes more powerful GPUs.

    But Apple has a long-standing position of not using Nvidia GPUs. It's almost childish, actually, that Apple refuses to use them, but if Apple decides someone is out in the cold, then that's that.

    As to Threadripper, well, it is superior for now, but they have a contract with Intel, they are also preparing to dump Intel and move to Apple Silicon, and I suppose for their roadmap it's not worth the effort. Plus, we don't know what deal they signed with Intel; there could be a clause in there to only use Intel CPUs for a better price or something.
    Aren't Threadrippers relatively limited in how much RAM they can address compared to Xeons?
  • Reply 32 of 39
    DuhSesame Posts: 1,258 member
    cgWerks said:

    DuhSesame said:
    What's also important is stability, and this is where AMD still lags behind; your favorite LTT covered this topic.
    Yeah, though it was years ago, that was my issue when I tried AMD. I've wondered whether that has improved.
    While I'm sure it isn't universal, most of the conversation I run across about Threadrippers seems to be among computer-tweaker, speed-racer hobbyists, not pros.


    We're critiquing a workstation as if it were a custom-built PC; that's why they were unsatisfied. What matters is whether it fits their customers' needs, and those people value stability above anything else.

    If anyone is preparing to buy a Mac as a workstation, I'd assume one of the reasons would be the ecosystem; next to that, CUDA/Adobe would be secondary.

    Also, some people (probably not you) need to build their PCs. Having a custom-built workstation that's as silent as the Mac Pro is extremely hard to achieve.

    fred cintra said:
    I wish Apple would revive the G4 Cube in form of a mid range tower with maybe 2  expansion slots, but that is never going to happen 
    There are definitely the people who want their slots and serviceability. I just want a machine big enough that it can have the higher end consumer components, with adequate cooling to actually be used heavily. One wouldn't think that should be too much to ask!

    Who knows? With Apple Silicon, I think they could even have special products for their own niches; it's not impossible to mix an all-in-one with slots.

    Or, if they believe external solutions are the future for the consumer world (I have a buddy who thinks that), we'll most likely see dedicated eGPUs or the like.

    edited August 2020
  • Reply 33 of 39
    ivanh Posts: 597 member
    Still utterly terrible Bluetooth connectivity to wireless headphones! Hopeless!
  • Reply 34 of 39
    sflocal Posts: 5,993 member
    If the reviews come in good, I'm purchasing the new iMac. My only wish is that it had 4 TB3 ports, and an option for the Space Grey finish like the iMac Pro.

    Very excited.  This looks like a great machine.
  • Reply 35 of 39
    neilm Posts: 957 member
    I bought an iMac Pro for the office last year — 10-core, 64GB/1TB, $7500 or so. It's used as a video editing/processing workstation. We also have a bunch of well-optioned regular 27" iMacs.

    The reason to shell out the big bucks for the iMac Pro isn't just performance, it's endurance. The improved cooling system allows it to crank at full speed indefinitely, making it the tool of choice (other than the newer Mac Pro) for long duration jobs such as rendering, or video transcoding. The regular iMac doesn't have the thermal headroom and has to throttle back the processor clock speed for long jobs.

    Due to the unchanged RAM slot location, it seems obvious that the just-released 27" iMac (and there's one being delivered to our office today) cannot have adopted the iMac Pro's upgraded cooling system. Whether Apple has made any other cooling-system changes remains to be seen. I'm betting not, because otherwise they would have claimed improved cooling as a benefit, and they haven't.

    However, both the 10th-gen processors and the newer video cards should be more efficient, possibly helping with the thermal load. Offsetting that are the higher core counts.
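    The endurance point above can be sketched with a toy lumped thermal model in Python. Every constant here is made up for illustration; real throttling behavior is far more complex, but the qualitative shape holds: the same chip on a weaker cooler finishes short bursts fine yet throttles on long jobs.

    ```python
    def run(minutes, power_w, cooling_w_per_c, thermal_mass_j_per_c,
            ambient_c=25.0, throttle_c=95.0):
        """Toy model: heat in at power_w, heat out proportional to the
        temperature rise above ambient. Simulates 1-second steps and
        returns (final_temp_c, throttled)."""
        temp, throttled = ambient_c, False
        for _ in range(minutes * 60):
            net_w = power_w - cooling_w_per_c * (temp - ambient_c)
            temp += net_w / thermal_mass_j_per_c  # 1-second Euler step
            if temp >= throttle_c:
                throttled = True
                power_w *= 0.9  # crude stand-in for dropping the clocks
        return temp, throttled

    # Same 150 W chip, same 30-minute job; only the cooler differs
    # (all numbers hypothetical):
    _, strong_cooler_throttled = run(30, 150, 3.0, 400)   # settles ~75 C
    _, weak_cooler_throttled = run(30, 150, 1.5, 400)     # hits the limit
    ```

    A short benchmark finishes before either setup reaches steady state, which is why burst scores can look identical while long renders and transcodes diverge.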
    edited August 2020
  • Reply 36 of 39
    DuhSesame Posts: 1,258 member
    cgWerks said:
    saarek said:
    What you say is not far from the truth. The Threadripper architecture is bitch-slapping the Intel Xeons left, right and centre, and Nvidia arguably makes more powerful GPUs.
    Isn't that only true in certain benchmarks, though, or when using certain technologies like CUDA? For example, I saw a real-world rendering test with C4D not long ago where the Intel beat the Threadripper. And I don't use CUDA, so is Nvidia really that far ahead aside from that? (And maybe it was an overall deal, but Microsoft and Sony picked AMD for the GPUs in their new-generation consoles.)

    DuhSesame said:
    What's also important is stability, and this is where AMD still lags behind; your favorite LTT covered this topic.
    Yeah, though it was years ago, that was my issue when I tried AMD. I've wondered whether that has improved.
    While I'm sure it isn't universal, most of the conversation I run across about Threadrippers seems to be among computer-tweaker, speed-racer hobbyists, not pros.

    canukstorm said:
    Once upon a time, Apple did use Nvidia GPUs. They were overheating to the point that laptops were exploding, and Nvidia failed to compensate Apple for the faulty GPUs. I would have taken the same action that Apple did if one of my vendors acted in a shitty manner.
    Yep, I had a couple of those. One Apple fixed; the other ended up as a word processor for a college student, sitting on top of fans to keep it from freezing. But they've had trouble with AMD as well... did AMD do a better job of covering them? Either way, I kind of blame Apple more in either case, as they always skimp on cooling.

    fred cintra said:
    I wish Apple would revive the G4 Cube in the form of a mid-range tower with maybe 2 expansion slots, but that is never going to happen. 
    There are definitely people who want their slots and serviceability. I just want a machine big enough that it can have the higher-end consumer components, with adequate cooling to actually be used heavily. One wouldn't think that should be too much to ask!
    Just checked out what this 64-core Threadripper really is. I hadn't paid much attention beyond knowing it "beats Intel at an incredible price."
    Turns out it's more of a competitor to the 3175X than an "actual" workstation part.  

    512GiB of RAM support makes a key difference, as a bigger memory controller costs you a ton.

    https://en.wikichip.org/wiki/amd/ryzen_threadripper/3990x

    Also, notice how AMD doesn't make anything between 32 and 64 cores? This suggests its primary purpose is mostly as a show-piece for the platform, one that rarely sells (by comparison).

    The same can be said about their 16-core 3950X.

    https://www.amd.com/en/products/cpu/amd-ryzen-threadripper-3990x
    https://www.amd.com/en/products/cpu/amd-ryzen-9-3950x
  • Reply 37 of 39
    cgWerks Posts: 2,843 member
    DuhSesame said:
    Also, some people (probably not you) need to build their PCs. Having a custom-built workstation that's as silent as the Mac Pro is extremely hard to achieve.
    Yeah. I like saving money, but I'm just not at a point where building a PC is a major attraction. I did it for years, including servers back in the day (when companies went through a 'build your server out of custom parts' phase, which ended rather badly).

    You wouldn't think making them quiet should be so hard. You just need a bunch of fairly big fans that move slowly and are good quality. And, it helps to move the air vertically. But, other than that, there isn't anything too horribly special about the 2013 MP or my Blackmagic eGPU that I can figure out. It does take up a lot more space to go vertical, unless you design your own boards/parts, I guess.

    DuhSesame said:
    Who knows? With Apple Silicon, I think they could even have special products for their own niches; it's not impossible to mix an all-in-one with slots. Or, if they believe external solutions are the future for the consumer world (I have a buddy who thinks that), we'll most likely see dedicated eGPUs or the like.
    Yeah, I think I'm in that camp too. I just don't see Apple competing with Nvidia or AMD on GPUs they'll stick in most of their machines (if any). We'll probably be adding eGPUs or buying a Mac Pro. That said, there may be less of a reason to, for a lot of people who aren't doing stuff like 3D, CAD, or gaming. If Apple accelerates certain key functions like the T2 has done with video encoding, their 'iGPU' might be plenty for a lot more stuff than current iGPUs are.
  • Reply 38 of 39
    DuhSesame Posts: 1,258 member
    cgWerks said:
    DuhSesame said:
    Also, some people (probably not you) need to build their PCs. Having a custom-built workstation that's as silent as the Mac Pro is extremely hard to achieve.
    Yeah. I like saving money, but I'm just not at a point where building a PC is a major attraction. I did it for years, including servers back in the day (when companies went through a 'build your server out of custom parts' phase, which ended rather badly).

    You wouldn't think making them quiet should be so hard. You just need a bunch of fairly big fans that move slowly and are good quality. And, it helps to move the air vertically. But, other than that, there isn't anything too horribly special about the 2013 MP or my Blackmagic eGPU that I can figure out. It does take up a lot more space to go vertical, unless you design your own boards/parts, I guess.

    DuhSesame said:
    Who knows? With Apple Silicon, I think they could even have special products for their own niches; it's not impossible to mix an all-in-one with slots. Or, if they believe external solutions are the future for the consumer world (I have a buddy who thinks that), we'll most likely see dedicated eGPUs or the like.
    Yeah, I think I'm in that camp too. I just don't see Apple competing with Nvidia or AMD on GPUs they'll stick in most of their machines (if any). We'll probably be adding eGPUs or buying a Mac Pro. That said, there may be less of a reason to, for a lot of people who aren't doing stuff like 3D, CAD, or gaming. If Apple accelerates certain key functions like the T2 has done with video encoding, their 'iGPU' might be plenty for a lot more stuff than current iGPUs are.
    It's always the graphics card that makes the noise. You can put a water cooler on the CPU and be virtually silent (then a lot of maintenance work...), but cards are mostly limited by their narrow heatsinks. Throwing a massive heatsink on the card makes the biggest difference.

    That said, I believe the Mac Pro also has the best heatsink for the CPU.

    As for graphics, who knows what Apple is doing there, though I can see their SoC graphics taking off on laptops, giving decent performance with better cooling. Two chips sharing a heat pipe isn't as optimal as one chip with the whole cooling system to itself. Bigger fans as well.
    edited September 2020
  • Reply 39 of 39
    cgWerks Posts: 2,843 member
    DuhSesame said:

    It's always the graphics card that makes the noise. You can put a water cooler on the CPU and be virtually silent (then a lot of maintenance work...), but cards are mostly limited by their narrow heatsinks. Throwing a massive heatsink on the card makes the biggest difference. ...
    Good point, stock GPUs are a problem. I guess you'd need a re-layout of the system board, too. I suppose that explains the problem in the 'stock' hobbyist market.