Apple debuts new $5999 Mac Pro with up to 28-core Xeon processors


Comments

  • Reply 361 of 420
    cgWerkscgWerks Posts: 2,952member
    melgross said:
    The iMac is pretty quiet. We have two high-end models here at home. They are silent. I don’t get where you’re talking about noise. I’ve had louder monitors. The iMac Pro has no cooling problems as far as I’m aware of. I know people and studios with them. They are very happy. No problems. The iMac Pro has been selling very well.

    I don’t understand why Apple didn’t upgrade that machine. Newer chips have greater performance for the same power draw and heat. They could have. They should have. 18 months ago, or so, they did lower the pricing. But it’s too old, and I haven’t recommended it for years.
    The primary issue I have with the iMac is lack of video input (so it's a single-use display). But heat is an issue for all but the iMac Pro and Mac Pro, if you want a silent office and to run the machine hard. So, I guess it depends on what one means by heat being an issue (though my long-term experiments with heat haven't gone well in terms of reliability, either).

    The problem is if you run stuff like rendering, or encoding video, or things like Folding@home that keep the CPUs at a sustained use level, the noise goes up considerably. I haven't owned an iMac since my late-2012 model (i5, quad-core... base model, btw!), which I loved... but it could get a bit noisy, and everything I've read about the newer ones is that the problem has gotten worse. MacBook Pros... crazy noisy. Even my mini can get noisy if I don't disable Turbo Boost and take care to run the CPUs at lower levels (by limiting apps' access to cores).

    Yeah, as much as I'd have liked a cylinder MP, I just couldn't justify the cost. They did drop the cost some, but not nearly enough. A couple of the used prices for the base model were looking attractive, until the mini got updated.
  • Reply 362 of 420
    DuhSesameDuhSesame Posts: 1,278member
    cgWerks said:
    melgross said:
    The iMac is pretty quiet. We have two high-end models here at home. They are silent. I don’t get where you’re talking about noise. I’ve had louder monitors. The iMac Pro has no cooling problems as far as I’m aware of. I know people and studios with them. They are very happy. No problems. The iMac Pro has been selling very well.

    I don’t understand why Apple didn’t upgrade that machine. Newer chips have greater performance for the same power draw and heat. They could have. They should have. 18 months ago, or so, they did lower the pricing. But it’s too old, and I haven’t recommended it for years.
    The primary issue I have with the iMac is lack of video input (so it's a single-use display). But heat is an issue for all but the iMac Pro and Mac Pro, if you want a silent office and to run the machine hard. So, I guess it depends on what one means by heat being an issue (though my long-term experiments with heat haven't gone well in terms of reliability, either).

    The problem is if you run stuff like rendering, or encoding video, or things like Folding@home that keep the CPUs at a sustained use level, the noise goes up considerably. I haven't owned an iMac since my late-2012 model (i5, quad-core... base model, btw!), which I loved... but it could get a bit noisy, and everything I've read about the newer ones is that the problem has gotten worse. MacBook Pros... crazy noisy. Even my mini can get noisy if I don't disable Turbo Boost and take care to run the CPUs at lower levels (by limiting apps' access to cores).

    Yeah, as much as I'd have liked a cylinder MP, I just couldn't justify the cost. They did drop the cost some, but not nearly enough. A couple of the used prices for the base model were looking attractive, until the mini got updated.
    Perhaps you should download an app that measures decibels and see how loud the noise actually is.  Generally, anything under 40dB is acceptable, and anything at 29dB or lower would be nearly impossible to hear.
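[Editor's aside: the dB figures traded in this thread are logarithmic ratios. A minimal Python sketch of the conversion a sound-meter app performs, using illustrative, uncalibrated pressure values (a real app must calibrate raw microphone samples to pascals):]

```python
import math

# dB SPL is 20*log10(p / p_ref), where p_ref = 20 micropascals is the
# conventional threshold of human hearing. Pressure values below are
# illustrative only, not measurements.
P_REF = 20e-6  # reference sound pressure, pascals

def spl_db(rms_pressure_pa: float) -> float:
    """Convert an RMS sound pressure (in pascals) to dB SPL."""
    return 20 * math.log10(rms_pressure_pa / P_REF)

print(round(spl_db(0.002)))   # 40 -- roughly "refrigerator hum" level
print(round(spl_db(20e-6)))   # 0  -- the hearing threshold itself
```

Because the scale is logarithmic, 40dB carries 100 times the sound pressure of 20dB, which is why "under 40dB" and "inaudible" are very different claims.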
  • Reply 363 of 420
    melgrossmelgross Posts: 33,508member
    cgWerks said:
    melgross said:
    The iMac is pretty quiet. We have two high-end models here at home. They are silent. I don’t get where you’re talking about noise. I’ve had louder monitors. The iMac Pro has no cooling problems as far as I’m aware of. I know people and studios with them. They are very happy. No problems. The iMac Pro has been selling very well.

    I don’t understand why Apple didn’t upgrade that machine. Newer chips have greater performance for the same power draw and heat. They could have. They should have. 18 months ago, or so, they did lower the pricing. But it’s too old, and I haven’t recommended it for years.
    The primary issue I have with the iMac is lack of video input (so it's a single-use display). But heat is an issue for all but the iMac Pro and Mac Pro, if you want a silent office and to run the machine hard. So, I guess it depends on what one means by heat being an issue (though my long-term experiments with heat haven't gone well in terms of reliability, either).

    The problem is if you run stuff like rendering, or encoding video, or things like Folding@home that keep the CPUs at a sustained use level, the noise goes up considerably. I haven't owned an iMac since my late-2012 model (i5, quad-core... base model, btw!), which I loved... but it could get a bit noisy, and everything I've read about the newer ones is that the problem has gotten worse. MacBook Pros... crazy noisy. Even my mini can get noisy if I don't disable Turbo Boost and take care to run the CPUs at lower levels (by limiting apps' access to cores).

    Yeah, as much as I'd have liked a cylinder MP, I just couldn't justify the cost. They did drop the cost some, but not nearly enough. A couple of the used prices for the base model were looking attractive, until the mini got updated.
    Ok, they’re silent 90% of the time. When pushing them hard, the noise goes up, but not more than a number of other machines I’ve used. Heat itself doesn’t seem to be a problem. I haven’t spoken to anyone who has had a heat-related problem. Maybe older machines did. Once in a while I vacuum the vents, top and bottom. My 2009 and 2012 MPs only make noise when starting up, when they’re testing the fan control.
  • Reply 364 of 420
    cgWerkscgWerks Posts: 2,952member
    DuhSesame said:
    Perhaps you should download an app that measures decibels and see how loud the noise actually is.  Generally, anything under 40dB is acceptable, and anything at 29dB or lower would be nearly impossible to hear.
    Hmm, well maybe I just have good ears (or that scale is off!). My Blackmagic is rated at 18dB and I can hear it, a bit. It isn't an annoying kind of noise, but a soft whoosh kind of sound. But the fans in the mini, or Apple's laptops, are quite annoying (though not quite as annoying as the fan in my wife's work laptop when she brings it home).

    BTW, you're probably talking about computer-industry acceptability, but 0dB is the point at which a healthy human ear can hear something. 40dB is like a refrigerator running, which I find annoying (but not as annoying as little, shrill computer fans). :) But, aside from being annoying, such sounds show up on condenser mic recordings. (We now live in the city, so noise is relative. The 'noise floor' is higher here, so I don't hear fans as much until they are wound up more. When we lived in our previous home, it was pretty quiet overall.)

    melgross said:
    Ok, they’re silent 90% of the time. When pushing them hard, the noise goes up, but not more than a number of other machines I’ve used. Heat itself doesn’t seem to be a problem. I haven’t spoken to anyone who has had a heat-related problem. Maybe older machines did. Once in a while I vacuum the vents, top and bottom. My 2009 and 2012 MPs only make noise when starting up, when they’re testing the fan control.
    Yeah, I imagine in general day-to-day work (office-type stuff) you'd not hear them. But it doesn't take much to make my mini 'sing'... just start encoding a video for YouTube, or run Minecraft for a bit, etc. I also like to run distributed computing stuff in the background (like Folding@home) when I'm not doing stuff that needs the resources of the computer. With a Mac Pro, or iMac Pro, I assume I could just do any of that stuff and not hear the machine.

    re: heat damage - maybe, but I'm not sure I'm willing to risk it. I had 3 MBPs die prematurely in the 2000s due to doing rendering jobs and such on them. Maybe that was more a factor of the solder used at that point in time (though even my 2007 MBP 'died' early), or the GPU issues, etc. But, I'm sure the heat didn't do them any good.
    edited June 2019
  • Reply 365 of 420
    DuhSesameDuhSesame Posts: 1,278member
    cgWerks said:
    DuhSesame said:
    Perhaps you should download an app that measures decibels and see how loud the noise actually is.  Generally, anything under 40dB is acceptable, and anything at 29dB or lower would be nearly impossible to hear.
    Hmm, well maybe I just have good ears (or that scale is off!). My Blackmagic is rated at 18dB and I can hear it, a bit. It isn't an annoying kind of noise, but a soft whoosh kind of sound. But the fans in the mini, or Apple's laptops, are quite annoying (though not quite as annoying as the fan in my wife's work laptop when she brings it home).

    BTW, you're probably talking about computer-industry acceptability, but 0dB is the point at which a healthy human ear can hear something. 40dB is like a refrigerator running, which I find annoying (but not as annoying as little, shrill computer fans). :) But, aside from being annoying, such sounds show up on condenser mic recordings. (We now live in the city, so noise is relative. The 'noise floor' is higher here, so I don't hear fans as much until they are wound up more. When we lived in our previous home, it was pretty quiet overall.)
    It also depends on the distance between yourself and the computer.  Some fans have an annoying pitch that's easily distinguishable, regardless of how low their ratings are.  30dB is about where ambient noise sits, from what I remember.
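[Editor's aside: the distance effect can be roughly quantified. For an idealized point source in a free field, level falls about 6dB per doubling of distance; a small sketch, ignoring room reflections and the ambient noise floor:]

```python
import math

def level_at_distance(db_at_1m: float, distance_m: float) -> float:
    """Free-field point-source attenuation: L = L1 - 20*log10(d / 1 m).
    A simplification: real rooms add reflections and an ambient floor."""
    return db_at_1m - 20 * math.log10(distance_m)

# A fan rated 30 dB at 1 m gets about 6 dB quieter per doubling of distance.
print(round(level_at_distance(30, 2)))  # 24
print(round(level_at_distance(30, 4)))  # 18
```

So moving a machine from the desk to the floor across the room really can take a fan from noticeable to lost in the ambient noise.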
  • Reply 366 of 420
    melgrossmelgross Posts: 33,508member
    cgWerks said:
    DuhSesame said:
    Perhaps you should download an app that measures decibels and see how loud the noise actually is.  Generally, anything under 40dB is acceptable, and anything at 29dB or lower would be nearly impossible to hear.
    Hmm, well maybe I just have good ears (or that scale is off!). My Blackmagic is rated at 18dB and I can hear it, a bit. It isn't an annoying kind of noise, but a soft whoosh kind of sound. But the fans in the mini, or Apple's laptops, are quite annoying (though not quite as annoying as the fan in my wife's work laptop when she brings it home).

    BTW, you're probably talking about computer-industry acceptability, but 0dB is the point at which a healthy human ear can hear something. 40dB is like a refrigerator running, which I find annoying (but not as annoying as little, shrill computer fans). :) But, aside from being annoying, such sounds show up on condenser mic recordings. (We now live in the city, so noise is relative. The 'noise floor' is higher here, so I don't hear fans as much until they are wound up more. When we lived in our previous home, it was pretty quiet overall.)

    melgross said:
    Ok, they’re silent 90% of the time. When pushing them hard, the noise goes up, but not more than a number of other machines I’ve used. Heat itself doesn’t seem to be a problem. I haven’t spoken to anyone who has had a heat-related problem. Maybe older machines did. Once in a while I vacuum the vents, top and bottom. My 2009 and 2012 MPs only make noise when starting up, when they’re testing the fan control.
    Yeah, I imagine in general day-to-day work (office-type stuff) you'd not hear them. But it doesn't take much to make my mini 'sing'... just start encoding a video for YouTube, or run Minecraft for a bit, etc. I also like to run distributed computing stuff in the background (like Folding@home) when I'm not doing stuff that needs the resources of the computer. With a Mac Pro, or iMac Pro, I assume I could just do any of that stuff and not hear the machine.

    re: heat damage - maybe, but I'm not sure I'm willing to risk it. I had 3 MBPs die prematurely in the 2000s due to doing rendering jobs and such on them. Maybe that was more a factor of the solder used at that point in time (though even my 2007 MBP 'died' early), or the GPU issues, etc. But, I'm sure the heat didn't do them any good.
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
  • Reply 367 of 420
    cgWerkscgWerks Posts: 2,952member
    melgross said:
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
    Yeah, they were Nvidia GPUs, I'm pretty sure. If that's the reason, then I'm glad I don't hold a grudge like Apple, and I'd better be careful about getting on their wrong side. ;)

    re: rendering - That makes the 'Pro' moniker a bit difficult, though. So, what can pros do on Apple's pro laptops? I can't imagine people plugging them in at their desks and doing video work and encoding is any easier on them. They should be designed with enough margin to stay safe, and/or have a cooling system adequate to the task. Otherwise, I guess the package should say, "For professional work.*" (*so long as you don't run them too hard while doing so.)

    Anyway, the point to me is that Apple only has 2 computers that are designed to be run hard, as the rest of the lineup are kind of just bigger versions of the MBP (iMac, etc.), with similar thermal issues. So, it would be nice to have a more prosumer level Mac, designed to do heavier work.
  • Reply 368 of 420
    DuhSesameDuhSesame Posts: 1,278member
    cgWerks said:
    melgross said:
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
    Yeah, they were Nvidia GPUs, I'm pretty sure. If that's the reason, then I'm glad I don't hold a grudge like Apple, and I'd better be careful about getting on their wrong side. ;)

    re: rendering - That makes the 'Pro' moniker a bit difficult, though. So, what can pros do on Apple's pro laptops? I can't imagine people plugging them in at their desks and doing video work and encoding is any easier on them. They should be designed with enough margin to stay safe, and/or have a cooling system adequate to the task. Otherwise, I guess the package should say, "For professional work.*" (*so long as you don't run them too hard while doing so.)

    Anyway, the point to me is that Apple only has 2 computers that are designed to be run hard, as the rest of the lineup are kind of just bigger versions of the MBP (iMac, etc.), with similar thermal issues. So, it would be nice to have a more prosumer level Mac, designed to do heavier work.
    Swapping processors would eliminate most of the issues; a large iPad Pro with the A12X doesn’t throttle its CPU.
  • Reply 369 of 420
    melgrossmelgross Posts: 33,508member
    cgWerks said:
    melgross said:
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
    Yeah, they were Nvidia GPUs, I'm pretty sure. If that's the reason, then I'm glad I don't hold a grudge like Apple, and I'd better be careful about getting on their wrong side. ;)

    re: rendering - That makes the 'Pro' moniker a bit difficult, though. So, what can pros do on Apple's pro laptops? I can't imagine people plugging them in at their desks and doing video work and encoding is any easier on them. They should be designed with enough margin to stay safe, and/or have a cooling system adequate to the task. Otherwise, I guess the package should say, "For professional work.*" (*so long as you don't run them too hard while doing so.)

    Anyway, the point to me is that Apple only has 2 computers that are designed to be run hard, as the rest of the lineup are kind of just bigger versions of the MBP (iMac, etc.), with similar thermal issues. So, it would be nice to have a more prosumer level Mac, designed to do heavier work.
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to edit with proxies to build our edit control sequences, which we would then transfer to the “big” studio machines, where the actual files would be rendered.

    A laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and has a small screen, not always of proper color quality, compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new mini, because I haven’t used one, but from what I read it’s pretty good, if connected to a good external GPU.
    edited June 2019
  • Reply 370 of 420
    DuhSesameDuhSesame Posts: 1,278member
    melgross said:
    cgWerks said:
    melgross said:
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
    Yeah, they were Nvidia GPUs, I'm pretty sure. If that's the reason, then I'm glad I don't hold a grudge like Apple, and I'd better be careful about getting on their wrong side. ;)

    re: rendering - That makes the 'Pro' moniker a bit difficult, though. So, what can pros do on Apple's pro laptops? I can't imagine people plugging them in at their desks and doing video work and encoding is any easier on them. They should be designed with enough margin to stay safe, and/or have a cooling system adequate to the task. Otherwise, I guess the package should say, "For professional work.*" (*so long as you don't run them too hard while doing so.)

    Anyway, the point to me is that Apple only has 2 computers that are designed to be run hard, as the rest of the lineup are kind of just bigger versions of the MBP (iMac, etc.), with similar thermal issues. So, it would be nice to have a more prosumer level Mac, designed to do heavier work.
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to edit with proxies to build our edit control sequences, which we would then transfer to the “big” studio machines, where the actual files would be rendered.

    A laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and has a small screen, not always of proper color quality, compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new mini, because I haven’t used one, but from what I read it’s pretty good, if connected to a good external GPU.
    While I agree with some of your statements, I don’t think that’s the case.  Laptops do run hot, even the gaming variants, but saying they can’t handle it is a different story.  I have old laptops that level off at 70–80°C, or even 55°C when running Linpack (without AVX, of course), and some reviews of the 2016 MacBook Pros don’t show signs of CPU throttling.  Running hot doesn’t mean it can’t be stable.

    As for the display, while limited by size, a good laptop can still have one that’s better than most other desktop monitors.
  • Reply 371 of 420
    melgrossmelgross Posts: 33,508member
    DuhSesame said:
    melgross said:
    cgWerks said:
    melgross said:
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
    Yeah, they were Nvidia GPUs, I'm pretty sure. If that's the reason, then I'm glad I don't hold a grudge like Apple, and I'd better be careful about getting on their wrong side. ;)

    re: rendering - That makes the 'Pro' moniker a bit difficult, though. So, what can pros do on Apple's pro laptops? I can't imagine people plugging them in at their desks and doing video work and encoding is any easier on them. They should be designed with enough margin to stay safe, and/or have a cooling system adequate to the task. Otherwise, I guess the package should say, "For professional work.*" (*so long as you don't run them too hard while doing so.)

    Anyway, the point to me is that Apple only has 2 computers that are designed to be run hard, as the rest of the lineup are kind of just bigger versions of the MBP (iMac, etc.), with similar thermal issues. So, it would be nice to have a more prosumer level Mac, designed to do heavier work.
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to edit with proxies to build our edit control sequences, which we would then transfer to the “big” studio machines, where the actual files would be rendered.

    A laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and has a small screen, not always of proper color quality, compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new mini, because I haven’t used one, but from what I read it’s pretty good, if connected to a good external GPU.
    While I agree with some of your statements, I don’t think that’s the case.  Laptops do run hot, even the gaming variants, but saying they can’t handle it is a different story.  I have old laptops that level off at 70–80°C, or even 55°C when running Linpack (without AVX, of course), and some reviews of the 2016 MacBook Pros don’t show signs of CPU throttling.  Running hot doesn’t mean it can’t be stable.

    As for the display, while limited by size, a good laptop can still have one that’s better than most other desktop monitors.
    I’m not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
  • Reply 372 of 420
    cgWerkscgWerks Posts: 2,952member
    DuhSesame said:
    Swapping processors would eliminate most of the issues; a large iPad Pro with the A12X doesn’t throttle its CPU.
    Yes, it will certainly be interesting to see how something like a desktop A-series performs when that comes along. I'm sure it will use more power (and require more cooling) than an iPad Pro, though probably not as much as current Intel CPUs.

    melgross said:
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to edit with proxies to build our edit control sequences, which we would then transfer to the “big” studio machines, where the actual files would be rendered.

    A laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and has a small screen, not always of proper color quality, compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new mini, because I haven’t used one, but from what I read it’s pretty good, if connected to a good external GPU.
    Fair points, but aren't these laptops being used like that far more today than they were a decade ago? This whole idea of bringing your MBP to work/home and plugging into monitors and docks, storage arrays, even GPUs and then doing fairly demanding work, seems almost the primary use-case. While I'm sure most of them aren't running them 24x7, it seems like it wouldn't be that odd for them to run heavy several hours each day in chunks of time (ie: encoding a video, compiling an app, etc.).

    I'm not sure about an iMac, as it has been too long now, and I had the 21" model. I could run it harder than I can run my mini, though, before the fans really kicked in hard, I think. That said, I absolutely love my mini, aside from its thermal characteristic (which I have mostly under control by disabling Turbo Boost). It's just sad that I have this nice piece of hardware (the Blackmagic eGPU which is quite silent), and then the mini goes and ruins that. :)

    But, I'm not sure I could justify spending $6k (more like $8k in Canada) to have an effectively similar machine (for my tasks and workflow), but quiet. So, I wish there were something in-between.

    DuhSesame said:
    ... some reviews of the 2016 MacBook Pros doesn’t show the signs of CPU throttling.  Running hot doesn’t mean it can’t be stable.
    Oh yeah, I'm sure they are stable... at least until they aren't. ;)  My mini runs fine at 93–95°C, 24x7. The question is more whether, and what, damage that heat is doing over the long term (to components, boards, fans, etc.).

    My past MBPs ran really well too, until a fan would die (in one case), or the GPU went wonky (Nvidia, but one repaired by Apple, the other one not), or they became unstable (I think one of them is still in use by the guy I sold it to, but it has to be on a laptop riser with a little fan blowing under it, otherwise it freezes. But, runs great with that setup - kind of a desktop-laptop. He's using it for school work, and I sold it really cheap.)

    melgross said:
    I’m not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
    I guess I'm saying there should be a difference between 'better designed for' or 'more optimal', and 'don't do it much, if at all'. It's one thing if you do too much gaming or encode too many YouTube videos with your MacBook Air and it gets damaged. It is another thing, at least IMO, if it is a MacBook Pro. Sure, if its primary purpose is to render stuff, I'll come out ahead with the 'server' version or desktop anyway, as, if anything, it will do it faster. But a pro laptop should be able to do it without damaging itself. It should just appropriately throttle down to keep the temps safe (not just for the immediate moment, but with longevity in consideration).
  • Reply 373 of 420
    DuhSesameDuhSesame Posts: 1,278member
    melgross said:
    DuhSesame said:
    melgross said:
    cgWerks said:
    melgross said:
    Those GPU issues were the fault of Nvidia. Perhaps that’s one reason Apple soured on them. Generally, rendering more than the dailies on a laptop isn’t recommended.
    Yeah, they were Nvidia GPUs, I'm pretty sure. If that's the reason, then I'm glad I don't hold a grudge like Apple, and I'd better be careful about getting on their wrong side. ;)

    re: rendering - That makes the 'Pro' moniker a bit difficult, though. So, what can pros do on Apple's pro laptops? I can't imagine people plugging them in at their desks and doing video work and encoding is any easier on them. They should be designed with enough margin to stay safe, and/or have a cooling system adequate to the task. Otherwise, I guess the package should say, "For professional work.*" (*so long as you don't run them too hard while doing so.)

    Anyway, the point to me is that Apple only has 2 computers that are designed to be run hard, as the rest of the lineup are kind of just bigger versions of the MBP (iMac, etc.), with similar thermal issues. So, it would be nice to have a more prosumer level Mac, designed to do heavier work.
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to edit with proxies to build our edit control sequences, which we would then transfer to the “big” studio machines, where the actual files would be rendered.

    A laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and has a small screen, not always of proper color quality, compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new mini, because I haven’t used one, but from what I read it’s pretty good, if connected to a good external GPU.
    While I agree with some of your statements, I don’t think that’s the case.  Laptops do run hot, even the gaming variants, but saying they can’t handle it is a different story.  I have old laptops that level off at 70–80°C, or even 55°C when running Linpack (without AVX, of course), and some reviews of the 2016 MacBook Pros don’t show signs of CPU throttling.  Running hot doesn’t mean it can’t be stable.

    As for the display, while limited by size, a good laptop can still have one that’s better than most other desktop monitors.
    I’m not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
    True, laptops are replacements for mainstream desktops, not Xeon workstations.  Maybe Eurocom's machines are exceptions, but...
  • Reply 374 of 420
    melgrossmelgross Posts: 33,508member
    cgWerks said:
    melgross said:
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to edit with proxies to build our edit control sequences, which we would then transfer to the “big” studio machines, where the actual files would be rendered.

    A laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and has a small screen, not always of proper color quality, compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new mini, because I haven’t used one, but from what I read it’s pretty good, if connected to a good external GPU.
    Fair points, but aren't these laptops being used like that far more today than they were a decade ago? This whole idea of bringing your MBP to work/home and plugging into monitors and docks, storage arrays, even GPUs and then doing fairly demanding work, seems almost the primary use-case. While I'm sure most of them aren't running them 24x7, it seems like it wouldn't be that odd for them to run heavy several hours each day in chunks of time (ie: encoding a video, compiling an app, etc.).

    I'm not sure about an iMac, as it has been too long now, and I had the 21" model. I could run it harder than I can run my mini, though, before the fans really kicked in hard, I think. That said, I absolutely love my mini, aside from its thermal characteristic (which I have mostly under control by disabling Turbo Boost). It's just sad that I have this nice piece of hardware (the Blackmagic eGPU which is quite silent), and then the mini goes and ruins that. :)

    But, I'm not sure I could justify spending $6k (more like $8k in Canada), to have an effectively similar machine (for my tasks and workflow), but quiet. So, I wish there were something in-between.
    melgross said:
    Im not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
    I guess I'm saying there should be a difference between 'better designed for' or 'more optimal' and 'don't do it much, if at all'. It's one thing if you do too much gaming or encode too many YouTube videos with your MacBook Air, and it gets damaged. It is another thing, at least IMO, if it is a MacBook Pro. Sure, if its primary purpose is to render stuff, I'll come out ahead with the 'server' version or desktop anyway, as if anything, it will do it faster. But, a pro laptop should be able to do it without damaging itself. It should just appropriately throttle down to keep the temps safe (not just for the immediate time, but with longevity in consideration).
    When I say heavy rendering, I’m not talking about 1080p standard fare for TV work. These days, a MacBook Pro should be able to handle that, if it’s plugged into power. But heavy means 4K or, for major motion pictures, up to 8K. No laptop is presently designed to do that for hours at a clip. I guess in several years. But I’m including Apple’s new MacBook Pro with 8 cores. That will do it much better—for a time. But still, a laptop just doesn’t have the thermal capacity to do that for a long-term project. That’s why proxies are used. That’s what this new Mac Pro is designed for. If you watched the demo in the WWDC segment, you’d have seen what really heavy workloads are. It’s something we could only dream about.

    it won’t damage itself. But yes, it will throttle down after some time, and then it’s just slow. But take the new Mac Pro and load it with 16 or more cores, and a whopping amount of RAM and a dual graphics card, or two, and you’ll plow through it at ten times the speed. Time is money.

    back when, we would render a 23-minute TV show over a weekend using two machines. Now my iPad Pro can render it in real time, depending on how many effects are applied. But that’s 1080p; the old machines were working on 720 x 485. So think on that. I can even do 4K. Amazing. But not for too long, as the machine starts to get hot.

    but now render 8K, four times as many pixels as 4K. Add color control, lighting compensation, black level, some effects, and bang, you’re down to 1 frame every two seconds, if at all. The newest MacBook Pro with 8 cores will do it, just slowly.
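    (A quick back-of-envelope check on the resolution math above; a rough Python sketch using the common UHD frame sizes, with 720 x 485 as the old SD frame mentioned earlier:)

```python
# Pixel counts for the frame sizes discussed in this thread.
RESOLUTIONS = {
    "SD (720x485)": (720, 485),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

def pixels(name):
    """Total pixels per frame for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

# 8K pushes 4x the pixels of 4K, and 16x the pixels of 1080p,
# which is why render times balloon so quickly.
ratio_8k_to_4k = pixels("8K UHD") / pixels("4K UHD")          # 4.0
ratio_8k_to_1080p = pixels("8K UHD") / pixels("1080p")        # 16.0
ratio_1080p_to_sd = pixels("1080p") / pixels("SD (720x485)")  # roughly 6x
```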

    look, it’s a matter of the level you’re working at. If you don’t need it, don’t buy it. But equipment is a capital purchase. It’s depreciated, at least. If it can’t pay for itself in 3 to 6 months, at most, then it’s not worth it. But if it can, you can make a lot more money. And it’s all about the money, and the ROI. For high end work, this will pay for itself in a high config in a short period. I’m not sure the lowest config is worth it though, and I was considering that, to start. But now, after reading a lot about it, and going to some pro channels on YouTube, I think that 12 cores is the best place for me.
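    (The pay-for-itself arithmetic is simple enough to sketch; the function and the dollar figures below are purely hypothetical, just to illustrate the 3-to-6-month test:)

```python
def payback_months(purchase_price, monthly_net_benefit):
    """Months until a capital purchase pays for itself.

    Deliberately naive: ignores depreciation schedules, tax
    treatment, and financing; it just divides cost by the extra
    money the machine brings in (or saves) per month.
    """
    if monthly_net_benefit <= 0:
        raise ValueError("machine never pays for itself at this rate")
    return purchase_price / monthly_net_benefit

# Hypothetical example: a $12,000 configuration that nets an extra
# $3,000/month in billable time pays for itself in 4 months,
# inside the 3-to-6-month window described above.
months = payback_months(12_000, 3_000)  # 4.0
```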

    and, by the way, my situation is different from most, considering I’m retired. Fortunately, I can do this. I still have work to put through this that makes it worthwhile, time-wise, even though what I do these days isn’t paid.
    edited June 2019
  • Reply 375 of 420
    DuhSesame Posts: 1,278
    cgWerks said:
    DuhSesame said:
    Swapping processors will eliminate most of the issues; a large iPad Pro with the A12X doesn’t throttle its CPU.
    Yes, it will certainly be interesting to see how something like a desktop A-series performs when that comes along. I'm sure it will use more power (and require more cooling) than an iPad Pro, but probably not as much as current Intel CPUs.

    melgross said:
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to run proxies and edit for our edit control sequences, which we would then transfer to the “big” studio machines where the actual files would be rendered.

    a laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and a small screen not always of proper color quality when compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new Mini, because I haven’t used one, but from what I read, it’s pretty good, if connected to a good external GPU.
    Fair points, but aren't these laptops being used like that far more today than they were a decade ago? This whole idea of bringing your MBP to work/home and plugging into monitors and docks, storage arrays, even GPUs and then doing fairly demanding work, seems almost the primary use-case. While I'm sure most of them aren't running them 24x7, it seems like it wouldn't be that odd for them to run heavy several hours each day in chunks of time (ie: encoding a video, compiling an app, etc.).

    I'm not sure about an iMac, as it has been too long now, and I had the 21" model. I could run it harder than I can run my mini, though, before the fans really kicked in hard, I think. That said, I absolutely love my mini, aside from its thermal characteristic (which I have mostly under control by disabling Turbo Boost). It's just sad that I have this nice piece of hardware (the Blackmagic eGPU which is quite silent), and then the mini goes and ruins that. :)

    But, I'm not sure I could justify spending $6k (more like $8k in Canada), to have an effectively similar machine (for my tasks and workflow), but quiet. So, I wish there were something in-between.

    DuhSesame said:
    ... some reviews of the 2016 MacBook Pros don’t show signs of CPU throttling.  Running hot doesn’t mean it can’t be stable.
    Oh yeah, I'm sure they are stable... at least until they aren't. ;)  My mini runs fine at 93-95C 24x7. The question is more if and what damage is being done on the long-term by that heat (components, boards, fans, etc.).

    My past MBPs ran really well too, until a fan would die (in one case), or the GPU went wonky (Nvidia, but one repaired by Apple, the other one not), or they became unstable (I think one of them is still in use by the guy I sold it to, but it has to be on a laptop riser with a little fan blowing under it, otherwise it freezes. But, runs great with that setup - kind of a desktop-laptop. He's using it for school work, and I sold it really cheap.)

    melgross said:
    Im not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
    I guess I'm saying there should be a difference between 'better designed for' or 'more optimal' and 'don't do it much, if at all'. It's one thing if you do too much gaming or encode too many YouTube videos with your MacBook Air, and it gets damaged. It is another thing, at least IMO, if it is a MacBook Pro. Sure, if its primary purpose is to render stuff, I'll come out ahead with the 'server' version or desktop anyway, as if anything, it will do it faster. But, a pro laptop should be able to do it without damaging itself. It should just appropriately throttle down to keep the temps safe (not just for the immediate time, but with longevity in consideration).
    Well, usually other things happen before the chip dies from heat, unless the chip itself has flaws.
    Your thermal paste will dry, the heat sink will fill with dust, or your mechanical hard drive will become unstable, but your chip could survive more than a decade even with that much heat.  On top of that, thermal throttling would kick in to prevent high temperatures for a while.

    Speaking of the A12X, it’s comparable to Ultrabook i7s in terms of performance, but the latter often require twice or triple the power.  Most ultrabooks do end up throttling, so...
    edited June 2019
  • Reply 376 of 420
    DuhSesame Posts: 1,278
    melgross said:
    cgWerks said:
    melgross said:
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to run proxies and edit for our edit control sequences, which we would then transfer to the “big” studio machines where the actual files would be rendered.

    a laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and a small screen not always of proper color quality when compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new Mini, because I haven’t used one, but from what I read, it’s pretty good, if connected to a good external GPU.
    Fair points, but aren't these laptops being used like that far more today than they were a decade ago? This whole idea of bringing your MBP to work/home and plugging into monitors and docks, storage arrays, even GPUs and then doing fairly demanding work, seems almost the primary use-case. While I'm sure most of them aren't running them 24x7, it seems like it wouldn't be that odd for them to run heavy several hours each day in chunks of time (ie: encoding a video, compiling an app, etc.).

    I'm not sure about an iMac, as it has been too long now, and I had the 21" model. I could run it harder than I can run my mini, though, before the fans really kicked in hard, I think. That said, I absolutely love my mini, aside from its thermal characteristic (which I have mostly under control by disabling Turbo Boost). It's just sad that I have this nice piece of hardware (the Blackmagic eGPU which is quite silent), and then the mini goes and ruins that. :)

    But, I'm not sure I could justify spending $6k (more like $8k in Canada), to have an effectively similar machine (for my tasks and workflow), but quiet. So, I wish there were something in-between.
    melgross said:
    Im not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
    I guess I'm saying there should be a difference between 'better designed for' or 'more optimal' and 'don't do it much, if at all'. It's one thing if you do too much gaming or encode too many YouTube videos with your MacBook Air, and it gets damaged. It is another thing, at least IMO, if it is a MacBook Pro. Sure, if its primary purpose is to render stuff, I'll come out ahead with the 'server' version or desktop anyway, as if anything, it will do it faster. But, a pro laptop should be able to do it without damaging itself. It should just appropriately throttle down to keep the temps safe (not just for the immediate time, but with longevity in consideration).
    When I say heavy rendering, I’m not talking about 1080p standard fare for TV work. These days, a MacBook Pro should be able to handle that, if it’s plugged into power. But heavy means 4K or, for major motion pictures, up to 8K. No laptop is presently designed to do that for hours at a clip. I guess in several years. But I’m including Apple’s new MacBook Pro with 8 cores. That will do it much better—for a time. But still, a laptop just doesn’t have the thermal capacity to do that for a long-term project. That’s why proxies are used. That’s what this new Mac Pro is designed for. If you watched the demo in the WWDC segment, you’d have seen what really heavy workloads are. It’s something we could only dream about.

    it won’t damage itself. But yes, it will throttle down after some time, and then it’s just slow. But take the new Mac Pro and load it with 16 or more cores, and a whopping amount of RAM and a dual graphics card, or two, and you’ll plow through it at ten times the speed. Time is money.

    back when, we would render a 23-minute TV show over a weekend using two machines. Now my iPad Pro can render it in real time, depending on how many effects are applied. But that’s 1080p; the old machines were working on 720 x 485. So think on that. I can even do 4K. Amazing. But not for too long, as the machine starts to get hot.

    but now render 8K, four times as many pixels as 4K. Add color control, lighting compensation, black level, some effects, and bang, you’re down to 1 frame every two seconds, if at all. The newest MacBook Pro with 8 cores will do it, just slowly.

    look, it’s a matter of the level you’re working at. If you don’t need it, don’t buy it. But equipment is a capital purchase. It’s depreciated, at least. If it can’t pay for itself in 3 to 6 months, at most, then it’s not worth it. But if it can, you can make a lot more money. And it’s all about the money, and the ROI. For high end work, this will pay for itself in a high config in a short period. I’m not sure the lowest config is worth it though, and I was considering that, to start. But now, after reading a lot about it, and going to some pro channels on YouTube, I think that 12 cores is the best place for me.

    and, by the way, my situation is different from most, considering I’m retired. Fortunately, I can do this. I still have work to put through this that makes it worthwhile, time-wise, even though what I do these days isn’t paid.
    It would be foolish to compare laptops to 28-core workstations, but that’s workstations.  For people who cut YouTube videos, write software, do not-so-serious picture editing, or use productivity software like 2D-level CAD or design tools, a pro laptop should be adequate.

    Simply put, if desktop processors like the i5/i7/i9 (non-HEDT) can handle it, so should pro laptops.
    edited June 2019
  • Reply 377 of 420
    cgWerks Posts: 2,952
    melgross said:
    it won’t damage itself. But yes, it will throttle down after some time, and then it’s just slow. But take the new Mac Pro and load it with 16 or more cores, and a whopping amount of RAM and a dual graphics card, or two, and you’ll plow through it at ten times the speed. Time is money.
    I guess that's what I'm questioning/arguing. Will it damage itself? I get the rest of the stuff you're saying. I just think the thermal design should be such that it protects the hardware, both near and long-term. It seems they're tuning them more to hit performance specs, screw the safe margins, and god forbid it hamper the slim designs.

    And, certainly, you'll want an MP for the heavy-duty stuff. When I was doing that stuff, I needed a laptop mostly, and unfortunately did too much heavy stuff on it (and also used some desktops too... I just included the laptop in my 'render farm', and/or got a bit too interested in folding@home, etc.)

    melgross said:
    look, it’s a matter of the level you’re working at. If you don’t need it, don’t buy it. But equipment is a capital purchase. It’s depreciated, at least. If it can’t pay for itself in 3 to 6 months, at most, then it’s not worth it.
    Yes, I certainly get this. I was doing it more at a prosumer level (and as I mentioned above, playing with folding@home). I needed the laptop anyway for the work I did (it had to be portable), but was also doing rendering work for some projects on the side. I just didn't expect to run into that many issues from running the machine hard.

    I think my concern these days is that you really don't need to do much at all to get the machine sweating hard. I'm talking the kind of stuff even a typical consumer might do, let alone crunching 5k and 8k video. The fans start ramping up after working about 1.5 of my 6 cores, for crying out loud (2018 Mac mini i7). With Turbo Boost off, I can get to about 40% CPU utilization w/o the fans going crazy (so that's where I run it most of the time).

    DuhSesame said:
    Well, usually other things happen before the chip dies from heat, unless the chip itself has flaws.
    Your thermal paste will dry, the heat sink will fill with dust, or your mechanical hard drive will become unstable, but your chip could survive more than a decade even with that much heat.  On top of that, thermal throttling would kick in to prevent high temperatures for a while.
    Yes, for sure. It was those other things that got damaged in my past experience. The CPU was fine, but that wasn't any more comforting.

    DuhSesame said:
    It would be foolish to compare laptops to 28-core workstations, but that’s workstations.  For people who cut YouTube videos, write software, do not-so-serious picture editing, or use productivity software like 2D-level CAD or design tools, a pro laptop should be adequate.

    Simply put, if desktop processors like the i5/i7/i9 (non-HEDT) can handle it, so should pro laptops.
    Yeah, I think that is my point. You shouldn't need a workstation that starts at $6k to do those kind of things. One would think *any* pro model (and even the non-pro, maybe) machines in a lineup could handle that stuff. And they can, but they run crazy hot. Maybe in 2019, that isn't as much an issue anymore as it was mid-2000s, but I'm kind of gun-shy now.

    But, more to the point... why can't Apple produce a prosumer i5/i7/i9 machine that CAN take that kind of workflow w/o thermal issues, noisy fans, etc.? Well, they certainly could. So, I guess the question is, why do they seem to refuse to do so?
  • Reply 378 of 420
    DuhSesame Posts: 1,278
    cgWerks said:
    melgross said:
    it won’t damage itself. But yes, it will throttle down after some time, and then it’s just slow. But take the new Mac Pro and load it with 16 or more cores, and a whopping amount of RAM and a dual graphics card, or two, and you’ll plow through it at ten times the speed. Time is money.
    I guess that's what I'm questioning/arguing. Will it damage itself? I get the rest of the stuff you're saying. I just think the thermal design should be such that it protects the hardware, both near and long-term. It seems they're tuning them more to hit performance specs, screw the safe margins, and god forbid it hamper the slim designs.

    And, certainly, you'll want an MP for the heavy-duty stuff. When I was doing that stuff, I needed a laptop mostly, and unfortunately did too much heavy stuff on it (and also used some desktops too... I just included the laptop in my 'render farm', and/or got a bit too interested in folding@home, etc.)

    melgross said:
    look, it’s a matter of the level you’re working at. If you don’t need it, don’t buy it. But equipment is a capital purchase. It’s depreciated, at least. If it can’t pay for itself in 3 to 6 months, at most, then it’s not worth it.
    Yes, I certainly get this. I was doing it more at a prosumer level (and as I mentioned above, playing with folding@home). I needed the laptop anyway for the work I did (it had to be portable), but was also doing rendering work for some projects on the side. I just didn't expect to run into that many issues from running the machine hard.

    I think my concern these days is that you really don't need to do much at all to get the machine sweating hard. I'm talking the kind of stuff even a typical consumer might do, let alone crunching 5k and 8k video. The fans start ramping up after working about 1.5 of my 6 cores, for crying out loud (2018 Mac mini i7). With Turbo Boost off, I can get to about 40% CPU utilization w/o the fans going crazy (so that's where I run it most of the time).

    DuhSesame said:
    Well, usually other things happen before the chip dies from heat, unless the chip itself has flaws.
    Your thermal paste will dry, the heat sink will fill with dust, or your mechanical hard drive will become unstable, but your chip could survive more than a decade even with that much heat.  On top of that, thermal throttling would kick in to prevent high temperatures for a while.
    Yes, for sure. It was those other things that got damaged in my past experience. The CPU was fine, but that wasn't any more comforting.

    DuhSesame said:
    It would be foolish to compare laptops to 28-core workstations, but that’s workstations.  For people who cut YouTube videos, write software, do not-so-serious picture editing, or use productivity software like 2D-level CAD or design tools, a pro laptop should be adequate.

    Simply put, if desktop processors like the i5/i7/i9 (non-HEDT) can handle it, so should pro laptops.
    Yeah, I think that is my point. You shouldn't need a workstation that starts at $6k to do those kind of things. One would think *any* pro model (and even the non-pro, maybe) machines in a lineup could handle that stuff. And they can, but they run crazy hot. Maybe in 2019, that isn't as much an issue anymore as it was mid-2000s, but I'm kind of gun-shy now.

    But, more to the point... why can't Apple produce a prosumer i5/i7/i9 machine that CAN take that kind of workflow w/o thermal issues, noisy fans, etc.? Well, they certainly could. So, I guess the question is, why do they seem to refuse to do so?
    Since I don’t know which model you have and what the problem is, it’s hard to say what caused the issue.  Generally speaking, only the CPU and GPU need to be properly cooled (include SSDs, because they throttle now too), while the rest of the chips and components can run on their own.  Both the Retina and Touch Bar models vent air from both sides, so there is airflow through the entire machine (as well as over the SSD), something that older models don’t do.  But of course, the i7/i9 is way too hot for the heat pipe.


    Here’s my thought: I think they need to draw some lines, like keeping the CPU within 70-80 degrees Celsius when stress tested, while making sure everything gets some airflow too, and then think about how slim it can actually be.

    70-80 in a stress test is pretty safe, because 99.9% of software isn’t going to push the CPU this hard, not even encoding.
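    (That 70-80 C band is basically a control target. Here's a toy sketch of the idea as a hypothetical clock-stepping governor; real firmware reads actual sensors and is far more sophisticated:)

```python
TARGET_C = 75.0   # midpoint of the 70-80 C band suggested above
STEP_MHZ = 100    # how far to move the clock per adjustment

def next_clock_mhz(temp_c, clock_mhz, min_mhz=800, max_mhz=4600):
    """Toy thermal governor: back the clock off above the target
    temperature, creep it back up below it, clamped to the chip's
    supported range."""
    if temp_c > TARGET_C:
        return max(min_mhz, clock_mhz - STEP_MHZ)
    return min(max_mhz, clock_mhz + STEP_MHZ)
```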

    As for the dust and thermal paste, that’s just the nature of active cooling and silicone compound.  You’ll open it up sometimes to clean it, unless you’re using a passively-cooled system, like iPads or the MacBook (which still needs re-pasting years later).
    edited June 2019
  • Reply 379 of 420
    melgross Posts: 33,508
    DuhSesame said:
    melgross said:
    cgWerks said:
    melgross said:
    It’s never been recommended, whether Mac or Wintel, to run heavy rendering on a laptop. Professional doesn’t always mean that you can do that. Laptops aren’t the same as workstations. They’re meant for convenience in the field, to get a look at the day’s shoots. So we used to run proxies and edit for our edit control sequences, which we would then transfer to the “big” studio machines where the actual files would be rendered.

    a laptop, no matter where it comes from, uses laptop chips (except for a couple of very heavy gaming machines), runs off a battery, has poor cooling, and a small screen not always of proper color quality when compared to a workstation designed for this very purpose. A 27” iMac is much better for this. I can’t speak to the new Mini, because I haven’t used one, but from what I read, it’s pretty good, if connected to a good external GPU.
    Fair points, but aren't these laptops being used like that far more today than they were a decade ago? This whole idea of bringing your MBP to work/home and plugging into monitors and docks, storage arrays, even GPUs and then doing fairly demanding work, seems almost the primary use-case. While I'm sure most of them aren't running them 24x7, it seems like it wouldn't be that odd for them to run heavy several hours each day in chunks of time (ie: encoding a video, compiling an app, etc.).

    I'm not sure about an iMac, as it has been too long now, and I had the 21" model. I could run it harder than I can run my mini, though, before the fans really kicked in hard, I think. That said, I absolutely love my mini, aside from its thermal characteristic (which I have mostly under control by disabling Turbo Boost). It's just sad that I have this nice piece of hardware (the Blackmagic eGPU which is quite silent), and then the mini goes and ruins that. :)

    But, I'm not sure I could justify spending $6k (more like $8k in Canada), to have an effectively similar machine (for my tasks and workflow), but quiet. So, I wish there were something in-between.
    melgross said:
    Im not saying that it can’t be stable. I’m saying that it’s not designed for heavy rendering tasks. And that’s true. If it were, no studio would be spending hundreds of thousands on workstations and monitors for this purpose.
    I guess I'm saying there should be a difference between 'better designed for' or 'more optimal' and 'don't do it much, if at all'. It's one thing if you do too much gaming or encode too many YouTube videos with your MacBook Air, and it gets damaged. It is another thing, at least IMO, if it is a MacBook Pro. Sure, if its primary purpose is to render stuff, I'll come out ahead with the 'server' version or desktop anyway, as if anything, it will do it faster. But, a pro laptop should be able to do it without damaging itself. It should just appropriately throttle down to keep the temps safe (not just for the immediate time, but with longevity in consideration).
    When I say heavy rendering, I’m not talking about 1080p standard fare for TV work. These days, a MacBook Pro should be able to handle that, if it’s plugged into power. But heavy means 4K or, for major motion pictures, up to 8K. No laptop is presently designed to do that for hours at a clip. I guess in several years. But I’m including Apple’s new MacBook Pro with 8 cores. That will do it much better—for a time. But still, a laptop just doesn’t have the thermal capacity to do that for a long-term project. That’s why proxies are used. That’s what this new Mac Pro is designed for. If you watched the demo in the WWDC segment, you’d have seen what really heavy workloads are. It’s something we could only dream about.

    it won’t damage itself. But yes, it will throttle down after some time, and then it’s just slow. But take the new Mac Pro and load it with 16 or more cores, and a whopping amount of RAM and a dual graphics card, or two, and you’ll plow through it at ten times the speed. Time is money.

    back when, we would render a 23-minute TV show over a weekend using two machines. Now my iPad Pro can render it in real time, depending on how many effects are applied. But that’s 1080p; the old machines were working on 720 x 485. So think on that. I can even do 4K. Amazing. But not for too long, as the machine starts to get hot.

    but now render 8K, four times as many pixels as 4K. Add color control, lighting compensation, black level, some effects, and bang, you’re down to 1 frame every two seconds, if at all. The newest MacBook Pro with 8 cores will do it, just slowly.

    look, it’s a matter of the level you’re working at. If you don’t need it, don’t buy it. But equipment is a capital purchase. It’s depreciated, at least. If it can’t pay for itself in 3 to 6 months, at most, then it’s not worth it. But if it can, you can make a lot more money. And it’s all about the money, and the ROI. For high end work, this will pay for itself in a high config in a short period. I’m not sure the lowest config is worth it though, and I was considering that, to start. But now, after reading a lot about it, and going to some pro channels on YouTube, I think that 12 cores is the best place for me.

    and, by the way, my situation is different from most, considering I’m retired. Fortunately, I can do this. I still have work to put through this that makes it worthwhile, time-wise, even though what I do these days isn’t paid.
    It would be foolish to compare laptops to 28-core workstations, but that’s workstations.  For people who cut YouTube videos, write software, do not-so-serious picture editing, or use productivity software like 2D-level CAD or design tools, a pro laptop should be adequate.

    Simply put, if desktop processors like the i5/i7/i9 (non-HEDT) can handle it, so should pro laptops.
    People who do YouTube videos will not have a problem. I thought I made it clear that standard 1080p isn’t a problem anymore. This workstation isn’t made for that. It’s made for real high-end work. MacBook Pros aren’t fine for that. Since these discussions are revolving around the new Mac Pro, this has to be brought into the conversation. Complaining that something isn’t up to the task because it gets hot, throttles, or is just otherwise slow simply says that it’s not the proper machine for the job, and complaining about it doesn’t change that. A different machine is then needed, and depending on the work involved, it can be a new model Mini, a decked-out 27” iMac, an iMac Pro, or the new Mac Pro.

    just remember that these laptop chips, while they may have the same top-level designations as the desktop models, are NOT the desktop models, and don’t perform as well.
  • Reply 380 of 420
    melgross Posts: 33,508
    cgWerks said:
    melgross said:
    it won’t damage itself. But yes, it will throttle down after some time, and then it’s just slow. But take the new Mac Pro and load it with 16 or more cores, and a whopping amount of RAM and a dual graphics card, or two, and you’ll plow through it at ten times the speed. Time is money.
    I guess that's what I'm questioning/arguing. Will it damage itself? I get the rest of the stuff you're saying. I just think the thermal design should be such that it protects the hardware, both near and long-term. It seems they're tuning them more to hit performance specs, screw the safe margins, and god forbid it hamper the slim designs.

    And, certainly, you'll want an MP for the heavy-duty stuff. When I was doing that stuff, I needed a laptop mostly, and unfortunately did too much heavy stuff on it (and also used some desktops too... I just included the laptop in my 'render farm', and/or got a bit too interested in folding@home, etc.)

    melgross said:
    look, it’s a matter of the level you’re working at. If you don’t need it, don’t buy it. But equipment is a capital purchase. It’s depreciated, at least. If it can’t pay for itself in 3 to 6 months, at most, then it’s not worth it.
    Yes, I certainly get this. I was doing it more at a prosumer level (and as I mentioned above, playing with folding@home). I needed the laptop anyway for the work I did (it had to be portable), but was also doing rendering work for some projects on the side. I just didn't expect to run into that many issues from running the machine hard.

    I think my concern these days is that you really don't need to do much at all to get the machine sweating hard. I'm talking the kind of stuff even a typical consumer might do, let alone crunching 5k and 8k video. The fans start ramping up after working about 1.5 of my 6 cores, for crying out loud (2018 Mac mini i7). With Turbo Boost off, I can get to about 40% CPU utilization w/o the fans going crazy (so that's where I run it most of the time).

    DuhSesame said:
    Well, usually other things happen before the chip dies from heat, unless the chip itself has flaws.
    Your thermal paste will dry, the heat sink will fill with dust, or your mechanical hard drive will become unstable, but your chip could survive more than a decade even with that much heat.  On top of that, thermal throttling would kick in to prevent high temperatures for a while.
    Yes, for sure. It was those other things that got damaged in my past experience. The CPU was fine, but that wasn't any more comforting.
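    For anyone curious, the throttling behavior described above is basically a feedback loop: shed clock speed while over a temperature limit, creep back up once things cool off. A toy illustration (all thresholds invented; the real SMC/firmware logic is far more involved):

    ```python
    def throttle_step(temp_c: float, freq_ghz: float,
                      t_limit: float = 100.0,  # invented junction limit (deg C)
                      f_min: float = 1.3,      # invented base-clock floor (GHz)
                      f_max: float = 4.6,      # invented turbo ceiling (GHz)
                      step: float = 0.2) -> float:
        """One step of a toy thermal-throttle loop: returns the new clock."""
        if temp_c >= t_limit:
            return max(f_min, freq_ghz - step)  # too hot: cut power draw
        return min(f_max, freq_ghz + step)      # cool enough: recover clock
    ```

    Run repeatedly, the clock oscillates around whatever speed the cooling can actually sustain, which is exactly the "thermal throttling kicks in" behavior.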

    DuhSesame said:
    It would be foolish to compare laptops against 28-core workstations, but that’s workstations.  For people who cut YouTube videos, write software, do not-so-serious picture editing, or use productivity software like 2D CAD or design tools, a pro laptop should be adequate.

    Simply put, if non-HEDT desktop processors like the i5/i7/i9 can handle it, pro laptops should be able to as well.
    Yeah, I think that is my point. You shouldn't need a workstation that starts at $6k to do those kinds of things. One would think *any* pro model (and maybe even the non-pro machines) in a lineup could handle that stuff. And they can, but they run crazy hot. Maybe in 2019 that isn't as much of an issue as it was in the mid-2000s, but I'm kind of gun-shy now.

    But, more to the point... why can't Apple produce a prosumer i5/i7/i9 machine that CAN take that kind of workload w/o thermal issues, noisy fans, etc.? They certainly could. So, I guess the question is, why do they seem to refuse to?
    Unless you’re outside, in the sun, on a hot day, there shouldn’t be a problem. Too many people complain when their laptops get hot. Laptops have always gotten hot, particularly when pushed hard for a while. The batteries get hot, the RAM gets hot, the CPU gets hot, the GPU gets hot. In the “old days”, the screen would get hot too. But again, laptop chips are not desktop chips. They don’t have the power, they don’t have the cache. And most of all, they don’t have the cooling. A laptop is NOT the equivalent of a desktop. No matter how much some people think a laptop can do everything a desktop can do, it can’t. Rendering is slower.

    The screen is smaller, and often not color calibrated. If you don’t need a calibrated screen, because you don’t care about or need proper color, that’s an example of why some people can use a laptop while others need a desktop. My calibration hardware costs half as much as a loaded MacBook Pro. Are you using something to do that? If so, at an amateur level, or a pro level? Can you properly edit outdoors? No, you can’t. Can you in a bright room? No, you can’t.

    Some people are doing work at levels that make me cringe, and I’m not the only one. But, by all means, think you’re doing pro work. That’s the general “you” I’m using; I’m not referring to you specifically. I understand where you’re coming from.