Frequently asked questions about the 2018 Mac mini: RAM, storage, and more


Comments

  • Reply 121 of 136
    Santiagero Posts: 3, unconfirmed member
    I am surprised that one of the commonly asked questions is not about the use case for a home theater. I can only assume that the pokey Intel graphics can handle 4K content, but what about 8K content? CES 2019 showed a bunch of 8K TVs coming this spring. Of course, I am looking to the future, when I will upgrade my current 4K set to 8K. Will I be in the same boat with my Mac mini as I am now? And the sound capability? For years we audiophiles have had to buy HD players such as Channel D's Pure Music to get true audiophile sound out. Has Apple completely forgotten about sound in its computers? Will we always have to use an external DAC or software to get high-res sound out of our Macs? The audiophile community wants to know.
  • Reply 122 of 136
    Mike Wuerthele Posts: 6,861, administrator
    Santiagero said:
    I am surprised that one of the commonly asked questions is not about the use case for a home theater. I can only assume that the pokey Intel graphics can handle 4K content, but what about 8K content? [...]
    It does 4K fine. It doesn't do 8K at all, but you'll find not much does on the computing side at the moment. It's fantastic that there are 8K TVs coming, but content is still a long way away.
  • Reply 123 of 136
    cgWerks Posts: 2,952, member
    Mike Wuerthele said:
    It does 4K fine. It doesn't do 8K at all, but you'll find not much does on the computing side at the moment. It's fantastic that there are 8K TVs coming, but content is still a long way away.
    Wouldn't you have to have a wall-sized TV for 8K to matter much? At some point, it seems like we're just doing spec jumps for spec's (and new TV sales') sake... with the downside of using up our computer performance gains trying to drive them (and bandwidth too).

    Also, just an update on my above comments about the Mac mini and noise... the little Turbo Boost Switcher app, which makes it easy to turn off Turbo Boost, takes care of that for the most part. So I have the performance when I need it, but most of the time I just turn it off so I don't have to be bothered by fans. :)
  • Reply 124 of 136
    bkkcanuck
    cgWerks said:
    Wouldn't you have to have a wall-sized TV for 8K to matter much? [...]
    You have to separate motion video from text/graphics display when you talk about whether resolution matters much.

    With motion video, the resolution in itself is not going to give you much more, because you are not focusing on anything for very long. Lossless representation of the colours and intensity is more important than resolution as you move up when you are talking about motion video. The difference is noticeable in that case in that a more lossless representation looks less blurry than the one before. So an 8K TV is not necessarily going to give you that much more on resolution alone, but improvements in resolution and things like HDR often go hand in hand as you move forward, so the resolution jump from 4K to 8K by itself would not make much difference. In the early adopter phase, though, there is a massive cost difference... but we "norms" who are not early adopters benefit from the early adopters funding the improvements. We have not yet reached the point where you can hang a TV on the wall and not be able to tell whether you are looking through a window or at a video. The end goal, I would think, will have been achieved once we reach that point. Of course, once we reach that point, the difference between premium office space with a great view and lesser office space would be worthless... a nightmare for major downtown tower offices :wink:

    With text and photos, resolution is important. Yes, it may be difficult to see the individual dots as they are displayed... but you don't look at dots, you look at the effect of them in displaying text and graphics. The dot representation of text and graphics is a static grid of dots, while fonts and graphics tend to follow curves. Higher resolution comes through in crisper text and better curves within pictures. If resolution were not important we would have stopped at 300 dpi laser printers -- but we did not, because resolution improvements gave better typesetting and pictures in print. A good laser printer is more likely to print at 1200 or 2400 dpi, which at the higher end is more than what you will get on an 8K display (see the quick pixel-density check below).

    The technology is basically the same... just the use cases differ for monitors vs. TVs.
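
    A quick pixel-density calculation makes that print-vs-display comparison concrete. This is a minimal sketch; the 27-inch and 65-inch diagonals are illustrative assumptions, not figures from the thread:

    import math

    def ppi(h_px, v_px, diagonal_in):
        # Pixels per inch along a display's diagonal.
        return math.hypot(h_px, v_px) / diagonal_in

    # Illustrative sizes: a 27-inch desktop monitor, a 65-inch living-room TV.
    print(f"8K at 27 in: {ppi(7680, 4320, 27):.0f} ppi")  # ~326 ppi
    print(f"8K at 65 in: {ppi(7680, 4320, 65):.0f} ppi")  # ~136 ppi
    print(f"4K at 65 in: {ppi(3840, 2160, 65):.0f} ppi")  # ~68 ppi
    # Even 8K at desktop sizes sits well below 1200-2400 dpi laser output.
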
    edited January 2019
  • Reply 125 of 136
    cgWerks Posts: 2,952, member
    bkkcanuck said:
    You have to separate motion video from text/graphics display when you talk about whether resolution matters much. [...]
    Yeah, I suppose I agree in an end-game way of driving advancement. I guess I was just thinking more practically: I can't tell the difference between 1080p and 4K content on our 55" 4K TV when sitting on the couch (in a probably typical living room arrangement). Maybe I could tell, though, through the other end effects my 4K TV has because of the whole mix... good point.

    But, also, as Mike referred to: how do we transfer this stuff? I suppose it will all work out in the end, but it seems for all our higher-quality this and that, we're often going backwards in terms of the quality of the actual content. For example, would a Blu-ray on my TV look better than streamed 4K that is compressed to make up for the relatively slow internet connections we're using? Or, now that we've got all these 'Retina' displays, would we be better off (at least in some cases) using the GPU power it takes to drive them to do other things?

    Sometimes I wonder about the tradeoffs, and whether some of this is pushing technology and specs just for the sake of numbers. Even in the printer example, are those printers really printing at 2400 dpi, or is it closer to 300-600 with some kind of tricks to pretend it's 2400? I'm not sure 300 would be optimal once we're talking about printing a photo or maybe even text, but I'm not sure 2400 is necessary either. And there has to be some point where going another jump wouldn't bring any meaningful improvement.
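
    To put rough numbers on the Blu-ray vs. streaming question, here is a back-of-the-envelope bits-per-pixel comparison. The bitrates are commonly cited ballpark figures, assumptions rather than measurements:

    # Rough bits-per-pixel budgets at 24 fps; more bits per pixel generally
    # means fewer compression artifacts. Bitrates are ballpark assumptions.
    FPS = 24
    sources = [
        ("1080p Blu-ray, ~35 Mbps", 1920 * 1080, 35e6),
        ("4K stream,     ~16 Mbps", 3840 * 2160, 16e6),
        ("UHD Blu-ray,   ~80 Mbps", 3840 * 2160, 80e6),
    ]
    for name, pixels, bitrate in sources:
        print(f"{name}: {bitrate / (pixels * FPS):.2f} bits/pixel")
    # ~0.70 for 1080p Blu-ray vs ~0.08 for a 4K stream: the stream is far
    # more aggressively compressed, which is the "going backwards" effect.
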
  • Reply 126 of 136
    cgWerks said:
    [...] it seems for all our higher-quality this and that, we're often going backwards in terms of the quality of the actual content.
    Bingo. In my living room I have a large, high-resolution view of severe compression artifacts. Delivery constraints have a really serious adverse effect on achieving the "looking through a window" objective. I believe we would be better served by using the increased bandwidth 8K requires to reduce compression and/or increase frame rate instead.

    That probably wouldn't be as popular, though. Consumers don't seem to WANT their TV to look like a window. Forums and blogs are full of complaints about the "4K soap opera effect." I assume what they're objecting to is the smoother motion provided by higher frame rates. People don't like it. Just ask Peter Jackson about the reaction to The Hobbit.
  • Reply 127 of 136
    bkkcanuck
    Bingo. In my living room I have a large, high-resolution view of severe compression artifacts. Delivery constraints have a really serious adverse effect on achieving the "looking through a window" objective. [...]
    Increasing transmission or delivery quality is lagging, yes, but that does not mean it always will be. Total internet bandwidth will likely grow close to 500% over the next 5 years. Transmission locally can increase much more quickly (if you have fibre) if the demand is there [with the exception of the US, which is unfortunately in poor shape for future growth due to monopolies - and not regulating them as such]. Because of congestion on the Asia-Pacific route, a lot of large video content is cached locally by ISPs, so that it goes across the Pacific once and then everyone watching it gets a local copy.

    Yes, there is a soap opera effect with high frame rate content (not necessarily colour or intensity related), but then it is up to the producers of content to make artistic choices - independent of the display technology. Just because a display can handle 120 frames/second does not mean the producer has to use it - they can choose 24 frames if they want. That same high frame rate, though, is useful for those that game, for VR (small-display versions of high resolution), etc. There are many in the video production industry who, for artistic reasons, will continue to produce 24 frame per second content. The technology built into the TV to handle this through things like motion blur (globally applied), though, is despised by those same people because it changes what the artist chose. Your TV should display exactly what the artist has chosen - not make choices for you.

    The soap opera effect is a result of content -- not of your TV display. 
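
    For scale, a rough uncompressed-data-rate calculation shows why delivery bandwidth and compression dominate this discussion. A minimal sketch; 10-bit colour and 60 fps are assumed:

    # Uncompressed video data rates, showing why delivery always involves
    # heavy compression. Assumes 10 bits per colour channel (30 bits/pixel).
    def raw_gbps(h_px, v_px, fps=60, bits_per_pixel=30):
        return h_px * v_px * fps * bits_per_pixel / 1e9

    print(f"4K60 10-bit: {raw_gbps(3840, 2160):.1f} Gbps")  # ~14.9 Gbps
    print(f"8K60 10-bit: {raw_gbps(7680, 4320):.1f} Gbps")  # ~59.7 Gbps
    # Against a ~16 Mbps streaming budget, 4K60 needs roughly 900:1 compression.
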
    edited January 2019
  • Reply 128 of 136
    @bkkcanuck
    The soap opera effect is a result of content -- not of your TV display.
    I bet most of those complaining have TVs with those fake frame-rate settings like "TruMotion" that do artificial interpolation to apply a faster frame rate after the fact. Which, of course, looks like garbage to me. I see it all the time on bar TVs.
  • Reply 129 of 136
    So there’s no way you can tether two or four Mac minis together on the cheap, buying base models without RAM upgrades, to double and then double again their CPU and RAM processing power as one complete unit, built up modularly over time?

    This is why I got my hopes up, as this can be done with computer building over a longer period of time for those wanting to start out small with the base Mac mini, then adding more to the stack using Ethernet or TB3 to boost CPU and RAM power without breaking the piggy bank. The system would do what Nvidia and AMD do with an SLI or CrossFire bridge, but using Ethernet or TB3 cables to boost CPU and RAM externally. These things are not cheap; to me, this could have replaced the old entry-level 21-inch iMac with that 1080p display no one wants. The computer would be one, but its hardware would be independent, and it could be better protected from viruses, as an infected CPU could be cut off from the others in the module stack.  :/
  • Reply 130 of 136
    Why, oh why is this low-end and poorly specced PC so bloody expensive? Apple needs to produce a "Welcome to Apple Mac". Surely Apple can stomach a loss-leader to get more people on board.
  • Reply 131 of 136
    tht Posts: 5,437, member
    So there’s no way you can tether two or four Mac minis together on the cheap, buying base models without RAM upgrades, to double and then double again their CPU and RAM processing power as one complete unit, built up modularly over time? [...]
    I don’t understand what you are saying here.

    For CPU problems, Apple is promoting that you can stack a few Mac mini machines together and use them as a desktop cluster. For GPU problems, you can get an eGPU. The Mac minis are 1.4 inches tall; 12 of them will be about as tall as a typical ATX motherboard PC box. You’ll need a 12-port router that isn’t going to be small, but it could probably be mounted under the desk, used as a monitor stand, or stacked vertically.

    If the T2’s video hardware is supported by a lot of software, this setup could have really good perf/watt.
  • Reply 132 of 136
    tht Posts: 5,437, member
    Why, oh why is this low-end and poorly specced PC so bloody expensive? Apple needs to produce a "Welcome to Apple Mac". Surely Apple can stomach a loss-leader to get more people on board.
    No. The price of the Mac mini is fine for what it is. There is a premium on it, but it is no more than it used to be.

    Apple could go lower by using slower NAND and a slower CPU, but that’s not their thing.
  • Reply 133 of 136
    cgWerks Posts: 2,952, member
    Why, oh why is this low-end and poorly specced PC so bloody expensive? Apple needs to produce a "Welcome to Apple Mac". Surely Apple can stomach a loss-leader to get more people on board.
    Because (unfortunately for the entry-level Mac people), it isn't a low-end PC/Mac. It's pretty friggin' powerful, and arguably about the best value for performance in the lineup now. It is solidly prosumer, and not entry-level at all. I agree it would be nice if Apple produced an entry-level machine. The low end of the laptops or the iMac probably better fits that now, though they are higher priced than the Mac mini of old.

    tht said:
    I don’t understand what you are saying here.

    For CPU problems, Apple is promoting that you can stack a few Mac mini machines together and use them as a desktop cluster. [...]
    I was thinking of responding, but didn't understand either. If they are looking for some kind of cluster processing, it will depend on the software used. Apple (or any platform?) doesn't really have a multi-computer distribution system that just breaks up any/all apps across machines.

    For example, I used to use a 3D rendering system by Electric Image, where the rendering engine could use as many computers as you could throw at it. I think some of the video processing software can distribute like that too. Or, database engines, etc. But, it doesn't 'just work' with everything.
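
    As a hypothetical illustration of that kind of throw-more-computers-at-it rendering: the host names and the render_frame.sh script below are made up for the sketch (not Electric Image's actual tooling), and real render farms also handle scheduling, failures, and asset distribution.

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical stack of Mac minis reachable over SSH; render_frame.sh
    # stands in for whatever per-frame render command an engine provides.
    HOSTS = ["mini1.local", "mini2.local", "mini3.local"]
    FRAMES = range(1, 241)  # frames 1..240

    def render(frame):
        host = HOSTS[frame % len(HOSTS)]  # naive round-robin assignment
        subprocess.run(["ssh", host, "./render_frame.sh", str(frame)], check=True)
        return frame

    # One worker per host; each frame is independent ("embarrassingly
    # parallel"), which is why render farms scale and general apps don't.
    with ThreadPoolExecutor(max_workers=len(HOSTS)) as pool:
        for done in pool.map(render, FRAMES):
            print(f"frame {done} rendered")
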

    The T2 is awesome for me so far. It wasn't something I cared about when buying it. I even turned off the encryption and such, as it interfered with Boot Camp (so, then I thought the T2 was just baggage.) But, the video encoding performance and low power use is stellar.

    tht said:
    Apple could go lower by using slower NAND and a slower CPU, but that’s not their thing.
    Yeah, I wish Apple would make a low to mid-level 'box' Mac with some entry level parts and expandability. I'm just not sure why they are so incredibly resistant to doing that. It would sell like crazy, I'd think. But, the mini is the next best thing (or the iMac if you need a display, all-in-one, etc.).
  • Reply 134 of 136
    tht Posts: 5,437, member
    cgWerks said:
    I was thinking of responding, but didn't understand either. If they are looking for some kind of cluster processing, it will depend on the software used. Apple (or any platform?) doesn't really have a multi-computer distribution system that just breaks up any/all apps across machines. [...]
    I’m not sure if there are any easily deployable packages for macOS. For Linux, Unix, or whatever Unix derivative, there certainly are, and I imagine anything that runs on Linux or Unix clusters could be ported to macOS or BSD anyway.

    Yes, it is highly workflow-specific. I don’t want to say “application”. There are certain classes of embarrassingly parallel problems that clustering works perfectly fine for. Video rendering is one of them, but there is a whole world of numerical simulation that the Apple media don’t really touch, where compute performance demands are basically infinite. I’m not sure what the exact tradeoff is between having it done on your desk versus having it done on compute clusters in a server room, but there’s always a niche of cases where it does make sense.

    Video work may be one of them - anything with large and mobile storage needs, where trying to get TBs of data onto a server could be cumbersome.

    cgWerks said:
    tht said:
    Apple could go lower by using slower NAND and a slower CPU, but that’s not their thing.
    Yeah, I wish Apple would make a low to mid-level 'box' Mac with some entry level parts and expandability. [...]
    The biggest impediment for me is that they won’t go to slower, denser NAND. I’d gladly give up 2 GB/s NAND performance for 400 MB/s NAND performance if it means I can get 4 TB of it, or a Fusion drive variant with fast and slow NAND.

    An Apple TV 4K with macOS and more storage and RAM for $300 to $400 would be quite the interesting device though. It’ll be as fast or faster than many of the trash bin parts that get sold at those price tiers.
  • Reply 135 of 136
    cgWerks Posts: 2,952, member
    tht said:
    The biggest impediment for me is that they won’t go to slower, denser NAND. I’d gladly give up 2 GB/s NAND performance for 400 MB/s NAND performance if it means I can get 4 TB of it, or a Fusion drive variant with fast and slow NAND.
    For sure, and I would think that would apply to most people. While I love how fast the storage in my Mac mini is, I'd gladly cut the speed in half for 2x more space. I just don't want a super-slow hard drive anymore (partly because of how poorly Finder/Spotlight/open/save/etc. handle it). 50 MB/s is a heck of a lot different than 2000 MB/s, but you're absolutely correct that 300-400 MB/s would be a dream for most people, and way less expensive. I think I just got an email ad today selling a 1 TB SSD for under $100.
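
    To put those speed tiers in perspective, a quick transfer-time calculation; the three tiers are the ones mentioned in this thread, and a flat 1 TB workload is an illustrative assumption:

    # Time to move 1 TB at the sustained speeds discussed above.
    TB = 1e12  # bytes
    tiers = [("hard drive", 50), ("slow NAND", 400), ("2018 mini SSD", 2000)]
    for label, mb_per_s in tiers:
        hours = TB / (mb_per_s * 1e6) / 3600
        print(f"{label:>13} @ {mb_per_s:>4} MB/s: {hours:.2f} h")
    # ~5.6 h at 50 MB/s, ~0.7 h at 400 MB/s, ~8 minutes at 2 GB/s.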