2014 Mac mini Wishlist


Comments

  • Reply 161 of 1528

    Quote:

    Originally Posted by wizard69 View Post

    You and me both. Apple's current pricing scheme might be viable for a pro screen, but my god, really decent monitors can be had for $250 these days. Most Mini users can get by fine with one of those monitors.

    Even those numbers are high. They need a solution that is well under $350. Mind you, this is just a very basic monitor. Part of the problem with Apple's one solution is that the monitor isn't just a monitor; it is also a hub for USB and other ports.


    I was just trying not to name a price lower than Apple will ever actually charge. Let's face it: Apple's in its own class, and it can charge whatever it wants and most of us will still buy some type of its products. You get what you pay for - that's my rationalization for buying a $1500 computer when you can get one for $300. And it isn't just a rationalization either, it's true. Apple just makes the best products, nobody can really compete, and they can charge a lot because it is worth it; there is no competitor to undercut them on price.

  • Reply 162 of 1528
    wizard69 Posts: 13,377 member
    I was just trying not to name a price lower than Apple will ever actually charge. Let's face it: Apple's in its own class, and it can charge whatever it wants and most of us will still buy some type of its products.
    In the end, though, is Apple harming itself with these high prices? Let's face it, I do expect to pay a bit more for Apple products because many times they are worth it. But when you look at some of Apple's video monitor solutions it becomes a bit unbearable, even when you consider the built-in hub functionality.

    If I had money to burn it wouldn't be a problem, but that isn't the case, and furthermore my life doesn't revolve around Apple products.
    You get what you pay for - that's my rationalization for buying a $1500 computer when you can get one for $300. And it isn't just a rationalization either, it's true.
    Well, the differential might not be as huge as you imply, but it is hard to compare because Apple simply doesn't have much in the way of competition for its products. Even if you look at compact computers from the likes of Lenovo, they don't have the performance configurations of the Mini.
    Apple just makes the best products, nobody can really compete, and they can charge a lot because it is worth it; there is no competitor to undercut them on price.
    Some of the products are worth it, some not so much. The Mac Pro is a joke. In the context of this thread, the intro Mini is a nice machine if you don't need that GPU support. Once you get past the intro model, though, the pricing structure gets a bit wacky.
  • Reply 163 of 1528
    winter Posts: 1,238 member
    I am going to cover the iMac a bit and hope that the next line of machines moves the graphics up. 1 GB of VRAM would be nice, although I could see Apple putting 768 MB on at least the two 21.5" models and the base 27" model. A 4 GB option for the high-end graphics card, if it's available, would be great: 1 GB base, 2 GB for a little more, 4 GB for a little more than that.
  • Reply 164 of 1528

    Quote:

    Originally Posted by wizard69 View Post

    In the end, though, is Apple harming itself with these high prices? Let's face it, I do expect to pay a bit more for Apple products because many times they are worth it. But when you look at some of Apple's video monitor solutions it becomes a bit unbearable, even when you consider the built-in hub functionality.

    If I had money to burn it wouldn't be a problem, but that isn't the case, and furthermore my life doesn't revolve around Apple products.

    Well, the differential might not be as huge as you imply, but it is hard to compare because Apple simply doesn't have much in the way of competition for its products. Even if you look at compact computers from the likes of Lenovo, they don't have the performance configurations of the Mini.

    Some of the products are worth it, some not so much. The Mac Pro is a joke. In the context of this thread, the intro Mini is a nice machine if you don't need that GPU support. Once you get past the intro model, though, the pricing structure gets a bit wacky.


    The Mac Pro is the best computer on the market... two years ago. If it were regularly updated, and I wanted a desktop, I would get it.

  • Reply 165 of 1528
    wizard69 Posts: 13,377 member
    The Mac Pro is the best computer on the market... two years ago. If it were regularly updated, and I wanted a desktop, I would get it.
    Maybe four years ago now. However, the Mac Pro has never been the machine for me, at least not since the move to Intel. Far too expensive for my needs. I need a desktop, not a ripoff.
  • Reply 166 of 1528
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by wizard69 View Post

    Maybe four years ago now. However, the Mac Pro has never been the machine for me, at least not since the move to Intel. Far too expensive for my needs. I need a desktop, not a ripoff.

    The 1,1 was a good value at the time it was released. The 3,1 was possibly the cheapest workstation with those specs. My point is that they offered a lot at the base price points at that time. Had they maintained some level of consistency in hardware budgeting, we would have seen the six-core Mac Pro as the base option in 2010 rather than an expensive upgrade. Most other workstation vendors offer an equivalent to the single-socket Mac Pro as an entry-level model; it's just typically priced much lower. For $2500-3000 in a single-package model, you can put together a very capable solution, typically including a mid-range workstation GPU. The comparison at the base level only works in favor of the Mac Pro if you actually need to populate certain expansion options internally.

    Otherwise the iMac can arguably be a better solution. Marvin has provided a couple of links in the past regarding people editing on the iMac. Most of them seemed to be using it for things like slip edits. It's not necessarily suitable on its own for something like color grading, as it lacks a lot of calibration features found on higher-end displays. You also might run into trouble if your workflow deals with uncompressed 10-bit footage. I use these examples as they represent an area where Apple has maintained a market for some of the higher-end hardware.

    Other brands handle this differently. Dell, HP, etc. get their highest margins by far on workstation sales; they're really dependent on them. I also think it's a misconception that Apple's desktop sales have fallen off a cliff. Part of it is that mobile form factors have merely outpaced desktops in growth. They sell more desktops today than they sold total computers in the early 2000s.

  • Reply 167 of 1528
    winter Posts: 1,238 member
    It will be interesting to see how the HD 4600 performs, and whether they keep the current three-model lineup or go back to two as in 2010 and before.
  • Reply 168 of 1528
    Marvin Posts: 15,310 moderator
    hmm wrote:
    Marvin has provided a couple of links in the past regarding people editing on the iMac. Most of them seemed to be using it for things like slip edits. It's not necessarily suitable on its own for something like color grading, as it lacks a lot of calibration features found on higher-end displays.

    You can plug in an external display.
    hmm wrote:
    You also might run into trouble if your workflow deals with uncompressed 10-bit footage.

    Not if you have fast enough drives and a 10-bit display.
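    As a rough back-of-the-envelope on the drive side (just a sketch; the 1080p/30 fps 4:2:2 figures are my assumptions for a typical uncompressed broadcast format, and container overhead adds a little on top):

    ```python
    # Rough sustained data rate for uncompressed 10-bit video (assumed figures).
    width, height = 1920, 1080   # 1080p frame
    fps = 30                     # frames per second
    bits_per_sample = 10
    samples_per_pixel = 2        # 4:2:2 averages 2 samples/pixel (luma + shared chroma)

    bits_per_second = width * height * fps * bits_per_sample * samples_per_pixel
    print(f"~{bits_per_second / 8 / 1e6:.0f} MB/s sustained")
    # ~156 MB/s: more than one spinning disk, easy for an SSD or a small RAID
    ```

    At 4:4:4 it's half again as much, around 233 MB/s, which is still within reach of a Thunderbolt array.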
    winter wrote:
    It will be interesting to see how the HD 4600 performs, and whether they keep the current three-model lineup or go back to two as in 2010 and before.

    I don't know if you saw this video in the other thread but it looks promising:


    [VIDEO]


    You can't really fake those kinds of demos much because they are standard benchmarks. They are showing Haswell running Unigine Heaven in real time. It looks like it's running with low (but still enabled) tessellation, and while it could be on normal quality at 720p, it's still significantly faster than the 6630M.

    Given that it's running in a 17W chip designed for the likes of the MacBook Air, the 45W Mini has plenty of power for it. They can even overclock it in the quad-i7s like they do with the HD 3000.

    It was reported that the Ultrabook chips would be getting the GT3-class GPUs with double the execution units, but it would be crazy to put a faster GPU into an Ultrabook than into a normal laptop.
  • Reply 169 of 1528
    winter Posts: 1,238 member
    I saw that video a while back and it looks okay, but I'll have to see it to believe it.

    Edit: http://www.anandtech.com/show/6600/intel-haswell-gt3e-gpu-performance-compared-to-nvidias-geforce-gt-650m <--- I might be very wrong (wouldn't be the first time)

    And I forget whether it was on here or on MacRumors that I doubted integrated graphics this year could match NVIDIA's discrete graphics from last year.

    Edit: Is it mentioned how much memory the 650M is using?
  • Reply 170 of 1528
    Marvin Posts: 15,310 moderator
    winter wrote:

    That's a bit harder to believe. The 650M is about 3-4x faster than the HD4000.

    It would be great if true, but they did this sort of thing with Ivy Bridge, and they've been going around saying they are targeting double the performance, which is far short of the 650M. From the demos, I'm optimistic we'll see around double, but the 650M is too big a jump for one year.

    If it were that fast, Apple could put IGPs in the 15" rMBPs and cut the costs down a fair bit. Even though NVidia will be offering more powerful chips, I think people would appreciate a price drop more. Then you could get a refurb of the 2012 model even cheaper with pretty much the same performance.
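    To put rough numbers on that gap (a back-of-the-envelope sketch; the 3-4x and "double" multipliers above are estimates from reviews, not benchmarks, and the 40% 650M-over-640M figure comes up a couple of posts down):

    ```python
    # Relative GPU performance, normalized to the HD 4000 (estimates, not benchmarks).
    hd4000 = 1.0
    gt650m = 3.5 * hd4000          # "about 3-4x faster": take the midpoint
    haswell_target = 2.0 * hd4000  # Intel's stated goal of double the HD 4000
    gt640m = gt650m / 1.4          # the 650M is roughly 40% faster than the 640M

    print(f"650M vs a doubled HD 4000: {gt650m / haswell_target:.2f}x ahead")  # ~1.75x
    print(f"640M relative to the HD 4000: {gt640m:.1f}x")  # ~2.5x, near the target
    ```

    Which is why the 640M looks like the more plausible comparison point.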
  • Reply 171 of 1528
    winter Posts: 1,238 member
    So is the 640M a fairer guess?
  • Reply 172 of 1528
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by Marvin View Post

    If it were that fast, Apple could put IGPs in the 15" rMBPs and cut the costs down a fair bit. Even though NVidia will be offering more powerful chips, I think people would appreciate a price drop more. Then you could get a refurb of the 2012 model even cheaper with pretty much the same performance.

    It depends on what people are buying the 15" for. I don't agree with the idea of the 650M as an end goal. GPUs are one of the biggest points of disparity between desktops and notebooks, although I don't know how NVidia's gains will look; Maxwell probably isn't coming prior to 2014. Integrated graphics seem to trail mid-range mobile graphics by roughly two years. I'd say it's still a bit early to write off discrete graphics, even in Intel's wildest dreams. They're currently still aiming at sub-$100 GPUs, and this really hurts NVidia in spite of incredibly thin margins at that level, because those sales are needed to help amortize chip development costs.

  • Reply 173 of 1528
    winter Posts: 1,238 member
    I'd be really disappointed if Apple put an integrated GPU in the 15" Retina. Maybe for the next generation of discrete cards you do 1 GB on the base model, with the option to double it on the higher-end model, or make double the VRAM standard.
  • Reply 174 of 1528
    Marvin Posts: 15,310 moderator
    winter wrote: »
    So is the 640M a fairer guess?

    I would say so, yes. The 650M is only about 40% faster than the 640M, though, so there is a possibility they can make up that performance difference, but I'd be surprised if they can do it in a single year. They could be bumping up the power in some of the demos.
    hmm wrote:
    I don't agree with the idea of the 650M as an end goal.

    It wouldn't be an end goal; obviously they'd get faster year on year, but at some point it makes more sense to share the main memory for computation.
    winter wrote:
    I'd be really disappointed if Apple put an integrated GPU in the 15" Retina

    If it were only the speed of the 640M, it wouldn't be a good idea, but if they managed to match the 650M, it would be OK. Especially if removing the dedicated GPU meant dropping the price (I know this hasn't been the case for the 13", though). It would mean longer battery life too.

    NVidia isn't bringing Maxwell until 2014 now:

    http://wccftech.com/nvidia-roadmap-confirms-20nm-maxwell-gpus-2014-kepler-refresh-arrives-1h-2013/

    It does bundle ARM cores, apparently, so that could offer some nice functionality like booting into a low-powered mode, but the Kepler refresh might not offer much improvement over last year's models. Maybe they'll switch all the GPUs back to AMD this year.
  • Reply 175 of 1528
    wizard69 Posts: 13,377 member
    Marvin wrote: »
    I would say so, yes. The 650M is only about 40% faster than the 640M, though, so there is a possibility they can make up that performance difference, but I'd be surprised if they can do it in a single year. They could be bumping up the power in some of the demos.
    It wouldn't be an end goal; obviously they'd get faster year on year, but at some point it makes more sense to share the main memory for computation.
    Sharing main memory is one of those mixed bags of hurt. It can be very beneficial, but it can also have a huge negative impact on overall performance. Both AMD and Intel have been guiding builders to use fast RAM subsystems to get reasonable performance from their APU-type hardware. It is a serious enough problem that I expect both companies to take steps to address the memory bandwidth needs of their APUs. Of course some apps do benefit from mixed use of system RAM, and that may grow as systems become more heterogeneous in their use of resources.
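    To put rough numbers on that bandwidth gap (a sketch; dual-channel DDR3-1600 and a GDDR5 GT 650M are typical published configurations I'm using as assumptions):

    ```python
    # Peak memory bandwidth: shared system RAM vs. dedicated VRAM on a discrete GPU.
    def peak_bandwidth_gb_s(bus_width_bits: int, mega_transfers_per_s: float) -> float:
        """Bus width (bits) times transfer rate (MT/s), converted to GB/s."""
        return bus_width_bits / 8 * mega_transfers_per_s / 1000

    shared_ddr3 = peak_bandwidth_gb_s(128, 1600)  # dual-channel DDR3-1600: ~25.6 GB/s,
                                                  # split between the CPU *and* the IGP
    gddr5_650m = peak_bandwidth_gb_s(128, 4000)   # GT 650M with GDDR5: ~64 GB/s, GPU-only

    print(f"shared DDR3: {shared_ddr3:.1f} GB/s vs 650M GDDR5: {gddr5_650m:.1f} GB/s")
    ```

    That is roughly a 2.5x deficit before the CPU even takes its share, which is why faster RAM is about the only lever APU builders have today.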

    If it were only the speed of the 640M, it wouldn't be a good idea, but if they managed to match the 650M, it would be OK. Especially if removing the dedicated GPU meant dropping the price (I know this hasn't been the case for the 13", though). It would mean longer battery life too.
    I'm still thinking it will be 2015 before we see the pros without a discrete GPU. As noted, we end up with that mixed bag of hurt. The other real issue is that 1 GB or even 2 GB of dedicated VRAM is a significant amount of RAM to free up for the main processor. What is best is often a question best answered by the individual user. As mentioned in another thread, I could easily see Apple going with integrated GPUs in the MBP and keeping discrete graphics in the Retina machines. This might happen with Haswell, or they might wait for the next hardware rev from Intel. It would, however, give people a choice.
    NVidia isn't bringing Maxwell until 2014 now:

    http://wccftech.com/nvidia-roadmap-confirms-20nm-maxwell-gpus-2014-kepler-refresh-arrives-1h-2013/

    It does bundle ARM cores, apparently, so that could offer some nice functionality like booting into a low-powered mode, but the Kepler refresh might not offer much improvement over last year's models. Maybe they'll switch all the GPUs back to AMD this year.

    I really don't know why they left AMD for discrete GPUs anyway. I just don't see the advantage in doing business with NVidia, especially considering that AMD's hardware appears to be far more reliable. Unjustified flopping back and forth must be frustrating for vendors and is likely a big negative when it comes to partnerships. I wouldn't be surprised if this is part of the reason that Apple's drivers suck so much.
  • Reply 176 of 1528
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by wizard69 View Post

    Sharing main memory is one of those mixed bags of hurt. It can be very beneficial, but it can also have a huge negative impact on overall performance. Both AMD and Intel have been guiding builders to use fast RAM subsystems to get reasonable performance from their APU-type hardware. It is a serious enough problem that I expect both companies to take steps to address the memory bandwidth needs of their APUs. Of course some apps do benefit from mixed use of system RAM, and that may grow as systems become more heterogeneous in their use of resources.

    I'm still thinking it will be 2015 before we see the pros without a discrete GPU.


    That sounds about right. I'm not sure what NVidia will do with shrinking volume; they're incredibly reliant on numbers to cover fabrication and development costs. Discrete GPUs typically use much faster RAM on board, and its placement is obviously ideal, so I'm not surprised that Intel and AMD would be pushing for higher memory speeds if they want to push heterogeneous computing. That is one of those things I've been watching at the desktop/workstation level. Notebooks are going to continue to grow in terms of portability. I'm curious whether GPUs will become fully commoditized or whether we'll see greater use of them to drive workstation sales. Dell, HP, and a number of the smaller vendors still derive a huge portion of their profits from that segment in spite of its size; even HP never considered spinning off its workstation business.

  • Reply 177 of 1528
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by wizard69 View Post

    I really don't know why they left AMD for discrete GPUs anyway. I just don't see the advantage in doing business with NVidia, especially considering that AMD's hardware appears to be far more reliable. Unjustified flopping back and forth must be frustrating for vendors and is likely a big negative when it comes to partnerships. I wouldn't be surprised if this is part of the reason that Apple's drivers suck so much.


    I don't entirely share this sentiment. NVidia has been okay on Macs in recent years; the only recent blip was the launch of the Quadro 4000, whose initial drivers were terrible. You do pick up CUDA capability with NVidia, but somehow I doubt Apple cares about that, even though it is relevant in a number of shipping applications today. AMD has been quite stable on OS X, so my guess is price: NVidia was likely cheaper this round.

  • Reply 178 of 1528
    wizard69 Posts: 13,377 member
    hmm wrote: »
    That sounds about right. I'm not sure what NVidia will do with shrinking volume; they're incredibly reliant on numbers to cover fabrication and development costs.
    Yep, and they are not doing well with their ARM initiatives.
    Discrete GPUs typically use much faster RAM on board, and its placement is obviously ideal, so I'm not surprised that Intel and AMD would be pushing for higher memory speeds if they want to push heterogeneous computing. That is one of those things I've been watching at the desktop/workstation level. Notebooks are going to continue to grow in terms of portability. I'm curious whether GPUs will become fully commoditized or whether we'll see greater use of them to drive workstation sales.
    Most likely discrete GPUs will become very expensive in the future.
    Dell, HP, and a number of the smaller vendors still derive a huge portion of their profits from that segment in spite of its size; even HP never considered spinning off its workstation business.
    The whole industry is sort of in self-destruct mode. These companies are just too big to survive off high-end workstations alone. Sadly, I think a few of these companies will have to go under, and we will see the industry restructure.
  • Reply 179 of 1528
    wizard69 Posts: 13,377 member
    hmm wrote: »
    I don't entirely share this sentiment. NVidia has been okay on Macs in recent years; the only recent blip was the launch of the Quadro 4000, whose initial drivers were terrible. You do pick up CUDA capability with NVidia, but somehow I doubt Apple cares about that, even though it is relevant in a number of shipping applications today. AMD has been quite stable on OS X, so my guess is price: NVidia was likely cheaper this round.

    The big problem with NVidia is that they have been dragged into supporting OpenCL, and even then their compute performance is wanting. CUDA is quickly becoming marginalized as a technology.
  • Reply 180 of 1528
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by wizard69 View Post

    Most likely discrete GPUs will become very expensive in the future.

    The whole industry is sort of in self-destruct mode. These companies are just too big to survive off high-end workstations alone. Sadly, I think a few of these companies will have to go under, and we will see the industry restructure.


    I suspect discrete GPUs will become more expensive, and NVidia will need some way to absorb fabrication costs. Right now even the gaming Fermi cards are still far ahead of integrated graphics in areas that don't involve gaming, such as computation or OpenGL-heavy apps. Even if every new product shipped with an integrated GPU starting today, it would take some time for them to filter through. The slight upside is that it would drive improvements in that area. Intel clearly doesn't see it going 100% integrated just yet: their E5/EP CPU packages do not ship with integrated GPUs, while EN has been shipping with integrated graphics at least since Sandy Bridge. NVidia has already cut out some of their development partners in favor of producing cards directly, and Tom's Hardware had some speculation that they'll try to implement ARM in their Tesla solutions. They obviously have some kind of plan, but I'm not knowledgeable enough in terms of hardware development to offer any real insight.


    Quote:

    Originally Posted by wizard69 View Post

    The big problem with NVidia is that they have been dragged into supporting OpenCL, and even then their compute performance is wanting. CUDA is quickly becoming marginalized as a technology.

    I don't entirely agree there. Fermi blew everything else away in double-precision math, and NVidia was the one that really pioneered GPGPU. OpenCL came along after, and in some applications NVidia is still preferred; it's just much less so on OS X. They're actually quite strong in that area. They even have an entire GPU line, Tesla, with a huge amount of on-board RAM dedicated to compute work. Some of AMD's driver problems do not exist on OS X, which is probably a reason for the differing balance of power.

    I suspect the market for performance machines will hold out for a number of years. If the primary focus in consumer technology is on mobile devices with minimized power consumption, it will leave a gap in the market for something that offers more performance than a thin client hooked up to a server farm. HP seems quite happy with themselves when it comes to workstations, which isn't the segment they considered spinning off; they make too much on them to even consider that. I wonder how the Z1 has worked out for them.
