
2014 Mac mini Wishlist - Page 9

post #321 of 1506

Your numbers below support my point completely: you give up a little bit of CPU performance for a more versatile and better-performing GPU. For many uses that is the better trade-off. Since AMD has a deep well of GPU IP I expect them to keep that differential with Intel.

This is only significant if you look at the integrated solutions; if your focus is on discrete GPUs then you might as well go for Intel's better CPU performance.

Finally, the numbers below mean nothing without context.

Quote:
Originally Posted by mjteix View Post

Apple is using exactly the same CPUs in the Mac mini as in the 13" and 15" MBPs. If Apple isn't exactly using top-of-the-line performance in the Mini, they are still using very good Core i5/i7 parts with the best iGPU Intel has to offer. But that doesn't matter, the results are the same: the CPU gap with AMD is HUGE!

 

Just look at the Geekbench scores for those parts:

Intel Core i7-3720QM 2600 MHz (4 cores) 10132 

Intel Core i7-3615QM 2300 MHz (4 cores) 9037 

Intel Core i5-3210M 2500 MHz (2 cores) 5742 

Intel Core i5-2410M 2300 MHz (2 cores) 5052

AMD A10-4600M 2300 MHz (4 cores) 4235

 

Or CPU benchmarks:

Intel Core i7-3720QM @ 2.60GHz 8,468 

Intel Core i7-3615QM @ 2.30GHz 7,481 

Intel Core i5-3210M @ 2.50GHz 3,818 

Intel Core i5-2410M @ 2.30GHz 3,196

AMD A10-4600M APU @ 2.30GHz 3,072

 

The CPU in the $799 Mac mini is more than two times faster than the best mobile APU AMD offers. And that Intel CPU is not even top-of-the-line performance. But even desktop APUs can't achieve the CPU performance of an Intel mobile quad-core CPU. I'd rather take a small "hit" in graphics than cut the CPU performance in half.

 

There's a reason why those APUs are cheap: they suck! Even AMD officially compares them to Intel's Core i3 parts, not Core i5, let alone Core i7. Yeah, they have nice graphics, so what?

post #322 of 1506
Thread Starter 
It's a shame they can't collaborate so you could have a powerful Intel chip with AMD graphics; that would probably compare to a 650M.
post #323 of 1506

I see no need for AMD to hook up with Intel. Right now it is an issue of public perception and a focus on CPU performance that needs to change. In a modern computer CPU performance isn't everything; for many users a well-performing GPU is more important even if they don't realize it.

 

Beyond that, some of these old benchmarks don't reflect the importance of software built with the latest compiler tools, which can enhance Piledriver performance. To use this AMD hardware without optimizing for it would be a mistake. Also it isn't that hard to find tests that indicate AMD delivering better system behavior than Intel-based hardware. In the end it is really a question of how much performance you can get for a minimal cost.

 

 

Quote:
Originally Posted by Winter View Post

It's a shame they can't collaborate so you could have a powerful Intel chip with AMD graphics; that would probably compare to a 650M.
post #324 of 1506
Quote:
Originally Posted by Winter View Post

that would probably compare to a 650M.

Where do you get that idea? Has anyone suggested they have obtained that level of performance? It could also have certain requirements in terms of die space and design that would not work with Intel, and of course those two companies have little experience working together. If you want optimal performance, you need some form of discrete graphics today. Integrated can actually be good enough for a number of things compared to where it was a few years ago. If discrete graphics die out, I suspect it will be due to insufficient volume to overcome fixed development costs. That is typically how these things go. NVidia was able to force out a lot of the graphics workstation vendors in the 1990s. Those vendors sold a limited number of units at tens of thousands of dollars each. NVidia used the rise in PC gaming to fund further desktop graphics research. Today that business model is becoming problematic. It may not be sustainable. I don't expect Haswell to even approach that level outside of a limited number of synthetic tests. Intel's problems haven't just been raw hardware, but implementation. There are many use cases where the only thing that makes me occasionally hesitant to suggest them as good enough is the number of annoying gotchas. The HDMI issue comes to mind. In terms of performance, it's fine for a lot of things. It's not great for gaming, but real-time 3D graphics really do push the hardware. They're one of the most demanding non-work-related use cases.

post #325 of 1506
Quote:
Originally Posted by wizard69 View Post

Your numbers below support my point completely: you give up a little bit of CPU performance for a more versatile and better-performing GPU. For many uses that is the better trade-off. Since AMD has a deep well of GPU IP I expect them to keep that differential with Intel.

This is only significant if you look at the integrated solutions; if your focus is on discrete GPUs then you might as well go for Intel's better CPU performance.

Finally, the numbers below mean nothing without context.

 

It's funny how the same numbers can "mean nothing" and "support your point completely" at the same time.

 

You're in denial. I can understand that. You don't really know what little, versatile, and many really mean anymore. You have a hard time understanding how 9037 relates to 4235. Or 7481 vs 3072. It happens. It's not getting easier as you're getting old.
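(For anyone who wants that arithmetic spelled out: a quick back-of-the-envelope check in Python of the two ratios cited above, using only the scores quoted earlier in the thread; it is an illustrative sketch, not an independent benchmark.)

    # Ratios implied by the benchmark scores quoted earlier in the thread.
    scores = {
        "Geekbench":     {"Intel Core i7-3615QM": 9037, "AMD A10-4600M": 4235},
        "CPU benchmark": {"Intel Core i7-3615QM": 7481, "AMD A10-4600M": 3072},
    }

    for suite, s in scores.items():
        ratio = s["Intel Core i7-3615QM"] / s["AMD A10-4600M"]
        print(f"{suite}: {ratio:.2f}x faster")  # roughly 2.13x and 2.44x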

post #326 of 1506
Originally Posted by mjteix View Post
It's funny how the same numbers can "mean nothing" and "support your point completely" at the same time.

 

You're in denial. I can understand that. You don't really know what little, versatile, and many really mean anymore. You have a hard time understanding how 9037 relates to 4235. Or 7481 vs 3072. It happens. It's not getting easier as you're getting old.

 

Here is the problem: everybody knows that AMD's CPUs don't perform as well as Intel's. Nobody can rationally argue with that. The point I'm making is that it doesn't matter for many users, because GPU performance is more important for the overall experience. If you can't grasp that then the discussion won't go anywhere. Beyond that, hand-picked benchmarks to support your position don't mean a whole hell of a lot, because we can hand-pick benchmarks showing AMD's GPUs running 50 to 100% faster than Intel's. So yeah, you supported my point, but it really doesn't mean much.

 

In any event you really should try to address my point: AMD has a better offering for a low-cost machine. This is especially the case if the user needs good GPU support for their usage patterns.

post #327 of 1506

It will be very interesting to see how Haswell performs when it hits the market.  

 

Intel would likely lose more working with AMD than it is worth. Intel seems to be very willing to have their GPUs publicly trashed at every opportunity. However I don't see die space as being an issue with an AMD GPU integrated on an Intel die; there is just an amazing amount of room on today's dies, and most of that room already goes to the GPUs integrated on board.

 

The split between integrated and discrete will likely become far more interesting in 2013 and beyond. At some point things will flip and it will be more of a disadvantage to have a discrete GPU chip. That will likely happen with unified memory access and very high bandwidth to main memory. Heterogeneous computing is here to stay but won't really come into its own until more mature hardware is delivered. Once the GPU is an equal to the CPU on the memory bus, I see a quick decline in discrete GPU support. As you note, that will dry up funds to support discrete GPU development.

 

I share with you the reluctance to recommend Intel-only GPU solutions. Intel's integrated solutions are nowhere near as good as AMD's and have significant issues that drivers can't correct. Speaking of drivers, Intel is a mixed bag here; some issues just hang around too long. By the way, Intel's OpenCL support isn't much better than its 3D support. In any event I'm really hoping that Intel can iron out the hardware glitches with Haswell and also squash the driver issues. If Intel can deliver decent 3D and OpenCL support then the reservations about recommending them should melt away.

 

The problem today is that because of these issues Intel is a poor choice for integrated-GPU-only machines. We can only hope that the driver issues get corrected soon. Sadly this sucks given the importance of GPUs in modern systems. These days GPU acceleration sneaks into everything from web browsers to decompression utilities, so lagging GPU support negatively impacts the user experience.

Quote:
Originally Posted by hmm View Post

Where do you get that idea? Has anyone suggested they have obtained that level of performance? It could also have certain requirements in terms of die space and design that would not work with Intel, and of course those two companies have little experience working together. If you want optimal performance, you need some form of discrete graphics today. Integrated can actually be good enough for a number of things compared to where it was a few years ago. If discrete graphics die out, I suspect it will be due to insufficient volume to overcome fixed development costs. That is typically how these things go. NVidia was able to force out a lot of the graphics workstation vendors in the 1990s. Those vendors sold a limited number of units at tens of thousands of dollars each. NVidia used the rise in PC gaming to fund further desktop graphics research. Today that business model is becoming problematic. It may not be sustainable. I don't expect Haswell to even approach that level outside of a limited number of synthetic tests. Intel's problems haven't just been raw hardware, but implementation. There are many use cases where the only thing that makes me occasionally hesitant to suggest them as good enough is the number of annoying gotchas. The HDMI issue comes to mind. In terms of performance, it's fine for a lot of things. It's not great for gaming, but real-time 3D graphics really do push the hardware. They're one of the most demanding non-work-related use cases.

post #328 of 1506
Quote:
Originally Posted by wizard69 View Post

 

The split between integrated and discrete will likely become far more interesting in 2013 and beyond. At some point things will flip and it will be more of a disadvantage to have a discrete GPU chip. That will likely happen with unified memory access and very high bandwidth to main memory. Heterogeneous computing is here to stay but won't really come into its own until more mature hardware is delivered. Once the GPU is an equal to the CPU on the memory bus, I see a quick decline in discrete GPU support. As you note, that will dry up funds to support discrete GPU development.

 

 

NVidia has been cutting out GPU partners and doing whatever they can to aid margins due to this problem. It could be that high-end systems use two GPUs like we see in MacBook Pros today, but I don't know that this would be sufficient to keep them going. Desktop graphics were initially driven by gaming. Integrated graphics are driven by mobile technologies. It's nothing new. The advancements often come from areas where the potential volume ensures that investments in development can be recouped. I could always use the "minicomputers could never displace mainframes" example. I think the higher-end GPUs will stick around longer. Even if they stopped developing today, some workstation GPUs could be sold for several years due to the sheer gap in hardware performance and drivers when compared to Intel's offerings. I expect the transition in notebooks will happen sooner. I expect some level of consolidation as computing hardware shakes out, but I'm not entirely sure what will and won't be left yet. Some level of hybridization seems likely in the notebook area. People on here have claimed they're different things. They claimed the same thing about phones and PDAs prior to Handspring and RIM. If you recall, Palm bought Handspring after the Treo. I can't see myself going without a large display. Even with slow development in that area today, the designs are still sold. They'll probably just become further commoditized with televisions and bin parts based on performance.

 

Quote:
I share with you the reluctance to recommend Intel-only GPU solutions. Intel's integrated solutions are nowhere near as good as AMD's and have significant issues that drivers can't correct. Speaking of drivers, Intel is a mixed bag here; some issues just hang around too long. By the way, Intel's OpenCL support isn't much better than its 3D support. In any event I'm really hoping that Intel can iron out the hardware glitches with Haswell and also squash the driver issues. If Intel can deliver decent 3D and OpenCL support then the reservations about recommending them should melt away.

It's one of those things where I think a portion of users will be limited by driver and implementation issues rather than the raw potential of the chip hardware. Graphic design is one of those areas that comes up occasionally. Outside of motion graphics, I think the mini works there and opens up a range of display options if said designer isn't simply looking to dock their notebook to something. The only thing that makes me a little hesitant at times is the potential for bugs, which is why I'll typically suggest buying it from a place with a good return policy as a just-in-case measure. Things like 2D raster or vector graphics and video content aren't a problem for modern GPU hardware, even at the low end. If you're inhibited by something, it's probably more of a bug issue.

post #329 of 1506

Well, it will be interesting to see what happens to NVidia over the next couple of years. One thing is for sure: I'm not buying into that company as an investment. The Mac Pro in a way reflects on the high-end GPU market; there just isn't a lot of demand to drive investment, as you have indicated. NVidia went after the high-performance computing sector, but even there I don't see enough demand to justify development; eventually the price gets so high that it becomes cheaper to just use more standard processors.

 

As to selling into the workstation market, that is certainly a possibility; however, even there the impact of integrated GPUs should not be underestimated. Either with Haswell or its follow-on there will be enough performance to support the more modest workstation needs. Here I'm talking about the fabled midrange machines that sit below Mac Pro-class machines to provide better-than-average video performance to drive workstation-class apps.

Quote:
Originally Posted by hmm View Post

 

NVidia has been cutting out GPU partners and doing whatever they can to aid margins due to this problem. It could be that high-end systems use two GPUs like we see in MacBook Pros today, but I don't know that this would be sufficient to keep them going. Desktop graphics were initially driven by gaming. Integrated graphics are driven by mobile technologies. It's nothing new. The advancements often come from areas where the potential volume ensures that investments in development can be recouped. I could always use the "minicomputers could never displace mainframes" example. I think the higher-end GPUs will stick around longer. Even if they stopped developing today, some workstation GPUs could be sold for several years due to the sheer gap in hardware performance and drivers when compared to Intel's offerings. I expect the transition in notebooks will happen sooner. I expect some level of consolidation as computing hardware shakes out, but I'm not entirely sure what will and won't be left yet. Some level of hybridization seems likely in the notebook area. People on here have claimed they're different things. They claimed the same thing about phones and PDAs prior to Handspring and RIM. If you recall, Palm bought Handspring after the Treo. I can't see myself going without a large display. Even with slow development in that area today, the designs are still sold. They'll probably just become further commoditized with televisions and bin parts based on performance.

 

It's one of those things where I think a portion of users will be limited by driver and implementation issues rather than the raw potential of the chip hardware. Graphic design is one of those areas that comes up occasionally. Outside of motion graphics, I think the mini works there and opens up a range of display options if said designer isn't simply looking to dock their notebook to something. The only thing that makes me a little hesitant at times is the potential for bugs, which is why I'll typically suggest buying it from a place with a good return policy as a just-in-case measure. Things like 2D raster or vector graphics and video content aren't a problem for modern GPU hardware, even at the low end. If you're inhibited by something, it's probably more of a bug issue.

Drivers can be a real kicker. However, driver issues are more or less a universal issue: it doesn't matter if your needs are high end or low end, if you get hung up on a driver issue your opinion of the hardware is bound to be negative. With the Mini, though, I have to disagree; the raw performance of the Intel integrated solution is still a problem for many users. While I agree that 2D performance isn't that bad, once you leave that area of concern the hardware doesn't look that good.

post #330 of 1506
Quote:
Originally Posted by wizard69 View Post

Well, it will be interesting to see what happens to NVidia over the next couple of years. One thing is for sure: I'm not buying into that company as an investment. The Mac Pro in a way reflects on the high-end GPU market; there just isn't a lot of demand to drive investment, as you have indicated. NVidia went after the high-performance computing sector, but even there I don't see enough demand to justify development; eventually the price gets so high that it becomes cheaper to just use more standard processors.

Workstations have shown minimal growth, although it's nowhere near what they had a few years ago. Overall sales in that market segment dropped in 2011, but that was partly due to the lack of new hardware. I wouldn't say NVidia focuses solely on the high-end market. They just derive a lot of their profits from it. They still need the cheaper markets for their volume. Without cheap GPUs they wouldn't be able to maintain the pricing of Teslas. A big selling point was performance per dollar relative to x86 solutions. Without some way to amortize chip development through higher-volume products, they will be in trouble. I suspect their board has discussed this, though.

Quote:
As to selling into the workstation market, that is certainly a possibility; however, even there the impact of integrated GPUs should not be underestimated. Either with Haswell or its follow-on there will be enough performance to support the more modest workstation needs.

At some point I expect integrated graphics to show up in Xeon EN packages. That will be the start of it. The Mac Pro actually uses EP, but EN is closer to the desktop CPU designs.

 

Quote:

Drivers can be a real kicker. However, driver issues are more or less a universal issue: it doesn't matter if your needs are high end or low end, if you get hung up on a driver issue your opinion of the hardware is bound to be negative. With the Mini, though, I have to disagree; the raw performance of the Intel integrated solution is still a problem for many users. While I agree that 2D performance isn't that bad, once you leave that area of concern the hardware doesn't look that good.

 

 

That (drivers) is why I'm not always 100% confident in what to suggest. Apple's price-to-performance ratio on graphics hardware isn't very good. They tend to enforce a high minimum sale on anything I would consider credible.

post #331 of 1506
NVidia is betting on Tegra for their future.
post #332 of 1506

So far that isn't working out too well for them.  

 

I don't think Tegra is a bad idea, but rather that it isn't focused enough to attract a lot of design-ins. That, and NVidia needs to bone up on low-power GPUs.

 

 

Quote:
Originally Posted by nht View Post

NVidia is betting on Tegra for their future.
post #333 of 1506
Quote:
Originally Posted by hmm View Post

Workstations have shown minimal growth, although it's nowhere near what they had a few years ago. Overall sales in that market segment dropped in 2011, but that was partly due to the lack of new hardware. I wouldn't say NVidia focuses solely on the high-end market. They just derive a lot of their profits from it. They still need the cheaper markets for their volume. Without cheap GPUs they wouldn't be able to maintain the pricing of Teslas. A big selling point was performance per dollar relative to x86 solutions. Without some way to amortize chip development through higher-volume products, they will be in trouble. I suspect their board has discussed this, though.

Right now it doesn't look pretty. Tegra is a wise move in concept, but I don't think they are focused on it correctly, as the Tegra solutions released so far are hacks. They really need to deliver a well-integrated 64-bit ARM solution or they will get steamrolled in this market.

Quote:
At some point I expect integrated graphics to show up in Xeon EN packages. That will be the start of it. The Mac Pro actually uses EP, but EN is closer to the desktop CPU designs.

 

 

I'm not too sure; it is a matter of what is important in the markets they sell those chips into.
 

 

Quote:
That (drivers) is why I'm not always 100% confident in what to suggest. Apple's price-to-performance ratio on graphics hardware isn't very good. They tend to enforce a high minimum sale on anything I would consider credible.

 

Yeah, it kinda sucks, but Intel is getting there. I just don't know when "good enough" will happen.

post #334 of 1506
Thread Starter 
I'm jumping ahead here, but this topic is all over the place so who cares. By the time Broadwell hits, will we see DDR4 16 GB SO-DIMMs so we can have two modules for 32 GB of memory in a mini?
post #335 of 1506
Originally Posted by Winter View Post
By the time Broadwell hits, will we see DDR4 16 GB SO-DIMMs…

 

That depends on whether Broadwell is supposed to allow DDR4 chips across the board, which I don't know. Haswell uses it in just the server model.


…so we can have two modules for 32 GB of memory in a mini?

 

A better question would be if there is anything actually preventing this now, other than chipset limitations.

post #336 of 1506
Quote:
Originally Posted by Tallest Skil View Post

 

A better question would be if there is anything actually preventing this now, other than chipset limitations.

The current limitation would be availability of SO-DIMMs. Notebooks can run 32GB. Lenovo and some of the others have notebooks that will take 4 SO-DIMMs, and you can find many reports of people running 32GB this way. Much of the time they need it to smoothly run multiple VMs.

post #337 of 1506
Thread Starter 
Do you think Apple finally gives up and allows for more video memory in their iMacs by default? Do they finally go from 512 MB to 1 GB at least in the lower-end 27"?

I know what people are going to say or possibly say:

Winter - But they want people to buy the 27" ultimate.

Okay, then why have shitty versions? Cut the entry 21.5" with the 640M and make the lowest-end model the one with the 650M, starting at $1,499? For $100 more, double the video memory.
post #338 of 1506
Originally Posted by Winter View Post
Do you think Apple finally gives up and allows for more video memory in their iMacs by default? Do they finally go from 512 MB to 1 GB at least in the lower-end 27"?

 

They don't have control over that with the integrated chips. If they wanted to position themselves more strongly in this arena, they'd choose the highest amount available for each processor, but even then that isn't very much.

 

Right now I'm just hanging on for the next Mac Pro's GPUs and hoping there's at least 2GB minimum there. 512MB is really starting to not cut it in Space Engine, for example. Heck, my card isn't supposed to be able to run Space Engine at all… 
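(For anyone curious what their own machine reports: a minimal sketch in Python 3 that shells out to OS X's built-in system_profiler and prints the GPU model and VRAM lines; it only echoes what the OS reports and assumes nothing about the specific Macs discussed here.)

    import subprocess

    # Ask OS X's system_profiler for the graphics/displays report
    # and print just the GPU model and VRAM lines.
    report = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True, check=True,
    ).stdout

    for line in report.splitlines():
        if "Chipset Model" in line or "VRAM" in line:
            print(line.strip())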

post #339 of 1506
Thread Starter 
Quote:
Originally Posted by Tallest Skil View Post

They don't have control over that with the integrated chips. If they wanted to position themselves more strongly in this arena, they'd choose the highest amount available for each processor, but even then that isn't very much.

Right now I'm just hanging on for the next Mac Pro's GPUs and hoping there's at least 2GB minimum there. 512MB is really starting to not cut it in Space Engine, for example. Heck, my card isn't supposed to be able to run Space Engine at all… 

Well, Kepler cards have 2 GB minimum, and I don't see them including 2 GB in a 640M or even a 650M (except I would like it for the rMBP to drive the screen). I just wonder about not including at least an optional 1 GB for the GT 650M and GTX 660M. 512 MB might be fine for a base model, but even then it's kind of meh.

Which iMacs sell the most? 21.5"? or 27"?

Like say with the MacBook Pros, I know the classic unibody 13" was huge for them.
post #340 of 1506
Quote:
Originally Posted by Tallest Skil View Post

 

They don't have control over that with the integrated chips. If they wanted to position themselves more strongly in this arena, they'd choose the highest amount available for each processor, but even then that isn't very much.

 

Right now I'm just hanging on for the next Mac Pro's GPUs and hoping there's at least 2GB minimum there. 512MB is really starting to not cut it in Space Engine, for example. Heck, my card isn't supposed to be able to run Space Engine at all… 


1GB wasn't even high when the 2010 Mac Pro revision debuted. That would be the area where it really irritates me. The common "average user just checks Facebook" argument isn't exactly valid when we restrict the comparison to workstation-level hardware. Check out the P90X of graphics cards. It's a pretty extreme comparison, but I'm not sure any of the current Mac GPUs, including the 680MX, could load even half of that without crashing. Companies probably buy hardware like that for hero suites so that it can be displayed for clients at full resolution without exporting to an offline renderer when a change is made. That's my guess.

post #341 of 1506
Originally Posted by hmm View Post
Check out the P90X of graphics cards.

 

No direct relation to the workout routine, but performance comparable thereto, huh? What're the odds…

post #342 of 1506
Quote:
Originally Posted by Tallest Skil View Post

 

No direct relation to the workout routine, but performance comparable thereto, huh? What're the odds…


Oh, it's not called the P90X, although that would be completely awesome. I was making a steroid-abuse joke based on its specs. If you've ever seen one of the commercials, even the guy's skull looks buff.

post #343 of 1506
Thread Starter 
At the very least, they did max out the 2 GB available to the 680MX. Also, the iMac is not meant for everyone, but hopefully Apple, in the back of their mind, will take the stance that it is.
post #344 of 1506

I think you miss one important aspect here: not everyone needs or wants a high-performance video card. This is especially the case with iMac users. The upsell argument is pretty bogus at times, as Apple has no problems moving the lower-end iMacs. The issue with the Mini is more complex, as I really don't think Apple gets it when it comes to positioning the Mini. In essence every version of the Mini has crappy graphics.

 

In any event, realize that your obsession with video card memory will soon be a thing of the past. Once systems are fully heterogeneous the GPU will have access to all system memory. The remaining trick is making that memory fast enough.

Quote:
Originally Posted by Winter View Post

Do you think Apple finally gives up and allows for more video memory in their iMacs by default? Do they finally go from 512 MB to 1 GB at least in the lower-end 27"?

I know what people are going to say or possibly say:

Winter - But they want people to buy the 27" ultimate.

Okay, then why have shitty versions? Cut the entry 21.5" with the 640M and make the lowest-end model the one with the 650M, starting at $1,499? For $100 more, double the video memory.
post #345 of 1506
Thread Starter 
Quote:
Originally Posted by wizard69 View Post

I think you miss one important aspect here: not everyone needs or wants a high-performance video card. This is especially the case with iMac users. The upsell argument is pretty bogus at times, as Apple has no problems moving the lower-end iMacs. The issue with the Mini is more complex, as I really don't think Apple gets it when it comes to positioning the Mini. In essence every version of the Mini has crappy graphics.

In any event, realize that your obsession with video card memory will soon be a thing of the past. Once systems are fully heterogeneous the GPU will have access to all system memory. The remaining trick is making that memory fast enough.

The only reason I have an obsession with graphics and video card memory in general is so I can play Gauntlet Legends/Dark Legacy on MAME OS X, as well as some other games such as Time Crisis. Maybe that's a CPU thing, though, but I know you need a better video card than the Intel HD 3000.

Edit: I also have an obsession with the Intel HD 4000 since there is no fix for HDMI. That being said, I kind of do have my eyes on an NEC monitor but I am too used to using my HDTV as my monitor.
Edited by Winter - 2/2/13 at 4:46pm
post #346 of 1506
Quote:
Originally Posted by Winter View Post


Edit: I also have an obsession with the Intel HD 4000 since there is no fix for HDMI. That being said, I kind of do have my eyes on an NEC monitor but I am too used to using my HDTV as my monitor.

 

I like NEC. SpectraView can be buggy as hell at times, but NEC has the best price-to-quality ratio of any display brand on the market, especially in the US. Also, my biggest issues with integrated graphics would be drivers and features rather than absolute performance. Basic OpenCL support in OS X and smooth drivers would go a long way for issues outside of gaming. It's not always a performance thing. I've mentioned graphic design and illustration at times. The HD 4000 is probably fast enough there. The only issues relate to things like how much memory it can address, lack of OpenCL support in OS X, and driver issues such as the HDMI problems you mentioned (although for that use case they should be using DisplayPort).


Edited by hmm - 2/2/13 at 6:30pm
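(A small aside on what "OpenCL support" means in practice: a minimal sketch that lists the devices the system exposes to OpenCL, using the third-party pyopencl package, which is assumed to be installed; a GPU whose driver lacks OpenCL support simply won't appear in the list.)

    import pyopencl as cl

    # Enumerate the OpenCL platforms and devices the drivers expose.
    for platform in cl.get_platforms():
        for device in platform.get_devices():
            print(
                platform.name,
                device.name,
                cl.device_type.to_string(device.type),
                f"{device.global_mem_size // (1024 * 1024)} MB",
            )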
post #347 of 1506

I understand what you want; I just don't see the point in criticizing the low-end models when many do fine with those models. Like you, I want a little more out of my video hardware, and that is why I see the 2012 Mini upsell model as a big fail. The lack of a Mini with a higher-performance video subsystem is a big problem.

Quote:
Originally Posted by Winter View Post


The only reason I have an obsession with graphics and video card memory in general is so I can play Gauntlet Legends/Dark Legacy on MAME OS X, as well as some other games such as Time Crisis. Maybe that's a CPU thing, though, but I know you need a better video card than the Intel HD 3000.

Edit: I also have an obsession with the Intel HD 4000 since there is no fix for HDMI. That being said, I kind of do have my eyes on an NEC monitor but I am too used to using my HDTV as my monitor.

I'm with you on Intel driver issues. Funny, but I connect an HDTV to my MBP often, and frankly it sucks for anything involving text. Great for videos though.

post #348 of 1506
Thread Starter 
Quote:
Originally Posted by wizard69 View Post

I understand what you want; I just don't see the point in criticizing the low-end models when many do fine with those models. Like you, I want a little more out of my video hardware, and that is why I see the 2012 Mini upsell model as a big fail. The lack of a Mini with a higher-performance video subsystem is a big problem.
  
Quote:
I'm with you on Intel driver issues. Funny, but I connect an HDTV to my MBP often, and frankly it sucks for anything involving text. Great for videos though.

Most games perform just fine with the Intel HD 3000; it's just anything involving 3D, which is also an emulation issue. I can run a lot of classics including all of the Mortal Kombat games, anything Neo-Geo, Street Fighter and Street Fighter Alpha (don't know about SFIV), though stuff such as Tekken 3, Soul Calibur, etc. requires some work.

I have no problems with HDMI and the Intel HD 3000 (though I don't think any issues ever existed, did they?), and it looks best at 1366x768 (which I know a lot of people hate), though my TV is a Vizio VO320 32".
post #349 of 1506

I agree with you. I have this model, and the small number of games I play is perfect for me. Besides, most of these games are plain violent and should not even be on the market today.
 

post #350 of 1506
Thread Starter 
Quote:
Originally Posted by marvfox View Post

I agree with you. I have this model, and the small number of games I play is perfect for me. Besides, most of these games are plain violent and should not even be on the market today.

 

I am going to slightly disagree, only because there are a lot of popular things people like that I personally don't like, from TV shows to video games to movies. Why did Jersey Shore or Honey Boo Boo have to get green-lighted?

Anyway, the problem is RPGs are so limited on the PC but plentiful on consoles. I loved Diablo I and II. They were perfect. Diablo III to me wasn't as fun and didn't live up to the hype.

Now, as I understand it, the "GT3" graphics from Haswell processors are going to go into everything mobile (MacBook Air, Pro, and mini). They will be an improvement over Ivy Bridge and an obvious improvement over Sandy Bridge.

I will probably need a quad-core model though, since I am getting the Apple SSD. I know it's cheaper to do it yourself, but I doubt they're offering one on the Haswell dual-core entry-level model.
post #351 of 1506
Quote:
Originally Posted by Winter View Post


I am going to slightly disagree, only because there are a lot of popular things people like that I personally don't like, from TV shows to video games to movies. Why did Jersey Shore or Honey Boo Boo have to get green-lighted?

Cheap to produce, plus a lot of people like trash even if they won't admit to it. I think some people just watch those things out of shock. They can't believe the show would be that bad, so they watch it to find out. Following people around with cameras is cheap compared to building sets and budgeting for VFX.

post #352 of 1506

These processors you are referring to have more of an advantage when playing these games you mention. I have Sandy Bridge now; when Haswell comes out, will the difference actually be that tremendous?
 

post #353 of 1506
Thread Starter 
Quote:
Originally Posted by marvfox View Post

These processors you are referring to have more of an advantage when playing these games you mention. I have Sandy Bridge now; when Haswell comes out, will the difference actually be that tremendous?

 

The processor speed might not be, but the graphics will be. Also, I am more than likely looking to step up from a Sandy Bridge dual-core to a Haswell quad-core, so I will see a huge difference.
post #354 of 1506

When is this new revelation coming out? Graphics really only matter if you play games a lot, I would think.
 

post #355 of 1506
Thread Starter 
I actually do play quite a few games (not the latest, but a few), plus I'll be ready to recycle my current Mac mini once the new one hits later this year. One year is too short, two years is just right... your mileage may vary, say, if you bought a $2,000+ MBP, iMac, or Mac Pro, but for a Mini it's perfect, if of course you choose to do so.
post #356 of 1506
Thread Starter 
1 TB Fusion Drive being added to the base model mini as a BTO option? Yes? No?

You have to figure that they'll at least upgrade from a 500 GB HDD to a 1 TB HDD at 5,400 rpm, even if that option weren't added, right?

Do they upgrade the SSD options by the time the next one comes out? Maybe a 1 TB Fusion Drive, 256 GB SSD, and 512 GB SSD for the one model, and for the server you get two 1 TB HDDs, with 2x256 GB and 2x512 GB SSDs as options?
post #357 of 1506
Originally Posted by Winter View Post
1 TB Fusion Drive being added to the base model mini as a BTO option? Yes? No?

 

They did it for the current iMac just a few weeks ago. I imagine even this model will get it.

post #358 of 1506
Thread Starter 
Quote:
Originally Posted by Tallest Skil View Post

They did it for the current iMac just a few weeks ago. I imagine even this model will get it.

Do you think they will upgrade to a normal 1 TB HDD standard for the base model or is that asking too much?
post #359 of 1506
Originally Posted by Winter View Post
Do you think they will upgrade to a normal 1 TB HDD standard for the base model or is that asking too much?


Oh, in the next model? Depends. I don't know if we'll get Haswell desktops before laptops (we certainly didn't with Ivy Bridge, despite the chips being available first), but looking at the low-end MacBook Pro and the low-end Mac Mini for comparison is a relatively good measure, I guess.

 

If the Haswell 13" MacBook Pro comes with a 750 by default (note the high end has a 750 now), then I imagine the Mac Mini would get a 750, as well. Then again, neither MacBook Pro size has a 1TB by default, but the Mac Mini does. 

 

I wouldn't get my hopes up for 1TB, but a capacity increase isn't totally out of the cards.

post #360 of 1506
Thread Starter 
750 GB I could see happening, and I think it's long overdue. Intel HD 4600 across the board? A single 512 GB SSD as a possibility?