
Leaked document reveals Intel 'Haswell' chips potentially bound for 2013 iMacs - Page 2

post #41 of 92
Quote:
Originally Posted by PhilBoogie View Post

Quote:
Originally Posted by ecs View Post

I'd buy such Mini for using the 4core i7 for compiling source code, but I'm not going to put my money on it until they upgrade it to some GPU that won't become obsolete next year.

Compiling source code on a Mini? Really, is that remotely feasible? Or the Mini's that powerful, or has compiling come down to a simple task? I used to support developers of ERP software and always bought the fastest PC's I could. Maxed out RAM, CPU, often over $6k. Okay, this was 1998, so has the Mini caught up on this over the years?

Thanks!

Hi Phil,

My mid-2010 8GB quad-core i7 iMac compiled Blender (3D animation app) for OS X/Cocoa and associated libs in around an hour using the -j8 switch passed to make. I don't know how the latest Mini compares hardware-wise other than having a different GPU setup, but this might give you a data point. The source was from GitHub, and the tools were DarwinPorts and Xcode 4.x. Hope that helps.
post #42 of 92
Quote:
Originally Posted by PhilBoogie View Post

Makes me wonder if I ever needed a Mac Pro in order to run Aperture. I don't even photoshop my photos, just don't like to wait on my computer to get anything done; I like instant results. But if compiling on a Mini is feasible I will definitely reconsider when my MP dies.
The video card might be the least of my worries, I don't play games so don't need below zero ms response time. Just SSD, a 30" screen.

Back then, yes; as time goes on the answer will vary, but it most likely points towards no. The Mini keeps taking big strides.
post #43 of 92
Quote:
Originally Posted by PhilBoogie View Post


Makes me wonder if I ever needed a Mac Pro in order to run Aperture. I don't even photoshop my photos, just don't like to wait on my computer to get anything done; I like instant results. But if compiling on a Mini is feasible I will definitely reconsider when my MP dies.
The video card might be the least of my worries, I don't play games so don't need below zero ms response time. Just SSD, a 30" screen.


The biggest Achilles' heel for me at this point is the gpu. That's an area of heavy requirement growth, as it allows for different workflows. Consider animation, where even companies with large workstations available still use low- or medium-resolution meshes without full textures as proxy rigs to drive the real thing. Games will still use all available gpu power. There are certainly use cases where it's not really appropriate yet. If the Intel gpus got to a point of good enough, upgrading every year with a Mini could become a viable strategy for a wider range of workflows. You can generally reuse ram through a tick/tock cycle without much of a performance penalty.

The speculation over eGPUs and node-based hardware from some people on here still shows a lack of comprehension. It's an expensive feature for the lower-end machines, and on the higher-end machines it would pair a somewhat constrained throughput path with an expensive base system. If someone is spending the money on a 27" iMac, they might be disappointed by the improvement offered by an external gpu setup. The 680MX still falls significantly short of anything midrange and above among desktop variants of the same generation, yet you're still paying for that. Adding even a GTX 670, with the cost of the housing, throttled over an x4 connection, may not produce enough of a gain to be worthwhile. LGA1155 is also fairly lane-constrained, so I'm not sure whether you really receive a full 16 lanes of bandwidth.
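To put rough numbers on that x4-vs-x16 concern, here's a back-of-the-envelope sketch in Python. It assumes PCIe 2.0 at roughly 500 MB/s usable per lane; real-world throughput is lower due to protocol overhead, and Thunderbolt's own cap makes it worse still, so treat these as ceilings, not measurements:

```python
# Back-of-the-envelope PCIe bandwidth for an external GPU enclosure.
# Assumes PCIe 2.0 (~500 MB/s usable per lane); actual throughput is
# lower, and Thunderbolt adds its own cap on top of this.

PCIE2_PER_LANE_MB_S = 500  # approximate usable bandwidth per PCIe 2.0 lane

def link_bandwidth(lanes, per_lane=PCIE2_PER_LANE_MB_S):
    """Theoretical one-way bandwidth of a PCIe link in MB/s."""
    return lanes * per_lane

x16 = link_bandwidth(16)  # a typical internal desktop slot
x4 = link_bandwidth(4)    # what an external enclosure might see

print(f"x16: {x16} MB/s, x4: {x4} MB/s ({x4 / x16:.0%} of x16)")
```

Whether that quarter of the bandwidth actually hurts depends on the workload: games that stream textures constantly feel it far more than compute jobs that load data once.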

 

The other thing on gpus is I have to wonder how AMD and NVidia will hold up. I know NVidia tries to deal with high development costs via their high volume products. The onward march of integrated graphics may break that business model, even if NVidia cuts out their graphics card manufacturing partners.

post #44 of 92
Quote:
Originally Posted by Winter View Post

Quote:
Originally Posted by PhilBoogie View Post

Makes me wonder if I ever needed a Mac Pro in order to run Aperture. I don't even photoshop my photos, just don't like to wait on my computer to get anything done; I like instant results. But if compiling on a Mini is feasible I will definitely reconsider when my MP dies.
The video card might be the least of my worries, I don't play games so don't need below zero ms response time. Just SSD, a 30" screen.

Back then yes, though as time goes on it will vary but most likely point towards no. The mini keeps taking big strides.

Welp, if my MP dies or becomes 'too slow' I will take a peek at a Mini, in that case. Maybe need to bring my .aplibrary to the store, but I think they'll allow it (if I first tell them I'm looking for a Mac Pro - lol)
"See her this weekend. You hit it off, come Turkey Day, maybe you can stuff her."
- Roger Sterling
post #45 of 92
Yeah for now, your Mac Pro is better than the Mini but in 2013 or 2014, start to take a look.
post #46 of 92
Quote:
Originally Posted by Winter View Post

Yeah for now, your Mac Pro is better than the Mini but in 2013 or 2014, start to take a look.

The machines of 2015 will still have to be judged by the user and software needs of the time. The need for a Mac Pro will be no less then than it is now; software needs and user leveraging continue to increase at a steady rate. Think about the days of DOS, or even the Mac Plus: people would go gaga over the newest processor leaks, fully believing that it would be the ultimate machine.

Personally I don't think the software industry has even gotten started yet. I foresee a day when something like the Siri AI is built into every Mac OS, for one. Well, actually far more advanced than today's Siri. Think about true AI built into things like Pages that corrects your documents and actually works with you to produce a document. The engineer in me imagines far more advanced CAD/CAM software that works in realtime. There is just so much that can be done, or done better, with more powerful hardware that I just don't see the need for the Pro ever going away.
post #47 of 92
The mere thought of that makes me want that so much.
post #48 of 92
Quote:
Originally Posted by wizard69 View Post

 The engineer in me imagines far more advanced CAD/CAM software that works in realtime. There is just so much that can be done or done better with more powerful hardware I just don't see the need for the Pro ever going away.

 

This is the kind of thing I've mentioned before. What we do on our machines is heavily influenced by the level of technology at the time, where big strides tend to change the way we use software rather than just allowing it to run faster. That's a big reason I continually see OpenCL implemented in various programs: it brings a lot of tedious functions to real time.

 

 

 

Quote:
The need for a MacPro will be no less then than now, software needs and user leveraging continues to increase at a steady rate.

That has been a problem for the mac pro. It hasn't maintained steady alignment with some of its past popular markets. Assuming a static price target, the gains haven't always been there on an annual basis. Apple would need to find ways to grab new customers, presumably from Windows or Linux (yes Linux ) systems.


Edited by hmm - 12/15/12 at 10:22pm
post #49 of 92
Quote:
Originally Posted by hmm View Post

Quote:
Originally Posted by wizard69 View Post

 The engineer in me imagines far more advanced CAD/CAM software that works in realtime. There is just so much that can be done or done better with more powerful hardware I just don't see the need for the Pro ever going away.

This is the kind of thing I've mentioned before. What we do on our machines is heavily influenced by the level of technology at the time where big strides tend to change the way we use software rather than just allow it to run faster. That's a big reason I continually see OpenCL implemented in various programs. It brings a lot of tedious functions to real time.
If you've been around long enough, you can see how more powerful hardware has dramatically changed what software is capable of. Just take a look at CAD: the industry started out with real clunky 2D, progressed to 3D, then to solid modeling, and now we are starting to see systems model whole mechanisms, all of this on the desktop. Each step has changed industry a bit, the biggest change being making such capability available to the designer at his desk and reducing the need for that ivory tower of limited access.
Quote:

Quote:
The need for a MacPro will be no less then than now, software needs and user leveraging continues to increase at a steady rate.
That has been a problem for the mac pro. It hasn't maintained steady alignment with some of its past popular markets. Assuming a static price target, the gains haven't always been there on an annual basis. Apple would need to find ways to grab new customers, presumably from Windows or Linux (yes Linux ) systems.
In some cases that has been Intel's fault and in others Apple's. The static price target is perhaps Apple's biggest mistake with the Pro. The introductory machine is a terrible value now, and that rests squarely on Apple's shoulders. In a way the Mac Pro is like a fish that has been beached: it has life enough to flop around a bit, but it really needs to get back to its native environment. Like the water that supports many fish, Apple needs a rational school of Mac Pros to cover the gamut of user needs. Each member of that school needs to support the other members to keep the school itself solid.
post #50 of 92
Quote:
Originally Posted by wizard69 View Post


If one has been around long enough you can see how more powerful hardware has dramatically changed what software is capable of. Just take a look at CAD, the industry started out with real clunky 2D, progressed to 3D, then to solid modeling and now we are starting to see systems model whole mechanisms. All of this on the desktop. Each step has changed industry a bit. The biggest change being making such capability available to the designer at his desk and reducing the need for that ivory tower of limited access.

 

Yeah the goalposts move. With computing it's not just a matter of speed but one of what works at the time. Some of that software can still place a pretty extreme load on gpus with some of the shaders used to check continuity.

 

 

Quote:

In some cases that has been Intel fault and in other Apples. The static price target is perhaps Apples biggest mistake with the Pro. The introductory machine is a terrible value now and that rests squarely on Apples shoulders. In a way the Mac Pro is like a fish that has been beached. It has life to flop around a bit but it really needs to get back to its native environment. Like the water, that supports many fish, Apple needs a rational school of Mac Pros to cover the gamut of user needs. Each member of that school needs to support the other members to keep the school itself solid.

 

The single-cpu machine was considered a poor value in 2009, and its evolution since then has obviously been minimal. Workstations on the Windows side are somewhat similar on the gpu end, as you don't have a constant influx of new workstation gpus: Kepler options are just starting to trickle out, and the Quadro 4000 from 2010 or so is still sold today. Typically you can get into a minimum of a 6-core machine with a very nice gpu on the Windows side for the cost of the one remaining Nehalem configuration. It tapers off somewhat toward the higher end, as dual-package stations carry higher costs and margins. The raw benchmarks don't really tell everything: Sandy Bridge E brought new instruction sets, support for SATA III, and some moderate gains even in software that doesn't leverage the new instructions. With the possible inclusion of usb3 (even without native support) and a current-generation gpu, it would have breathed some life into the line. I don't know how Intel is going to handle the continued perceived misalignment of chip architectures. They can't skip Ivy Bridge: their multi-processor server products are currently stuck on Westmere, they produce the EX variants on die-shrink years, and they wouldn't push a high-margin product like that out even further.


They may try to play catch up later, but I don't know exactly what they'll do from that point on.

post #51 of 92
Quote:
Originally Posted by PhilBoogie View Post

Makes me wonder if I ever needed a Mac Pro in order to run Aperture. I don't even photoshop my photos, just don't like to wait on my computer to get anything done; I like instant results. But if compiling on a Mini is feasible I will definitely reconsider when my MP dies.
The video card might be the least of my worries, I don't play games so don't need below zero ms response time. Just SSD, a 30" screen.

What was required a few years ago and what is required today are different animals. Today's Mac Mini is more powerful than the early Power Macs, and even the first Mac Pro. Unless the requirements for compiling have risen terrifically, a Mini should be fine. The same is true for Aperture. And unless you're working on multi-100-meg files in Photoshop with lots of layers and live compositing, it should work fine there as well with 16GB RAM.
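For a rough sense of what large layered files mean for RAM, here's a back-of-the-envelope sketch; the image dimensions, bit depth, and layer count are purely illustrative, and real editors add history states and caches on top:

```python
# Rough memory footprint of a layered image in a 2D editor.
# Illustrative numbers: a 6000x4000 image, 4 channels (RGBA),
# 16 bits per channel, every layer holding full-size pixel data.

def image_bytes(width, height, channels=4, bits_per_channel=16):
    """Uncompressed size of one full-resolution layer in bytes."""
    return width * height * channels * (bits_per_channel // 8)

one_layer = image_bytes(6000, 4000)   # ~183 MiB per layer
twenty_layers = 20 * one_layer        # ~3.6 GiB before history/caches

print(f"one layer: {one_layer / 2**20:.0f} MiB")
print(f"20 layers: {twenty_layers / 2**30:.1f} GiB")
```

So even a heavy layered composite sits comfortably inside 16GB; it's live compositing and undo history that push past it.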
post #52 of 92
http://twimages.vr-zone.net/2012/12/haswell_mobile_core-i7.png

I am kind of disappointed about the slight decrease in graphics base MHz, but I think that is because I don't understand Haswell even with the AnandTech article, and it'll end up being far better for a reason.
post #53 of 92
Originally Posted by Winter View Post
I am kind of disappointed about the slight decrease in graphics base MHz…

 

If anything it's the same thing that happened to processors: clock speed became next to meaningless in the face of cores.

Originally Posted by asdasd

This is Appleinsider. It's all there for you but we can't do it for you.
post #54 of 92
Makes sense. I know it seems extraneous, though I can't wait for a Haswell mini.
post #55 of 92
Compiling is one of those things that gets almost linear speed-ups as you add cores. Faster, more capable computers just result in interpreter-like design cycles if the machine is well matched to the code base. I think we are a very long way from having systems that are fast enough for every programming task at hand.
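A toy model of why `make -jN` scales almost linearly: translation units compile independently of each other, so only the final link step is serial. The file counts and per-file times below are made up for illustration:

```python
import math

# Toy model of a parallel build: `files` independent compiles at
# `secs_per_file` each, run `cores` at a time, plus one serial link.

def build_time(files, secs_per_file, link_secs, cores):
    """Estimated wall-clock build time with `cores` parallel compile jobs."""
    waves = math.ceil(files / cores)  # batches of simultaneous compiles
    return waves * secs_per_file + link_secs

for cores in (1, 2, 4, 8):
    t = build_time(400, 3.0, 30.0, cores)
    print(f"{cores} cores: ~{t / 60:.1f} min")
```

The compile phase halves with each doubling of cores; only the fixed link step keeps the speed-up from being perfectly linear, which is why quad-core machines feel so much faster for this work.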
Quote:
Originally Posted by melgross View Post

Quote:
Originally Posted by PhilBoogie View Post

Makes me wonder if I ever needed a Mac Pro in order to run Aperture. I don't even photoshop my photos, just don't like to wait on my computer to get anything done; I like instant results. But if compiling on a Mini is feasible I will definitely reconsider when my MP dies.
The video card might be the least of my worries, I don't play games so don't need below zero ms response time. Just SSD, a 30" screen.

What was required a few years ago, and what is required today are different animals.
This is very true, but realize that software isn't standing still. You really can't predict what tomorrow's hot software product will require hardware-wise.
Quote:
Today's Mac Mini is more powerful that the early PowerMacs, and the first Mac Pro. Unless the requirements for compiling have risen terrifically, a Mini should be fine. Same thing is true for Aperture. And unless you're working on multi 100 meg files in Photoshop with lots do layers and live compositing, it should work fine there as well with 16GB RAM.
Well, here I have to disagree just a bit. Some people are just more demanding than others and would expect the additional performance more cores could bring. There is no doubt that a Mini can be serviceable with Aperture, but many would laugh at the idea of any computer used for photographic work coming without a discrete GPU. The day may come when a discrete GPU might be dismissed as no longer needed, but today many pros would argue we are far from that day.
post #56 of 92
Quote:
Originally Posted by Tallest Skil View Post

Quote:
Originally Posted by Winter View Post

I am kind of disappointed about the slight decrease in graphics base MHz…

If anything it's the same thing that happened to processors: clock speed became next to meaningless in the face of cores.
Clock speed isn't meaningless, but it is just one element in determining a machine's performance. Many have not noticed, but we are now hitting 4GHz in the CPU cores of modern processors. That isn't peddled nowadays like in the past, but you cannot dismiss the fact that 1.5 extra GHz does impact machine performance. Sure, cores help a great deal, as modern machines have many threads and processes running at the same time; I just have to object to the idea that clock speed isn't an important factor here. If for nothing else, many apps are, and likely will remain, single-threaded or at least unable to leverage threads heavily.

Don't misunderstand me, I'd be the first to look for a new machine with four or more cores, as I know cores suit my usage patterns well. However, if I have a choice between four cores running at 1.2GHz and four cores running at 3.2GHz, I'm going with the 3.2GHz machine, all other things being equal.
post #57 of 92
Quote:
Originally Posted by Winter View Post

Makes sense. I know it seems extraneous, though I can't wait for a Haswell mini.
I may try to hold out for a Haswell Mini myself, the hope being that the GPUs in the chip are a bit less than total crap. As it is, a four-core machine with lots of RAM should do pretty well these days; it is just the lack of respectable 3D that has me concerned about the current Minis.
post #58 of 92
Quote:
Originally Posted by wizard69 View Post
There is no doubt that a Mini can be serviceable with Aperture but many would laugh at the idea of any computer used for photographic work to come without a discrete GPU. The day may come when a discrete GPU might be dismissed as no longer needed but today many Pros would argue we are far from that day.

This is not really true. Dealing with 2D raster images isn't a problem for any modern gpu. Most of them don't scale that well with OpenGL, and OpenCL is typically applied to a limited number of functions. Apple and Adobe have both placed it where they can, but a mini with a nice display attached could arguably be superior to an imac there. The mini gets a single thunderbolt port, but usb3 storage options can take care of that. The gpu matters a lot more when you're dealing with 3D stuff such as reprojecting images or stitching spherical images, and most of the 2D image editors have really fallen behind there. With things like Creative Suite, the requirements are based more around video ram and OpenGL version. Intel should be caught up in that regard by the next revision. If it ships with decent drivers, it would satisfy a lot of professionals in that area. In fact the biggest bottleneck would probably become ram: Creative Suite is extremely ram hungry, especially when dealing with higher bit depths in either stills or footage.

 

Quote:
Originally Posted by wizard69 View Post

I may try to hold for a Haswell Mini myself. The hope being that the GPUs are a bit less than total crap in the chip. As it is though a four core machine with lots of RAM should do pretty good these days, it is just the lack of respectable 3D that has me concerned about the current Minis.

 

As much as I don't consider the mini a great value, it is slowly becoming a better option for a wider range of workloads. That said I'm still disappointed in Apple's workstation offerings. On Windows a decent 6 core (E5-1650) Sandy Bridge E model with SATA III, usb3, and a mid range workstation card like a Quadro 4000 can be found around the price of the base mac pro. I checked several brands. It's mainly when you go to the specialty vendors and "purpose built" hardware that it gets really expensive.

post #59 of 92
Quote:
Originally Posted by hmm View Post

Quote:
Originally Posted by wizard69 View Post

There is no doubt that a Mini can be serviceable with Aperture but many would laugh at the idea of any computer used for photographic work to come without a discrete GPU. The day may come when a discrete GPU might be dismissed as no longer needed but today many Pros would argue we are far from that day.
This is not really true. Dealing with 2D raster images isn't a problem for any modern gpu.
True to an extent, but some filters do get GPU acceleration. In any event, I don't like the idea of treating professionals as only photographers, so in that regard I probably shouldn't have even used photographers as an example. Professionals by definition are a broader group of people, and in that context more professionals still have a rational need for a GPU.
Quote:

Most of them don't scale that well with OpenGL. OpenCL is typically applied to a limited number of functions.
This I know; I still have a hard time explaining to people how problem-specific GPU computation is. GPUs, in the sense of computation, have to fit the problem at hand, otherwise there is little benefit.
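A minimal illustration of "fitting the problem": a per-pixel operation parallelizes trivially because each output depends only on its own input, while a running total has a serial dependency chain that extra cores or GPU threads can't naively speed up. Plain Python stands in for the GPU kernels here:

```python
# A per-pixel filter is embarrassingly parallel: every output depends
# only on its own input, so thousands of GPU threads can run at once.
def brighten(pixels, amount):
    return [min(255, p + amount) for p in pixels]  # each element independent

# A running total is inherently serial: element i needs element i-1,
# so this shape of problem gains little from a GPU.
def running_total(values):
    out, acc = [], 0
    for v in values:
        acc += v
        out.append(acc)
    return out

print(brighten([10, 250, 100], 20))   # -> [30, 255, 120]
print(running_total([1, 2, 3, 4]))    # -> [1, 3, 6, 10]
```

This is why OpenCL shows up in filters and color operations first: those are the map-shaped functions that actually match the hardware.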
Quote:
Apple and Adobe have both placed it where they can, but a mini with a nice display attached could be arguably superior to an imac there. The mini gets a single thunderbolt port, but usb3 storage options can take care of that. The gpu matters a lot more when you're dealing with 3D stuff such as reprojecting images or stitching spherical images.
It isn't just 3D; you benefit from a discrete GPU whenever it is faster than integrated.
Quote:
Most of the 2D image editors have really fallen behind there. With things like Creative Suite, their requirements are heavily based around video ram more and OpenGL version.  Intel should be caught up in that regard by the next revision. If it ships with decent drivers, it would satisfy a lot of professionals in that area. In fact the biggest bottleneck would probably become ram. Creative Suite is extremely ram hungry, especially when dealing with higher bit depths in either stills or footage.
Quote:
Originally Posted by wizard69 View Post

I may try to hold for a Haswell Mini myself. The hope being that the GPUs are a bit less than total crap in the chip. As it is though a four core machine with lots of RAM should do pretty good these days, it is just the lack of respectable 3D that has me concerned about the current Minis.

As much as I don't consider the mini a great value, it is slowly becoming a better option for a wider range of workloads.
It is certainly something of mixed value. The frustration is that Apple could make it into a far more interesting machine.
Quote:
That said I'm still disappointed in Apple's workstation offerings.
Disappointed is putting it mildly. I've been waiting at least five years to see Apple rationalize their desktop lineup. In contrast to their laptops the desktops are all around terrible offerings.
Quote:
On Windows a decent 6 core (E5-1650) Sandy Bridge E model with SATA III, usb3, and a mid range workstation card like a Quadro 4000 can be found around the price of the base mac pro. I checked several brands. It's mainly when you go to the specialty vendors and "purpose built" hardware that it gets really expensive.
Yep. The hilarious thing is that the Mac Pro was never really a Pro machine, in my estimation. At least not since it stopped being the machine bleeding-edge tech was introduced on.
post #60 of 92
Quote:
Originally Posted by wizard69 View Post


True to an extent but some filter do get GPU acceleration. In any event I don't like the idea of professionals as only photographers so in that regard I probably shouldn't have even used photographers as an example. Professionals by definition is a broader group of people, in that context more professionals still have rational need for a GPU.
 

I agree there. If it was Windows I'd just suggest they go for a nice i7, lots of ram, and invest in a quality display to minimize viewing headaches. I extrapolated to graphic designers and light video editing there. A good gpu is ideal. It's just the functions are limited enough to where a mini + nice external display could be an alternative to something like a 27" imac. If haswell/broadwell improve upon this, it could be quite viable for many media based workflows. There's a big jump in price if you outgrow such a machine.

 

 

Quote:
This I know, I still have a hard time explaining to people how specific GPU computation is. GPUs in the sense of computation have to fit the problem at hand otherwise there is little benefit.

 

OpenCL 1.2 and the latest CUDA implementations do support wider ranges of API calls. It'll just take time; developers typically don't go back and rewrite old workable code, and they're likely to start with the newest features. Note how Adobe went with CUDA for their After Effects raytracer. They've been starting to move toward OpenCL, but I don't think it was ready when they started development.

 

Quote:
It isn't just 3D, you benefit from a GPU whenever it is faster than integrated.
It is certainly something of mixed value. The frustration is that Apple could make it into a far more interesting machine.

I'm probably going in circles here, but with Apple their solutions tend to work extremely well if your requirements are fairly generic. Once you go a bit outside of their design paradigm, things become more difficult. The desire for eGPUs is one area where I feel people are misguided. The products would require both high margins and minimum sales volumes to be viable, especially with potential firmware tweaks and things required to deal with it over an external cable. It would need to be possible to plug one in when the machine is running or recognize the unit if it's accidentally unplugged and must be plugged in again. By the time you're done with it, it would be $600 for a mid range gpu. I don't think Mac users alone would carry that. I could definitely see things headed toward integrated. Right now NVidia is extremely reliant on volume sales to overcome chip fabrication costs on their Quadros, Teslas, and high end gaming cards. They've started to cut out development partners, but they're still having trouble.

 

Quote:
Disappointed is putting it mildly. I've been waiting at least five years to see Apple rationalize their desktop lineup. In contrast to their laptops the desktops are all around terrible offerings.
Yep. The hilarious thing is that the Mac Pro was never really a Pro machine in my estimation. At least not since it was no longer the machine bleeding edge tech was introduced on.

There are a lot of points where I'm confused by their offerings. The Mac Pro is sufficient in a lot of ways. Where I think it's somewhat messed up is on the low end. They employ a lot of cost-cutting measures, such as the daughterboard design, so that they can use a single backplane type without absorbing the cost of a dual-cpu chipset on the single model. The external case design is ancient, and those costs were likely recovered long ago. If it's languishing on volume early in a product cycle, the issue is the price of the single-cpu configurations, which are what would carry any kind of volume for the line.

post #61 of 92

You know.  When Apple really wants to do something...they do it.

 

Hell, they did Maps.  I can't think of anything I'd be less interested in as a Mac user.  (...I used Apple Maps to find a restaurant, or get close to it, the other night. Worked just fine.)

 

I never thought they'd do an mp3 player or a phone.  But there's loads of money there by an order of magnitude over a piddling workstation market.  Otherwise, there's plenty they could have done to drive unit sales.

 

The Pro wasn't always overpriced.  The G3 Blue and White tower was much more affordable.  In the Bondi and candy era of the iMac you couldn't stop Apple updating the iMac with various colours and evolutionary specs.

 

It comes to something when they can't be 'arsed' to upgrade the pro.  That 'joke' of a 'new' (ye-ah...) update earlier this year said it all.

 

But hey, iMac owners had been waiting nearly as long in some way and ivy bridge has been out a while.  

 

The mini was the same.  People thought it was dead...and hey, some people are comparing it to the out of date pro.

 

Be nice if Apple offered decent gpus with some of their machines.  Intel HD 4000 crappics on the Mini?  Really, Apple?  Maybe integrated crappics will be less of an apology when Haswell's integrated GPU comes along.

 

If you want to play the odd game convincingly, there's nothing I can see under £1095.  If they could bundle a 680MX with the top-end Mini for £795-995, it would make the Mini a far more compelling desktop machine.

 

Random rants.

 

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

post #62 of 92
Oh, I would love that, and I think many would, though you would have that absolutely HIDEOUS power brick, and who would ever want that? :P

But seriously, for right now... the only game I want to play on my Mac mini is Gauntlet: Dark Legacy the arcade version through MAME.

I somehow feel even the base iMac could handle it but it would actually be cheaper to rent the arcade machine itself for a few days than to buy an iMac.

In addition, it would have been cool to play Diablo III at decent frame rates (which I know the discrete 2011 Mini can manage), but Diablo III was nothing like playing Diablo I or II.

I am sure discrete graphics were discussed, but I wonder how well 256 or 512 MB of, say, the Nvidia GeForce 640M LE would have worked with a quad-core processor?
post #63 of 92
Quote:
Originally Posted by wizard69 View Post

Compiling is one of those things that gets almost linear speed ups as you add cores. Faster more capable computers just result in interpreter like design cycles if the machine is well matched to the code base. I think we are a very long way from having systems that are fast enough for every programming task at hand.
This is very true but realize that software isn't standing still. You really can't predict what tomorrows hot software product will require hardware wise.
Well here I have to disagree just a bit. Some people are just more demanding than others and would expect the additional performance more cores could bring. There is no doubt that a Mini can be serviceable with Aperture but many would laugh at the idea of any computer used for photographic work to come without a discrete GPU. The day may come when a discrete GPU might be dismissed as no longer needed but today many Pros would argue we are far from that day.

It's a matter of thinking logically when considering a purchase. Of course, not all machines are ideal for all work, and I'm certainly not saying that. This is why Apple and other manufacturers have different line-ups. But if someone is a developer, for example, and is writing software that isn't a Photoshop but rather something lighter, as many, if not most, developers are, then a well-equipped Mini would serve well. If more power is needed, then another machine would be better, all the way up to a 12-core Mac Pro.
post #64 of 92
Has anybody noticed how badly the site sucks in the iPad mobile version? Not to sidetrack the thread, but I just reverted to the desktop version.
Quote:
Originally Posted by hmm View Post

I agree there. If it was Windows I'd just suggest they go for a nice i7, lots of ram, and invest in a quality display to minimize viewing headaches. I extrapolated to graphic designers and light video editing there. A good gpu is ideal. It's just the functions are limited enough to where a mini + nice external display could be an alternative to something like a 27" imac. If haswell/broadwell improve upon this, it could be quite viable for many media based workflows. There's a big jump in price if you outgrow such a machine.
Do understand I was pretty close to set to buy an Ivy Bridge Mini with a discrete GPU. Unfortunately Apple screwed us again. The frustration here is that you can't count on Apple to do the right thing; they seem to be obsessed with milking designs for far longer than they should. In this regard, if the Mini's power supply can't handle the addition of a GPU, it is time for an upgrade. Fatter power supply or fatter box, I don't care; the fact is Apple has painted themselves into a corner with the Mini (all desktops, really), where they offer a bunch of models at varying costs that don't offer much for those price increases.

As to Haswell, it does have the potential to relieve us of the desire for a discrete GPU in the Mini. Of course, that assumes Apple implements a model that actually offers viable performance.
Quote:


OpenCL 1.2 and the latest CUDA implementations do support wider ranges of API calls. It'll just take time. Developers typically don't go back and rewrite old workable code most of the time. They're likely to start with the newest features. Note how Adobe went with CUDA on their After Effects raytracer. They've been starting to go toward OpenCL, but I don't think it was ready when they started development.
OpenCL is very interesting, but it does require GPU support in many cases to deliver 1.2 functionality. Apple certainly got the ball rolling, and at this point OpenCL is the industry standard, something Apple doesn't get credit for. OpenCL or not, GPU acceleration is slipping into more and more software and components of the operating system. This is why it is a bit frustrating when people equate advocating a good GPU with gaming; GPUs aren't just about gaming anymore.

As a side note I have to wonder if Apple and the OpenCL group are working on a C++11 version of the standard.
Quote:

I'm probably going in circles here, but with Apple their solutions tend to work extremely well if your requirements are fairly generic. Once you go a bit outside of their design paradigm, things become more difficult. The desire for eGPUs is one area where I feel people are misguided. The products would require both high margins and minimum sales volumes to be viable, especially with potential firmware tweaks and things required to deal with it over an external cable. It would need to be possible to plug one in when the machine is running or recognize the unit if it's accidentally unplugged and must be plugged in again. By the time you're done with it, it would be $600 for a mid range gpu. I don't think Mac users alone would carry that. I could definitely see things headed toward integrated. Right now NVidia is extremely reliant on volume sales to overcome chip fabrication costs on their Quadros, Teslas, and high end gaming cards. They've started to cut out development partners, but they're still having trouble.
Yep, the whole idea of external GPUs is a pipe dream. At least with today's TB standard it is; the port isn't fast enough to bother with, considering how fast integrated GPUs are scaling. Add in the issues you point out and I doubt we will ever see a viable external TB-based GPU solution.
Quote:
There are a lot of points where I'm confused by their offerings. The Mac Pro is sufficient in a lot of ways. Where I think it's somewhat messed up is on the low end. They employ a lot of cost cutting measures such as the daughterboard design, so that they can use a single backplane type without absorbing the cost of a dual cpu chipset on the single model. The external case design is ancient. Those costs were likely recovered long ago. If it's languishing on volume early in a product cycle, an issue is price on their single configurations, which would carry any kind of volume for the line.

Confused isn't the word for it. I can't understand why they have stayed with the current failed lineup of desktop hardware for so long. In the US the only machine showing positive growth in sales has been the iMac, and that is in no small part due to going for broke design-wise at each model node. Let's face it, if the Mini had evolved over time like the Air has, it would be far more desirable.

I suspect that many on the management team see the lagging sales of the Mini and Pro as a trend in the industry toward laptops. There is no doubt that laptops sell well, but I bought my last laptop from Apple because there was no real alternative in the desktop line. That pretty much says it all: Apple's desktop lineup sucks value-wise.
post #65 of 92
Quote:
Originally Posted by Lemon Bon Bon. View Post

You know.  When Apple really wants to do something...they do it.

Hell, they did Maps.  I can't think of anything I'd be less interested in as a Mac user.  (...I used Apple Maps to find a restaurant, or get close to it, the other night.  Worked just fine.)
Maps works fine for me also. You have to wonder what is wrong with the complainers.
Quote:
I never thought they'd do an mp3 player or a phone.  But there's loads of money there by an order of magnitude over a piddling workstation market.  Otherwise, there's plenty they could have done to drive unit sales.

The Pro wasn't always over priced.  The g3 Tower/blue and white was much more affordable.  In the bondi and candy era of imac you couldn't stop Apple updating the iMac with various colours and evolutionary spec.
In a nutshell, this pricing issue is what really killed the Mac Pro. It drove sales down to the point that only pros with significant need would buy one, leaving the rest to look at Windows or Linux machines. Of course the value in Windows or Linux platforms depends on what type of pro you are. However, the dark side becomes very appealing when the value built into Mac Pros becomes nonexistent. I mean really, why is the Mac Pro being sold right now with what will soon be a 4-year-old GPU card? It boggles the mind.
Quote:
It comes to something when they can't be 'arsed' to upgrade the pro.  That 'joke' of a 'new' (ye-ah...) update earlier this year said it all.
It is hard to call a machine a Pro version if it doesn't contain your bleeding-edge technology like TB, or even USB 3 for that matter. I suspect that update was Apple's way of saying we don't sell enough of these a year to care anymore. I'm beginning to wonder if they even come close to 5000 a quarter anymore.
Quote:
But hey, iMac owners had been waiting nearly as long in some way and ivy bridge has been out a while.  
At least the iMac and Mini come with the latest ports, even if the Mini is a major regression otherwise. At least they said here is your update, even if it is 10 months late.
Quote:
The mini was the same.  People thought it was dead...and hey, some people are comparing it to the out of date pro.
I don't think the Mini is dead myself; sales were definitely stagnant in the US, at least before the last update. Between the Mini and the iMac, that is the bulk of Apple's desktop sales. I'm at the point where I wonder if the Mac Pro gets even close to two percent of overall Mac sales.
Quote:
Be nice if Apple offered decent gpus with some of their machines.  4000 intel crappics on the mini?  Really Apple?  Maybe integrated crappics will be less of an apology when haswell's integrated gpu comes along.
Yeah, Apple, what is up with this regression?

I actually think this is a symptom of a deep-seated problem with Apple's desktop engineering. Instead of offering real value in a model spread, they try too hard to strive for parts commonality across all Minis. I'm not saying the Mini needs to be an overblown powerhouse when it comes to GPU processing. Rather, I'm frustrated that Apple can't seem to see any value in updating the Mini's power supply so that it can drive the extra 15 to 20 watts needed to power a fairly decent GPU to make up for Intel's lackluster 3D and OpenCL support.
Quote:
If you want to play the odd game convincingly, there's nothing I can see under £1095.  If they could bundle a 680 Mx with the top end mini for £795-995 it would make the mini a far more compelling desktop machine.

Random rants.

Lemon Bon Bon.

The GPU issue on the Mini is very, very frustrating. Before the latest update I was really hoping that Apple would have pulled its head out of its ass and offered a real update to the piss-poor configuration of the midrange Mini with the discrete GPU. Instead they deleted the damn thing. I don't know what the motivation was, but clearly no one at Apple understood how big a fail it was to offer that Mini with so little GPU RAM. It is like, my god, folks (Apple management), don't you know what the software running on your machines requires these days? I'm certain some fool at Apple headquarters looked at the sales of that midrange Mini and said nobody wants a Mini with a GPU, when the real problem is nobody wants a Mini with a poorly configured discrete GPU. It is another example of Apple shooting themselves in the foot and then wondering why sales of the Mini have been so lackluster. The problem is clear: each price point in a model lineup needs to demonstrate value or advantages for that step up in price. Sadly, Apple has screwed up considerably here, as two more x86 cores don't make up for a good GPU.

I'm really rambling on here, but I'm so frustrated with Apple's desktop lineup that I just need to vent a bit. There is an old saying that goes something like: if you keep doing the same thing over and over again expecting different results, you will never get anywhere. This is what the desktop lineup feels like; there is zero innovation here, we just seem to be getting the same pathetic configurations over and over again.
post #66 of 92
Quote:
Originally Posted by Winter View Post

>>>>deleted a bunch
I am sure discrete graphics were a discussion, but I wonder how well 256 or 512 MB of, say, an Nvidia GeForce 640M LE would have worked with a quad-core processor?

Hard to say, because Apple didn't implement such a machine. However, the minimal GPU RAM configuration for much of today's gaming library should be 512MB, with 1GB being even more rational. Frankly, I'm extremely perplexed by Apple's moves on the Mini; it really appears as if they don't have a clue as to what potential customers need out of the Mini, at least not in the uprated models. The fact that they seriously offer one model of the Mini as a "server" highlights that marketing has entered the building and all reason has left.
post #67 of 92
I was thinking the best they would be able to do would be the 640M LE, given that the regular 640M was included in the base model iMac.

If you're Apple, wizard... what do you do? Do you kill off the server model and replace it with a discrete model? Do you make one integrated and one discrete?
post #68 of 92
Quote:
Originally Posted by Winter View Post

I was thinking the best they would be able to do would be the 640M LE, given that the regular 640M was included in the base model iMac.
If you're Apple, wizard... what do you do? Do you kill off the server model and replace it with a discrete model? Do you make one integrated and one discrete?

They could have gone with the same one. Either Apple was being cheap on the iMac, or the extra processing on the display significantly increased costs; 21.5" displays tend to be dirt cheap on their own. Regarding the server model, having that moniker there isn't really important. Such a configuration can easily be supported through CTO means. It's just unimportant. If you removed it today, people using it as a home/small business server could still purchase replacements just by opting for a different OS installation and a different hard drive configuration if desired. I can't emphasize enough how much it is solely marketing. For $800, a quad cpu and discrete graphics would have been a better value. People view it as a headless version of the MacBook Pro, yet it doesn't replace the function of a notebook. Desktops have always offered better price-to-performance alignment than notebooks. You buy a notebook either because you take it with you at times or because you can move it room to room effortlessly without shutting down.

Quote:
Originally Posted by wizard69 View Post

Has anybody noticed how badly the site sucks with the iPad mobile version. Not to side track the thread but I just reverted to the desktop version.
Do understand I was pretty close to set to buy an Ivy Bridge Mini with a discrete GPU.

Marvin said the same thing regarding a discrete gpu ivy mini. I've seen this pattern in their moves before: they went with something they considered good enough for one generation while betting on the next being an improvement. I always look up exactly what leverages the gpu in a given application, so that if anyone asks for a suggestion I can actually include machines like the mini in the comparison. If it hit the point of a good enough gpu, a mini with more frequent replacement could make sense for some of the higher-end desktop/xmac type workloads. It's still somewhat IO bottlenecked. If thunderbolt had a reasonably robust array of devices and the machine had a second port of this type, it wouldn't be so bad. USB3 would work for some people. Thunderbolt as it is right now is really going to be tied up by the display, as most quality displays use displayport connections, and not all of them will function correctly at the end of a chain. Some manufacturers publish compliance matrices on a variety of their offerings. Some of those came out a few years ago, thus the DVI and dual link DVI, where such requirements could be met by displayport today. They include mini displayport on their latest models, just (understandably) not thunderbolt.

post #69 of 92
I want to get into specifics though. What would be feasible with the mobile quad-core? 640M LE? 640M? 650M?
post #70 of 92
Quote:
Originally Posted by Winter View Post

I want to get into specifics though. What would be feasible with the mobile quad-core? 640M LE? 640M? 650M?

The 650M is 45W, the 640M is around 30W and the 640M LE 20W. The CPU is 45W. They don't have to allow both to run at full power all the time so they could fit a faster GPU in but the power supply limit is 85W and there needs to be some allowance for bus-powered devices, drives, cooling fan etc. Say 30W for that stuff, leaves you about 55W so you only have about 10W on top of the 45W CPU and ideally you don't want to max it out. Like I say, they can scale the CPU down while the GPU is running but a 650M is probably pushing it. A 640M should be doable.

The Retina MacBook Pro has a 95-watt-hour battery and a 650M, and it lasts 1.5h while gaming, so the parts + display draw about 63W. Given that the Mini has no display, maybe a 650M could go in, but that power still turns into heat and has to be dissipated. If you up the power by 40%, they'd probably have to run the fans pretty fast to cool it down. They could design it differently, separate the two chips more, and draw cool air in from one side and out the other, but I'd say the 640M is as good as it can go.

They probably weighed up the benefit of having a 60% faster GPU against the engineering and cost and decided it wasn't worth adding. Coupled with the fact Intel seem to be getting their act together in the GPU dept now, it makes sense to go the IGP route. Plus, if they make the Mini too good, you'd be less inclined to go for the desktop they want you to buy.
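The budget above can be laid out as a quick back-of-the-envelope calculation. Every wattage figure here is an estimate from this post (the 30W overhead allowance, the 45W CPU TDP, the GPU TDPs), not an official spec:

```python
# Back-of-the-envelope power budget for a hypothetical discrete-GPU
# Mac mini. All wattages are the estimates from the discussion above,
# not Apple or Nvidia specifications.
PSU_W = 85       # Mac mini power supply rating
OVERHEAD_W = 30  # drives, cooling fan, bus-powered devices (estimate)
CPU_TDP_W = 45   # quad-core mobile CPU at full load

budget_after_overhead = PSU_W - OVERHEAD_W        # 55 W for CPU + GPU
gpu_headroom = budget_after_overhead - CPU_TDP_W  # 10 W with CPU maxed

# If the CPU is throttled while the GPU runs flat out, this is what
# each candidate GPU would leave for the CPU:
gpu_tdp = {"650M": 45, "640M": 30, "640M LE": 20}
for name, tdp in gpu_tdp.items():
    print(f"{name}: {budget_after_overhead - tdp} W left for the CPU")

# Cross-check with the Retina MacBook Pro figure: a 95 Wh battery
# lasting 1.5 h of gaming implies the whole system draws about 63 W.
rmbp_draw_w = 95 / 1.5
print(f"rMBP draw while gaming: ~{rmbp_draw_w:.0f} W")
```

A 640M leaves 25W for a throttled CPU, while a 650M leaves only 10W, which lines up with the conclusion that the 640M is about the practical ceiling for the enclosure.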
post #71 of 92
Quote:
Originally Posted by Marvin View Post

The Retina MacBook Pro has a 95-watt-hour battery and a 650M, and it lasts 1.5h while gaming, so the parts + display draw about 63W. Given that the Mini has no display, maybe a 650M could go in, but that power still turns into heat and has to be dissipated. If you up the power by 40%, they'd probably have to run the fans pretty fast to cool it down. They could design it differently, separate the two chips more, and draw cool air in from one side and out the other, but I'd say the 640M is as good as it can go.

I think you were the one who mentioned that the current mini still has its air intake and exhaust in a near-identical spot, with the power supply moved to an internal version. Apple tends to remove things early, even if they're still needed. If it went away this year, I don't see it coming back. The last one was only good for games; for anything gpu-leveraged outside of games, it was memory starved. Today there's still no true replacement for desktop-level graphics. I noted in the other thread that the 680MX is at a disadvantage if you're comparing floating point math to Fermi.
Quote:
They probably weighed up the benefit of having a 60% faster GPU against the engineering and cost and decided it wasn't worth adding. Coupled with the fact Intel seem to be getting their act together in the GPU dept now, it makes sense to go the IGP route. Plus, if they make the Mini too good, you'd be less inclined to go for the desktop they want you to buy.

They're getting to a point where it may be passable assuming decent drivers.

post #72 of 92
Unfortunately, which I brought up in another topic, Apple doesn't exactly make the iMac very desirable to buy either, except for the ultimate model, due to the lack of a VRAM upgrade option.

I agree with you though; Intel is moving in the right direction, and I am excited to see what is in store in the forthcoming year.
post #73 of 92

Kinda upset that both the 21.5's only offer 512MB of video RAM. Should be at least 1GB...

post #74 of 92
Quote:
Originally Posted by Winter View Post

I was thinking the best they would be able to do would be the 640M LE being that the regular 640M was included in the base model iMac.
I don't buy into the idea that anybody looking to buy a Mini would also be interested in the iMac, so it really doesn't matter what GPU is in the Mini. It is pretty apparent Apple doesn't see it that way, thus the constantly castrated Mini. In this regard, Apple is at their most stupid marketing-wise.
Quote:
If you're Apple, wizard... what do you do? Do you kill off the server model and replace it with a discrete model? Do you make one integrated and one discrete?
This is a complex set of questions to answer.

First: Mac OS at its core is UNIX, a BSD variant in this case, so any Mac can easily be set up for server duty. The idea that a specific hardware variant of the Mini is a server and the others aren't is just bogus.

Second: while I know many make use of Minis as servers, they are less than ideal for more demanding server duties, if for nothing else than the poor access to the disk drives. Since any Mini can be used for light-duty server work and none is suitable for heavy use, there is little sense in a designated server Mini.

Third: I believe there is good reason for a Mini with a discrete GPU, at least until integrated GPUs provide similar OpenCL and 3D performance. Frankly, it could be another two or three years until we have serviceable 3D support from Intel. The hedge here is that there are rumors about very powerful GPUs coming in Haswell, even though the list here does not show them. So what I see a need for is a "Mini" with an upgraded power supply to drive a discrete GPU and a memory array of rational performance.

To sum it up, Apple still needs a mid range machine with a discrete GPU. If they don't want to do that in the Mini then they need another machine.
post #75 of 92
Quote:
Originally Posted by hmm View Post

They could have gone with the same one. Either Apple was being cheap on the iMac, or the extra processing on the display significantly increased costs; 21.5" displays tend to be dirt cheap on their own. Regarding the server model, having that moniker there isn't really important. Such a configuration can easily be supported through CTO means. It's just unimportant. If you removed it today, people using it as a home/small business server could still purchase replacements just by opting for a different OS installation and a different hard drive configuration if desired. I can't emphasize enough how much it is solely marketing.
That is a good way to put it, the server Mini is just a marketing ploy.
Quote:
For $800, a quad cpu and discrete graphics would have been a better value. People view it as a headless version of the MacBook Pro, yet it doesn't replace the function of a notebook. Desktops have always offered better price-to-performance alignment than notebooks. You buy a notebook either because you take it with you at times or because you can move it room to room effortlessly without shutting down.
Marvin said the same thing regarding a discrete gpu ivy mini.  I've seen this pattern to their moves before. They went with something they considered good enough for one generation while betting on the next being an improvement.
This appears to be their MO. The thing is, it sucks for the customer. Further, they never screw up the laptop line with such silliness.
Quote:
I always look up exactly what leverages the gpu in a given application, so that if anyone asks for a suggestion I can actually include machines like the mini in the comparison. If it hit the point of good enough gpu, a mini with more frequent replacement could make sense for some of the higher end desktop/xmac type workloads. It's still somewhat IO bottlenecked. If thunderbolt had a reasonably robust array of devices and the machine had a second port of this type, it wouldn't be so bad. USB3 would work for some people.
The combination of both USB3 and TB has changed the viability of the Mini in a very positive way. However you still have that bandwidth constrained mobile chipset to deal with.
Quote:
Thunderbolt as it is right now is really going to be tied up by the display as most quality displays use displayport connections, and not all of them will function correctly at the end of a chain. Some manufacturers publish compliance matrices on a variety of their offerings. Some of those came out a few years ago, thus the DVI and dual link DVI where such requirements could be met by displayport today. They include mini displayport on their latest models,  just (understandably) not thunderbolt.
On the flip side, any old monitor would work for a server, thus no possible restriction on the TB port.
post #76 of 92
Quote:
Originally Posted by AandcMedia View Post

Kinda upset that both the 21.5's only offer 512MB of video RAM. Should be at least 1GB...

This is a massive problem for Apple; they really can't seem to grasp that people (customers!) want good video subsystems, or at least the option of better systems with proper VRAM.
post #77 of 92
Maybe things will change as time goes on in the Tim Cook era. I wonder if there is a specific group that decides what goes into what. They have to run tests for certain kinds of things.
post #78 of 92
Quote:
Originally Posted by wizard69 View Post


That is a good way to put it, the server Mini is just a marketing ploy.

I don't think the idea of the mini as a server originated with Apple. They get a lot of interesting ideas from external cooperative sources at times. It may have originated with customer use cases. iOS has also seemingly taken ideas from Cydia at times, not that this is a bad thing.

Quote:

The combination of both USB3 and TB has changed the viability of the Mini in a very positive way. However you still have that bandwidth constrained mobile chipset to deal with.

 

The typical desktop chipset is also technically constrained: LGA1155 only has 16 real lanes, and its supported board configurations are over-subscribed on lane count. The Xeon ENs have 20 lanes; EP or Sandy Bridge E gives you 40, or 80 on the dual-package models. It's quite a boost. A second thunderbolt port on the mini would still be useful, given that the thunderbolt chip used supports a maximum of 2. It would allow for the use of any given displayport display. HDMI has been problematic with the HD 4000 gpu used by the Mini on both Windows and OSX. Besides that, displayport is a better standard anyway. I would call the added flexibility a good thing if thunderbolt turns out any decent peripherals in the near future.

 

Quote:
On the flip side any old monitor would work for a server thus no possible restriction on the TB port.

A server may not even need a display, depending on whether it can be managed through one of its client machines. Now I want an OSX hypervisor, because that would be cool.

post #79 of 92
Quote:
Originally Posted by hmm View Post

I don't think the idea of the mini as a server originated with Apple. They get a lot of interesting ideas from external cooperative sources at times. It may have originated with customer use cases. iOS has also seemingly taken ideas from Cydia at times, not that this is a bad thing.
This may very well be the case; Minis as servers have gotten a lot of press in the past. My only point here is that there is nothing really special about the Mini "server" relative to other Minis. It really is a case of marketing.
Quote:
The typical desktop chipset is also technically constrained. LGA1155 only has 16 real lanes. Its supported board configurations are over-subscribed on lane count. The Xeon ENs have 20 lanes. EP or Sandy Bridge E gives you 40 or 80 on the dual package models. It's quite a boost.
That is sort of the point; you can't go much lower performance-wise for a server.
Quote:
A second thunderbolt port on the mini would still be useful, given that the thunderbolt chip used supports a maximum of 2. It would allow for the use of any given displayport display. HDMI has been problematic with the HD 4000 gpu used by the Mini on both Windows and OSX.
Hopefully Intel will fix their driver issues. I agree a second TB port could be useful on the Mini, but honestly you can't give up the USB ports, so Apple would need to find a way to squeeze everything into the back.

Actually, they would make me very happy by placing one or two USB ports on the front of the machine. I never liked the design-over-functionality sickness at Apple. Accessible USB ports will be viable for ages.
Quote:

Beside that displayport is a better standard anyway. I would call the added flexibility a good thing if thunderbolt turns out any decent peripherals in the near future.
There are already some nice TB devices out there. Some of them could make for a nice Mini based server.
Quote:
A server may not even need a display, depending on if it can be managed through one of its client machines. Now I want an OSX hypervisor, because that would be cool.
That would be cool!

As for TB I wish I had the resources as there are a few ideas floating around in my head that could really leverage TB.
post #80 of 92
Quote:
Originally Posted by Winter View Post

Maybe things will change as time goes on in the Tim Cook era. I wonder if there is a specific group that decides what goes into what. They have to run tests for certain kinds of things.

I'd like to believe that, but I think things will get worse for the desktop. Apple will follow the money, and frankly the lack of attention to the desktop market will result in a self-fulfilling prophecy, where the lack of investment is justified by poor sales and the poor sales are the result of the lack of investment. This is exactly what is happening with Pro sales and, to a lesser extent, Mini sales. It is obvious they aren't getting the engineering resources the laptops and iOS devices are getting.