Originally Posted by Marvin
The Ivy Bridge architecture has been out for over a year now and is also just a die-shrink of Sandy Bridge. Haswell is the new architecture. There's a test here that showed a small increase in performance using compiler options with Sandy Bridge, but zero change, and in some cases a regression, with Ivy Bridge:
Haswell is a new architecture, but the stress is on power efficiency, not computational performance. As such it isn't a huge step above Ivy Bridge performance-wise. That isn't bad at all though, as it gives us MacBook Airs that run circles around last year's models while running on battery.
If this were Haswell-EP, there would be a possibility of seeing a performance boost from the new architecture. I think the only possibility here is if Apple manages to run the CPU at a higher clock speed thanks to their cooling solution. I think these scores aren't all that bad, though, as long as the price points are more reasonable.
It's noticeable; 15-minute jobs go down to 10 minutes. When it comes to Cinebench scores, a 50% gap starts to look bigger the higher up the scores get, because the absolute numbers get further apart. The focus should be on the percentage difference, not the raw numbers.
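To illustrate the percentage point, here's a quick sketch (the scores are made-up, hypothetical numbers, not real Cinebench results):

```python
# The absolute gap grows with the baseline, but the relative
# difference is the same 50% in both cases.
def pct_diff(old, new):
    """Percentage improvement of `new` over `old`."""
    return (new - old) / old * 100

low_old, low_new = 6.0, 9.0      # gap of 3 points
high_old, high_new = 12.0, 18.0  # gap of 6 points, same ratio

print(pct_diff(low_old, low_new))    # 50.0
print(pct_diff(high_old, high_new))  # 50.0
```

The same logic applies to the job times: 15 minutes down to 10 minutes is a 1.5x speedup regardless of how long the job is in absolute terms.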
For most users the performance should be much better than past hardware.
If Apple could leverage Thunderbolt to chain machines together transparently, that would make most complaints about lower performance in a single machine moot. All people would have to do is buy two or more machines, plug them in, and enable compute sharing. If Apple were able to virtualize the hardware to avoid software-license issues, that would be even better, though a lot of software already has unlimited-core licenses.
It will be interesting to see if Apple does anything with clustering. Sadly I think they have abandoned it for good.
It's at least there to encourage BTO purchases of the SSD and GPUs. Rather than buying the entry model and getting your own NVidia GPU on the cheap, you have to take Apple's options. The lack of upgradeability will encourage buying new machines too, even if that wasn't intentional. This is the company that glued the screen onto the iMac, though, so my guess is it was intentional.
I think it is a realization of where technology is taking Apple. We are quickly coming to the point where integration will mean add-in GPUs are a thing of the past. The only machines likely to offer such features are workstations like the Mac Pro, and even those will suffer from a "why bother" mentality. If a "pro" keeps the new Mac Pro for 3-4 years, trying to upgrade it with a new GPU will be silly, as you'll be putting a new GPU into dated hardware.
I think Mac Pro owners have convinced themselves over the years that Apple was giving them special treatment by keeping their machines upgradeable, but Apple inflated the margins on those upgrades first. By locking down the upgrades, Apple can get better profits, and that could give it the freedom to hit a lower entry price point. At the very least, I think the new Mac Pros will offer more performance value for the money spent.
They better. As to the so called "pros" out there, I don't think a lot of them really know what they want. They are simpletons that look at what worked for them in the past and can't manage to grasp an improved future.
Ideally both, but Intel seems to get better results from core count for tasks that use all the cores. Clock-speed increases probably raise temperatures faster than adding more cores at lower clocks does. Certainly, for the many jobs that use very few cores, CPUs that can be clocked higher will perform better.
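The cores-vs-clock trade-off can be sketched with Amdahl's law (the parallel fractions and clock ratios below are made-up assumptions for illustration, not measurements of any real chip):

```python
def speedup(parallel_fraction, cores, clock_ratio=1.0):
    """Amdahl's-law speedup over a 1-core baseline, scaled by a
    relative clock-speed factor."""
    serial = 1.0 - parallel_fraction
    return clock_ratio / (serial + parallel_fraction / cores)

# Mostly-serial job (30% parallel): a 20% clock bump on 8 cores
# beats doubling to 16 slower cores.
print(speedup(0.3, cores=8, clock_ratio=1.2))   # ~1.63
print(speedup(0.3, cores=16, clock_ratio=1.0))  # ~1.39

# Well-threaded job (95% parallel): the extra cores win easily.
print(speedup(0.95, cores=16, clock_ratio=1.0))  # ~9.14
print(speedup(0.95, cores=8, clock_ratio=1.2))   # ~7.11
```

In other words, which knob helps depends almost entirely on how parallel the workload is, which matches the point above about few-core jobs favoring higher clocks.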
The thermal limiting of the many-core models is something that 14 nm should deal with fairly well. We might not get more cores, but we should at the very least get faster ones.
If AMD keeps racking up losses like they did last quarter, they will be bankrupt soon. They have $3.9B in assets against $3.5B in liabilities, and they posted a $76M loss last quarter.
That is actually damn good for AMD. $76 million sounds like a lot to us grunts working for a wage, but for a company the size of AMD it is real close to being in the black.
If their stockholder equity goes below zero and, more importantly, their cash doesn't cover their bills, the stockholders will either have to finance the company or it will be put up for sale. This is why employees don't always like holding large amounts of stock in their own company, especially ones the size of Apple, as it can come with heavy financial responsibility when the company does badly.
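Using the figures quoted earlier in the thread ($3.9B assets, $3.5B liabilities, a $76M quarterly loss), here's a rough back-of-the-envelope runway calculation. This is a sketch, not a financial model; real accounting is far messier, and equity here is just assets minus liabilities:

```python
# Figures from the thread, in dollars.
assets = 3.9e9
liabilities = 3.5e9
quarterly_loss = 76e6

# Naive runway: quarters of losses before equity hits zero.
equity = assets - liabilities       # $400M of stockholder equity
quarters = equity / quarterly_loss  # ~5.3 quarters

print(equity)              # 400000000.0
print(round(quarters, 1))  # 5.3
```

At that burn rate, roughly five quarters of cushion is what makes the "as little as a year away from bankruptcy" estimate plausible.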
I can see them moving forward out of this funk, but it requires that the economy take off again, which won't happen with the current administration in Washington. It is hard to believe, but people have gotten even tighter with money around here; I fully expect the economy to slow even more. This isn't all AMD's fault, as even Intel is feeling the pain right now.
AMD's losses do seem to be slowing down, but they could be as little as a year away from bankruptcy. Everything is spiralling down: they are cutting marketing and R&D, increasing liabilities, and selling property and assets. They outsourced their chip manufacturing to GlobalFoundries in 2009, and that comes with its own problems:
The GlobalFoundries deal happened a long time ago. It is what AMD is doing now that will either make or break the company. I think they have a chance. Slim, maybe, but they have a chance. However, the big problem is factors outside of their control: the economy, the rise of ARM, and with it mobile computing. They have to adapt to these new realities, and frankly they are trying.
??? AMD has perfectly good mobile solutions. Apple isn't using them this year, but Apple is just as likely to drop NVidia for the next round of hardware. Beyond that, AMD has been very successful with Brazos, which has handily beaten Atom in many design wins.
NVidia's stockholder equity is more than 10x AMD's; NVidia actually has enough money to buy AMD outright. I think they'd be allowed to make that kind of purchase because the combined company would still be competing with Intel, the market leader. NVidia and AMD together would surely give Intel hotter competition, because NVidia would then be able to ship x86 machines against Intel.
In some ways I see NVidia as being on the right track by trying to do ARM right. The days of x86 are slowly fading away, and frankly I'm not sure Intel can do anything about it. If Apple came out with an ARM-based laptop, we would know that Intel's days are numbered. AMD has also been making noise about ARM, and frankly that looks like a case of seeing the writing on the wall.