Apple rumored to switch to Nvidia graphics for new MacBook Pros

Comments

  • Reply 21 of 63
    nvidia2008 Posts: 9,262 member


    At this stage as well, the Ivy Bridge CPU and GPU benchmarks are off the charts for anything ever conceived in an on-die CPU+GPU:


    http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review


     


    Crysis Warhead, Metro 2033 and Dirt 3 at playable frame rates on Intel integrated graphics is nothing short of unbelievable, given how bad Intel's GPUs used to be.


     


    I think at this stage only the high-end 15" and 17" Ivy Bridge MacBook Pros will have discrete GPUs.


     


    I mean, Ivy Bridge appears to clearly outpace the ATI Radeon HD 5450, the staple of the low-end-but-not-horrible class of discrete GPUs.


     


    If Intel can ramp production suitably, Ivy Bridge is pretty much the nail in the coffin for discrete GPUs, and AMD will struggle at mid to higher price points.


     


    And while games are useful for benchmarking the integrated GPU, for most mainstream use it's more than enough, provided driver support is adequate.


     


    We're talking 30%-40% improvements over Sandy Bridge. Imagine playing a DX11 game on an Intel integrated GPU... albeit at 720p... but still, this is big.


     


    And if you look at general compute on the GPU alone for Ivy Bridge, it's pretty much game over for AMD and Nvidia in mid- to high-priced, mainstream-use laptops:


     


    [chart: Ivy Bridge GPU compute benchmark]


     


    You buy one chip. You want a fast CPU? It's in that one chip. You want a reasonable GPU for graphics and compute? It's in the same chip. You want optimised video encoding? Also in the same chip.


     


    Game over.
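    To make that "one chip" point concrete, here's a minimal sketch of my own (assuming an OpenCL 1.1 runtime is installed; error handling trimmed) that simply asks OpenCL which compute devices the machine exposes. On an Ivy Bridge laptop with no discrete card, driver support permitting, the CPU and the HD 4000 both show up, and both come off the same piece of silicon:

```cpp
// Minimal sketch: list every OpenCL platform and compute device on the machine.
// Assumes an OpenCL 1.1 runtime; error checking is omitted for brevity.
#include <cstdio>
#include <vector>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        char pname[256] = {0};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(pname), pname, nullptr);

        cl_uint num_devices = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &num_devices);
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, num_devices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char dname[256] = {0};
            cl_device_type type = 0;
            cl_uint units = 0;
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(dname), dname, nullptr);
            clGetDeviceInfo(d, CL_DEVICE_TYPE, sizeof(type), &type, nullptr);
            clGetDeviceInfo(d, CL_DEVICE_MAX_COMPUTE_UNITS, sizeof(units), &units, nullptr);
            std::printf("[%s] %s (%s, %u compute units)\n", pname, dname,
                        (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU/other", units);
        }
    }
    return 0;
}
```

    Nothing fancy, but the fact that one package answers for both device types is exactly why the low-end discrete parts start to look redundant.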

  • Reply 22 of 63
    gyorpb Posts: 93 member

    Quote:

    Originally Posted by kpluck View Post


    The 320M is a more powerful GPU than Intel's HD 3000. It is only the better CPU in the newer MacBook Pro 13" that makes them faster. From a GPU standpoint, the current 13" was a downgrade.


     


    While a single game is far from a perfect benchmark, it is not hard to find other tests that back that up.





    Indeed. Not only is the Intel GPU less capable, it is also buggy. Running the Pixel City screensaver on the 2010 MacBook Air was as smooth as can be; on the 2011 model, the fans kick into high gear after a short while and there are always artifacts on the screen. That was never a problem with the 2010 model.


     


    .tsooJ

  • Reply 23 of 63
    nvidia2008 Posts: 9,262 member

    Quote:

    Originally Posted by gyorpb View Post




    Indeed. Not only is the Intel GPU less capable, it is also buggy. Running the Pixel City screensaver on the 2010 MacBook Air was as smooth as can be; on the 2011 model, the fans kick into high gear after a short while and there are always artifacts on the screen. That was never a problem with the 2010 model.


     


    .tsooJ



     


    Well, it is quite nice, but on my MBP with the 320M it's running at 15fps on an external 1680x1050 monitor, so the point is fairly moot. On the MBP 13" screen natively it's not too bad. Again, if anyone has OpenCL examples of great GPU-utilising apps, that would be nice.
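    In the meantime, for anyone curious what calling OpenCL directly (rather than indirectly through Core Image) actually looks like, here's a minimal, hypothetical sketch: it compiles a one-line kernel at runtime and applies a gain across a float buffer on whichever GPU the system exposes. It assumes an OpenCL 1.1 runtime and skips most error checking:

```cpp
// Minimal sketch of "direct" OpenCL use (no Core Image): build a tiny kernel
// at runtime and apply a gain to a float buffer on the first GPU found.
// Assumes an OpenCL 1.1 runtime; most error checking is omitted for brevity.
#include <cstdio>
#include <vector>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

static const char *kSource =
    "__kernel void gain(__global float *data, float g) {\n"
    "    size_t i = get_global_id(0);\n"
    "    data[i] *= g;\n"
    "}\n";

int main() {
    const size_t n = 1 << 20;                  // ~1M samples
    std::vector<float> samples(n, 0.5f);

    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_int err;
    cl_context ctx = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    // Compile the kernel source for this particular GPU at runtime.
    cl_program program = clCreateProgramWithSource(ctx, 1, &kSource, nullptr, &err);
    clBuildProgram(program, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel = clCreateKernel(program, "gain", &err);

    // Copy the buffer to the device, run the kernel over every element, read back.
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                n * sizeof(float), samples.data(), &err);
    float gain = 1.5f;
    clSetKernelArg(kernel, 0, sizeof(buf), &buf);
    clSetKernelArg(kernel, 1, sizeof(gain), &gain);

    size_t global = n;
    clEnqueueNDRangeKernel(queue, kernel, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, n * sizeof(float), samples.data(),
                        0, nullptr, nullptr);

    std::printf("first sample after gain: %f\n", samples[0]);  // expect 0.75

    clReleaseMemObject(buf);
    clReleaseKernel(kernel);
    clReleaseProgram(program);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```

    A real app, say an audio or photo filter, would do essentially this with a more interesting kernel and real data, which is why driver quality matters so much.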


     


    Also, if Intel's drivers improve, the Ivy Bridge GPU will make most discrete GPUs unnecessary. On Windows at least, based on AnandTech's testing, game compatibility and drivers seem quite alright for the most part, and the Ivy Bridge GPU benches quite well against discrete cards. Nothing to suddenly turn the tide against discrete GPUs for enthusiast PC gaming, but for everything else, something interesting is definitely happening with Ivy Bridge.


     


    Drivers are a fair point, so I hope Apple and Intel get it right with Ivy Bridge, because they will be the basis of the fastest Mac laptops ever made, on the best and most elegant laptop platform ever created (think MacBook Air-esque design coming to a pro-level 15" laptop).

  • Reply 24 of 63
    eauvive Posts: 237 member


    The fact that NVidia has won the race for the next MacBook Pro round with its Fermi architecture has been known since the beginning of the year.


    From a GPGPU standpoint, Fermi is going to provide extended precision floating point capacity (80-bit).


    But I have also heard that NVidia is going to drop support of OpenCL in favor of CUDA.


    All of that remains to be confirmed.

  • Reply 25 of 63
    Marvin Posts: 15,326 moderator
    doh123 wrote: »
    after the 8600 though, Apple still used several others... The 9400, 9600, 320M and 330M all came after that, and none of them had the issues of the 8600.

    The 320M was a great IGP. One thing I don't like about AMD GPUs is they run very hot. The X1900XT cards were problematic in the old Mac Pros. Their drivers are also not as good as NVidia's.

    Someone benchmarked the NVidia 650M in an Alienware laptop here:

    http://forum.notebookreview.com/alienware-m14x/661207-650m-preliminary-benchmarks-up-67-faster-than-555m.html

    The fastest MBP GPU, the 6770M, gets 23fps in Metro 2033 on high; the 650M is at 35fps. These benchmarks vary a lot, but one interesting comment was that the GPU didn't go over 60 degrees during the test, presumably even after overclocking. This was sitting beside a quad-core i7.

    The 6770M on the other hand can go a fair bit higher on stock:

    https://discussions.apple.com/thread/3633821?start=0&tstart=0

    A number of separate reports online say 80-90 degrees. It makes sense to go with a cool chip if you have a smaller enclosure. There's going to be more room without the optical drive, but the cooler the better.

    Here's the entire NVidia lineup:

    http://www.geforce.com/whats-new/articles/geforce-600m-notebooks-efficient-and-powerful/

    Even the 640M gets 30FPS in Metro 2033 on high.

    I could see them using the 630M in the middle Mini, the 640M in the entry 15" and the 650M in the high-end MBPs. There is a GTX 680M coming on June 5th that would be suitable for the 27" iMac (might explain the delay):

    http://www.legitreviews.com/news/13130/

    Looks like a full refresh at WWDC.
  • Reply 26 of 63
    ascii Posts: 5,936 member

    Quote:

    Originally Posted by nvidia2008 View Post


     


    Game over.



     


    But computer monitors are about to quadruple in pixels (hopefully starting with Macs in the next few weeks).


     


    And 3D graphics are hardly lifelike yet. I think these discrete video card makers could create demand for their products by inventing and giving away new algorithms for e.g. lifelike movement, lifelike skin, lifelike trees, that require the compute power of their devices. Make it so that people aren't just comparing FPS, but that things look totally different on an expensive card.

  • Reply 27 of 63
    gyorpb Posts: 93 member

    Quote:


    Originally Posted by nvidia2008 View Post


     


    Also, if Intel's drivers improve, the Ivy Bridge GPU will make most discrete GPUs unnecessary. On Windows at least, based on AnandTech's testing, game compatibility and drivers seem quite alright for the most part, and the Ivy Bridge GPU benches quite well against discrete cards. Nothing to suddenly turn the tide against discrete GPUs for enthusiast PC gaming, but for everything else, something interesting is definitely happening with Ivy Bridge.


     


    Drivers are a fair point, so I hope Apple and Intel get it right with Ivy Bridge, because they will be the basis of the fastest Mac laptops ever made, on the best and most elegant laptop platform ever created (think MacBook Air-esque design coming to a pro-level 15" laptop).



    "If Intel's drivers improve," indeed. If, not when.


     


    And rather than just delivering functional drivers with new hardware, how about fixing the existing ones? I'm not about to write off a six-month-old MacBook Air simply because Intel decided to move on to the Next Big Thing. Granted, I never had issues in everyday use of the MacBook Air, but the rendering errors in the Pixel City screen saver are plentiful, embarrassing and telling.


     


    .tsooJ

  • Reply 28 of 63
    tokenuser Posts: 69 member


    CUDA. Four little letters that make the scientific computing community want Nvidia, rather than OpenCL on another platform.


    No, you won't find that especially important in the App Store, or in the latest games (because the GPU is already busy), but for the pro wanting or using an MBP it is a killer spec.

  • Reply 29 of 63
    haar Posts: 563 member
    If this is true, then it would be a decision that might come back to burn Apple due to the 8600 GPU fiasco... I hope not, because (while poor form) the late Steve Jobs would have outright said no... (burned once, forever forgotten).
    The only reason they would be going with Nvidia would be Nvidia's faster drivers, and that Nvidia could supply enough for the laptops. Perhaps this is the reason the GTX 680 and GTX 670 cannot be found for purchase on Newegg or elsewhere...
  • Reply 30 of 63
    tipoo Posts: 1,142 member


    The GT 650M was found in the developer software. I'd say that chip is a good bet, with a significantly higher 3DMark Vantage score than today's MBPs' stock GPUs. And it's a true Kepler, not a rebadge.





    http://www.geforce.com/hardware/desktop-gpus/geforce-gt-650m



     


    [image: GeForce 600M lineup]

  • Reply 31 of 63
    ssquirrel Posts: 1,196 member

    Quote:

    Originally Posted by haar View Post

    I hope not, because (while poor form) the late Steve Jobs would have outright said no... (burned once, forever forgotten).

     


     


    Which is why after Microsoft ripped off Mac OS while they were working on software for the new Mac OS, Apple made sure to never again do business with Microsoft.  Oh wait...  :)  Yeah, I don't think reality is what you think it is.

  • Reply 32 of 63
    nht Posts: 4,522 member

    Quote:

    Originally Posted by Marvin View Post



    I could see them using the 630M in the middle Mini, the 640M in the entry 15" and the 650M in the high-end MBPs. There is a GTX 680M coming on June 5th that would be suitable for the 27" iMac (might explain the delay):

    http://www.legitreviews.com/news/13130/

    Looks like a full refresh at WWDC.


     


    A 630M in a refreshed mini this summer would be very nice.

  • Reply 33 of 63
    charlituna Posts: 7,217 member

    Quote:

    Originally Posted by Tallest Skil View Post


    I'm still not sure I trust nVidia after the 8600M fiasco.



     


    I was thinking the same thing. 


     


    That, and since when was ABC News a sure thing for knowing about any tech, much less Apple? Something tells me we might find out that their trusted source is Digitimes.

  • Reply 34 of 63
    wizard69 Posts: 13,377 member
    hmm wrote: »
    In laptops maybe, and not by much. Where are you getting your information?
    A few watts here and there make for a big difference in battery life. Beyond that AMD has been producing their latest series of laptop GPUs for a couple of months now. Producing, notably, without all the rumored reliability issues NVidia has been having.
    You are asking the wrong questions. You need to do some research because it depends on the application and how you will be using it. We're well past the days of what computer do I need for X application. I wouldn't worry about the computer until you're close to graduation anyway, as requirements will change in that time.
    More importantly, computers will change dramatically in that time. I suspect, though, that his MacBook may be underpowered for the educational uses he has lined up for it.
    I disagree. OpenCL has seen some nice adoption, and it's extremely useful in many more things today. More people benefit from a strong gpu and OpenCL support today than would have a few years ago.
    OpenCL has been one of Apple's greatest success stories. Even Adobe is switching over to it. As to GPU acceleration in general, each iteration of the Mac OS continues to leverage the GPU to a greater extent. I'm not sure why the significance of OpenCL is so underestimated in these forums.
    Is this year's AMD significantly better? Mobile GPUs were NVidia's weakest point. It's a troll rumor anyway. Remember a few months ago? We had the same rumor, then a rumor they couldn't satisfy Apple, now another rumor of NVidia, all from the same source. Laptops debuting later than initially expected by the masses = many, many fabricated rumors.
    I would have to say yes, AMD is better. Their GPUs run at lower power, are better OpenCL machines and have not suffered from reliability problems like NVidia's. AMD's drivers "MAY" be less reliable, but much of the noise with respect to AMD drivers is the result of issues many years old. That doesn't mean they are in any way perfect, driver-wise, but NVidia hasn't exactly been wonderful on the Mac platform. Besides, it is hard to tell who is responsible for what at Apple. Remember, Apple is the company that is way behind with OpenGL support and other features.

    NVidia may be the new play at Apple but I'm not convinced that they are worth it. Maybe for niche uses NVidia has advantages but for all around usage I'd prefer AMD.
  • Reply 35 of 63
    heffeque Posts: 139 member

    Quote:

    Originally Posted by Tallest Skil View Post


    I'm still not sure I trust nVidia after the 8600M fiasco.



    Tell me about it. Mine died a couple months ago. I didn't have money for a replacement or for a new Mac, so I bought a Zotac AD04 and... I just found out a few weeks ago that I could have had it fixed for free in an Apple Store. FFFFFFFFFFFFUUUUUUUUUUUUUUUUU!!!!!!!!!!!!!!!!!

  • Reply 36 of 63
    solipsismx Posts: 19,566 member
    heffeque wrote: »
    Tell me about it. Mine died a couple months ago. I didn't have money for a replacement or for a new Mac, so I bought a Zotac AD04 and... I just found out a few weeks ago that I could have had it fixed for free in an Apple Store. FFFFFFFFFFFFUUUUUUUUUUUUUUUUU!!!!!!!!!!!!!!!!!
    Do you still have the Mac? Did we learn a lesson?
  • Reply 37 of 63
    wizard69 Posts: 13,377 member
    nvidia2008 wrote: »
    Interesting, name me 5 OpenCL OS X 10.7 apps on the Mac App Store that qualify for a decent, mainstream Mac experience. Core Image doesn't count; it has to be mostly OpenCL, that is, the app calls OpenCL directly, not indirectly by calling into Core Image only.
    Talk about trying to limit the discussion in your favor! Since when does the Mac App Store cater to high performance apps? Further do you really expect people to go out and do such research for you?
    "Analog" on the Mac App Store is really cool, but it's using my GPU for Core Image calls to do the filtering, and it's laggy at times, if it used pure CPU power on a higher-end Sandy Bridge MBP, it might be faster.
    "might be faster". That is a real firm position to take. As for Sandy Bridge being faster for core image that is very doubtful, given that the SB chip is coupled with a modern GPU.
    And with Ivy Bridge just around the corner, the case for discrete GPUs given the sheer power of Ivy Bridge CPU components is diminishing fast for ~mainstream laptop~ computing.
    For a low-end laptop, maybe. For a mainstream machine it is much more of a mixed bag. Further, with HiDPI we might actually see a performance regression. Here is the reality: the Ivy Bridge GPU still sucks; it doesn't even outperform AMD's year-old APU.
    As far back as 2005, desktop sales jumped off a cliff and that was the point they took discrete GPUs with them.

    I don't deny that discrete GPUs become harder to justify with each iteration of compute hardware. However, integrated GPUs (Ivy Bridge & Trinity) are a long way from providing replacement functionality in the likes of the MBPs. That will likely change in a couple of years, but right now we still need discrete GPUs.

    As to OpenCL, I'm not sure why you constantly poo-poo it. OpenCL is perhaps one of Apple's greatest success stories from a developer perspective. The movement to OpenCL has taken place throughout the industry, as it is the best open solution out there for leveraging the computing resources often found in GPUs. I really see no basis for your position.
  • Reply 38 of 63
    lukeskymac Posts: 506 member


    nvidia2008, you need your head checked.


     


     


    Quote:

    Originally Posted by nvidia2008 View Post


     


     


     


    In any case the discrete GPU industry is on its way out. Only certain MBP 15"-esque and all MBP 17"-esque MBPs will have discrete GPUs. AMD and Nvidia blew it, and Intel steamrolled them, legally and illegally (e.g. locking out Nvidia). In the meantime PowerVR and ARM are eating everyone's breakfast, lunch and, soon, dinner.


     


    Are you from the future? 2016? Because as far as I know, that's the only scenario where such a statement could possibly be true.


     


    As I said before, my username is perhaps the peak of discrete GPUs (Even then the 8600M fiasco was quite bad, though that G92 GPU design was superb).


     


    I have the famed Nvidia 320M in my MBP 13" 2010. It's okay, but compared to an integrated-Intel MBP 13", nothing great.


     


    Not only is that incorrect (it's better than Sandy Bridge graphics), but even if it weren't, that'd be nothing more than expected, seeing as that GPU was really low-end and the HD 3000 is, oh I don't know, a generation newer?


     


    So many laptop discrete GPUs in any case are such crippled versions of their desktop brethren that the difference between them and Intel is nothing more than marketing. 2GB VRAM on a useless laptop discrete GPU? That's like those $300 HDMI cables.


     


    Bullshit, and any quick look at Wikipedia or benchmarks at AnandTech proves it.


     


    Throw in the whole "casual gaming" phenomenon and that's the killing blow to the discrete GPU industry.


     


    Casual gaming is not going to kill AAA titles. That's simply a thoughtless assertion.


     


    Consider this: just at the time when GPUs became ever more hot, heavy, expensive and noisy to support ever more complex, risky and expensive (software and hardware) games, people gravitated towards simpler, alternative "casual" games. The "perfect storm" that crushed almost all discrete GPU dreams.


     




    BULLSHIT. Double one, at that:



    1) GPUs are getting more and more power efficient and smaller. Look at the size of a Nvidia GTX 670.

    2) People are NOT "gravitating towards simpler games"; it's people who never played games who are being drawn into casual gaming thanks to smartphones. They'd never have spent money on a gaming gadget otherwise, casual or not, and the few exceptions to that rule are ex-Nintendo DS users, who don't fit in with the market we're talking about.


     


    Also, I enjoy how Nintendo's suffering with all this. Maybe then they'll stop selling gimmicky, outdated hardware that sports only one or two quality titles.


     


    The only decent GPUs in the future will be:


     


    1. Megalithic desktop 500W-1kW multi-card setups, i.e. niche stuff

    Nope.


     


    2a. Intel Integrated which will benefit from modest improvements in GPU architecture but huge gains in CPU power (ie. GPGPU not so essential because CPU still handles a lot of tasks, including custom routines for video encoding said to be the forte of GPUs, which is now bollocks)


     


    2b. Intel Integrated which will benefit a lot from process improvements which currently outpace anything TSMC/ AMD/ Nvidia can achieve


     


    3. PowerVR which will come in from the ground up, ie. iPad 3/4/5 GPU is the next great gaming GPU.


     


    So on one hand we have niche "high-performance" stuff that has no real-world mainstream application, aging gaming consoles with now very paltry graphics, "next-gen" gaming consoles which are still dicey in terms of business "models", and Intel Integrated which is sufficient for mainstream computing but nowhere near gaming-class,


     


    And on the other hand... iPad. 'Nuff said.



    You're implying that AAA titles will vanish and that the future of gaming is based on Angry Birds lookalikes running on iPads?



    I feel an urge to call names... Let's just settle on "THAT'S JUST INANE BS"


     



     


    Quote:

    Originally Posted by nvidia2008 View Post


     


    On a final note, who needs a beyond-average GPU in a laptop anyway? Only certain niches (aka "verticals"). Gaming has potential, but Windows is nonsense compared to the simplicity and tradeability of Xbox, PS and Wii, and Mac titles are, as always, progressing but still laughable.


     


    Oh, scratch that... Angry Birds on iPads... AND CONSOLES? Considering the rumored specs for the "next-gen" consoles are based on last year's mid-end standard GPUs, many notebooks will potentially outperform these new consoles at launch. You know what you can thank for that?



    Discrete GPUs.


     



     


    Quote:

    Originally Posted by nvidia2008 View Post


     


    While that is true, that's also the spectre discrete GPUs face. The 13" was a downgrade, no doubt, but how many people "suffered" as a result? Since CPU power massively improved, the GPU downgrade was not really felt (aside from high-end games, which are rare for Mac users).


     


    I didn't buy a MacBook last year because it's insane that a $1000+ laptop doesn't have a goddamned real GPU. I own the same MBP you do.


     


    Even for games, given the way the ports are done as well, Psychonauts is unplayable on my 320M even though it's not Intel, and even though that game is, well, very, very old.


     


    So while I enjoy the 320M's advantages in, say, OpenGL Photoshop, where zoom levels are antialiased/sampled properly, increasingly Intel GPUs can do this too.


     


    iPhoto and iMovie can lean on the GPU heavily, but on my MBP 13" with the 320M and Core 2 Duo, using the GPU for certain iPhoto and iMovie tasks is ~slower~ compared to Sandy Bridge.


     


    So in this one task Sandy Bridge gets to use both its CPU and GPU, and the computer relying on a GPU that's a year older LOSES? SHOCKING.


     


    I guess I'm proposing that the promise of discrete GPUs has been obliterated by poor real-world implementation.


     


    Poorly implemented by Apple, you mean.


     


    The discrete GPU companies have also now painted themselves into a corner:


     


    How is this 100W-200W+ behemoth ever going to benefit our new mobile, tablet, sleek and slim lifestyle while still pushing ever greater interactive experiences?


     


    [image: Nvidia GeForce GTX 670]


     


    I could put up a picture of a hexacore Intel Extreme i7 and ask the same thing. If we're comparing apples to oranges...




     


    Quote:

    Originally Posted by nvidia2008 View Post


    At this stage as well, the Ivy Bridge CPU and GPU benchmarks are off the charts for anything ever conceived in an on-die CPU+GPU:


    http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review


     


    Not impressive considering how on-die GPUs have always been unusable.


     


    Crysis Warhead, Metro 2033 and Dirt 3 at playable frame rates on Intel integrated graphics is nothing short of unbelievable, given how bad Intel's GPUs used to be.


     


    With a monstrous CPU dragging it, relatively low resolutions, and every single thing turned down to low.



    I thought Mac users didn't settle for mediocrity. Guess I'm wrong.


     


    I think at this stage only the high-end 15" and 17" Ivy Bridge MacBook Pros will have discrete GPUs.



    A $1500 pro notebook. Without discrete graphics.



    **** off.


     


    I mean, Ivy Bridge appears to clearly outpace the ATI Radeon HD 5450, the staple of the low-end-but-not-horrible class of discrete GPUs.


     


    If Intel can ramp production suitably, Ivy Bridge is pretty much the nail in the coffin for discrete GPUs, and AMD will struggle at mid to higher price points.


     


    And while games are useful for benchmarking the integrated GPU, for most mainstream use it's more than enough, provided driver support is adequate.


     


    We're talking 30%-40% improvements over Sandy Bridge. Imagine playing a DX11 game on an Intel integrated GPU... albeit at 720p... but still, this is big.


     


    And if you look at general compute on the GPU alone for Ivy Bridge, it's pretty much game over for AMD and Nvidia in mid- to high-priced, mainstream-use laptops:


     


    Not only are you comparing it with old hardware, you're assuming that Intel can keep improving its GPUs by 40% every single year until they become comparable to mid-end discrete GPUs, while Nvidia and AMD do nothing but stand still babbling like idiots. That's not going to happen.



    Unless of course you think that everyone who can't afford a $2200 MacBook Pro can get by with a low-end GPU without a problem.



    In which case, I recommend you visit the mental hospital.


     


    [chart: Ivy Bridge GPU compute benchmark]


     


    You buy one chip. You want a fast CPU? It's in that one chip. You want a reasonable GPU for graphics and compute? It's in the same chip. You want optimised video encoding? Also in the same chip.


     


    "Game over"? Look at the damned graphic. A low-to-mid-end GPU that's two generations older is beating the crap out of the HD 4000. I'm not asking for Crysis Warhead on Ultra with 4x anti-aliasing. I'm asking for a GPU that can play modern games (which could have had much higher minimum requirements were the consoles not dragging the industry down and slowing its progress) at native resolutions and AT LEAST not on low settings.


     


    The 640M does that, and it fits in an ultrabook with reasonable power consumption. Does it matter that it's not in the same chip?


  • Reply 39 of 63
    ssquirrel Posts: 1,196 member

    Quote:

    Originally Posted by Lukeskymac View Post


    Not only are you comparing it with old hardware, you're assuming that Intel can keep improving its GPUs by 40% every single year until they become comparable to mid-end discrete GPUs, while Nvidia and AMD do nothing but stand still babbling like idiots. That's not going to happen.


     



     


    Have you looked into Haswell next year? The move from Sandy Bridge to Ivy Bridge boosted the number of execution units for the GPU from 12 to 16. Haswell will be making the jump from 16 to 40, a 2.5x increase. Even if real performance captured only half of that extra headroom, the 31fps you linked would be up around 54fps; at full scaling it would be closer to 78fps. Yeah, with the jump to Haswell, they will certainly be continuing the extreme increase in performance for the GPUs tied to their chips.


     


    Yes I realize that Haswell is a year out and the best card in that linked graphic was a GT440, but currently that card sells for 65-80 bucks.  Imagine the entire sub-$100 video card market disappearing b/c AMD and Intel have better GPUs included in their CPUs.  That is a lot of money to be lost by the GPU companies. 
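    For what it's worth, here's the back-of-the-envelope version as a quick sketch, under my own rough assumption that frame rate scales with EU count times some efficiency factor, starting from the 31fps in that chart:

```cpp
// Back-of-the-envelope sketch: project frame rates from the Ivy Bridge -> Haswell
// jump in GPU execution units (16 -> 40), under the rough assumption that
// performance scales with EU count times an efficiency factor.
#include <cstdio>

int main() {
    const double baseline_fps = 31.0;     // HD 4000 result in the linked chart
    const double eu_ratio = 40.0 / 16.0;  // 2.5x more execution units

    const double efficiencies[] = {0.5, 0.75, 1.0};
    for (double e : efficiencies) {
        double projected = baseline_fps * (1.0 + (eu_ratio - 1.0) * e);
        std::printf("%3.0f%% scaling efficiency -> ~%.0f fps\n", e * 100.0, projected);
    }
    return 0;
}
```

    Even at 50% efficiency you land in the mid-50s, which is why the sub-$100 card market looks so exposed.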

  • Reply 40 of 63
    mdriftmeyer Posts: 7,503 member

    Quote:

    Originally Posted by SSquirrel View Post


     


    Have you looked into Haswell next year? The move from Sandy Bridge to Ivy Bridge boosted the number of execution units for the GPU from 12 to 16. Haswell will be making the jump from 16 to 40, a 2.5x increase. Even if real performance captured only half of that extra headroom, the 31fps you linked would be up around 54fps; at full scaling it would be closer to 78fps. Yeah, with the jump to Haswell, they will certainly be continuing the extreme increase in performance for the GPUs tied to their chips.


     


    Yes I realize that Haswell is a year out and the best card in that linked graphic was a GT440, but currently that card sells for 65-80 bucks.  Imagine the entire sub-$100 video card market disappearing b/c AMD and Intel have better GPUs included in their CPUs.  That is a lot of money to be lost by the GPU companies. 



     


    Intel's integrated GPU solution is hitting its theoretical ceiling. Keep believing their leaps forward. They will compromise their CPU sooner rather than later to strengthen their integrated GPGPU pipelines.
