MacBooks' next integrated graphics chip

Posted in Future Apple Hardware, edited January 2014
The 9400M has performed very well, but does anyone know what graphics chip Apple is going to use next?

With all the issues between Intel and Nvidia, any ideas on what we'll be seeing next?

Comments

  • Reply 1 of 9
    Marvin Posts: 15,310, moderator
    Intel will introduce the 32nm dual-core Arrandale, probably in early 2010, and they are putting a die-shrunk GMA 4500 chip in there. That chip is quite a bit slower than the 9400M. Putting it inside the CPU is supposed to reduce bottlenecks, but it's the same chip, so it likely won't improve much.



    Intel's graphics chips are just not well supported in apps and generally perform very poorly. The only consolation I would get from the internal integrated chip is if Apple decide to put two chips inside every machine.



    When you do GPU computing, your interface can get sluggish, so if they can run the interface off the CPU's graphics chip and do everything else off the other one - an Nvidia or ATI chip - then it will be OK.
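
    To sketch what that would look like in practice: an OpenCL host program can enumerate the GPUs and build its compute context on the discrete chip only, leaving the integrated one to drive the interface. This is a minimal sketch only - the "9400" name check is an illustrative heuristic, not anything Apple has announced.

    Code:
    // Enumerate the GPUs and prefer a non-chipset part for compute,
    // leaving the integrated chip free to composite the UI.
    #include <stdio.h>
    #include <string.h>
    #include <OpenCL/opencl.h> /* <CL/cl.h> on non-Apple platforms */

    int main(void) {
        cl_platform_id platform;
        cl_device_id devices[8], compute_dev;
        cl_uint ndev = 0;

        clGetPlatformIDs(1, &platform, NULL);
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &ndev);
        if (ndev == 0) return 1;
        if (ndev > 8) ndev = 8;

        compute_dev = devices[0];
        for (cl_uint i = 0; i < ndev; i++) {
            char name[256];
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("GPU %u: %s\n", i, name);
            if (strstr(name, "9400") == NULL) /* skip the chipset GPU */
                compute_dev = devices[i];
        }

        /* Build the compute context on the chosen device only; the other
           GPU stays free for the interface. */
        cl_context ctx = clCreateContext(NULL, 1, &compute_dev, NULL, NULL, NULL);
        clReleaseContext(ctx);
        return 0;
    }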



    If Apple only use the Arrandale CPU, sadly I think it will be a downgrade and I certainly wouldn't take one over a 9400M-based machine.



    Apple have already done this though - the new iMacs switched to the 9400M and the old 2400XTs were slightly faster. The drop wasn't significant, but a drop nonetheless. Dropping to a 4500 Intel chip would be worse, as they have compatibility as well as performance problems.
  • Reply 2 of 9
    ghstmars Posts: 140, member
    We know that AMD/ATI and Nvidia are fully on board with OpenCL, but what about Intel? A die shrink of the 4500 is a step back like you mentioned - once again the MacBooks hampered by the IGP...

    Quote:
    Originally Posted by Marvin


    Intel will introduce the 32nm dual-core Arrandale, probably in early 2010, and they are putting a die-shrunk GMA 4500 chip in there. That chip is quite a bit slower than the 9400M. Putting it inside the CPU is supposed to reduce bottlenecks, but it's the same chip, so it likely won't improve much.






  • Reply 3 of 9
    futurepastnow Posts: 1,772, member
    It's not quite as horrible as Marvin makes it out to be. The Arrandale IGP will be derived from Intel's current integrated graphics, but it will not be identical to them. It's reasonable to assume that more stream processors and other hardware will be added (X4500 has 10 SPs). Also, shrinking it from 65nm to 45nm and leveraging the CPU cooler will allow it to be clocked significantly higher, though probably more on the desktop than mobile.



    It probably will not be an improvement over the GeForce 9400, but it should perform similarly, and it will likely be cheaper.
  • Reply 4 of 9
    ghstmars Posts: 140, member
    Nvidia is coming out with the iGT209 - no real info available, just that it's due Q3 '09. I guess Apple, again trying to keep those high margins, might use the less expensive Intel IGP, which hasn't proven itself. I just don't get it: they're pushing the graphics more than ever, so I would assume they'd go with the best chip available in every segment. The Mac Pros are getting there, the iMacs eventually if they redesign the case, and the 'Books - who knows.
  • Reply 5 of 9
    kukito Posts: 113, member
    It looks like the new Intel integrated graphics will be much better than what they currently offer.
  • Reply 6 of 9
    ghstmars Posts: 140, member
    At least we know they are involved in OpenCL, so it cannot be that bad. But Nvidia still has the edge in IGPs; I just hope Nvidia and Intel can patch things up (cross-license and all). With the iGT209, the Nehalem processor, and Snow Leopard, the 13" MacBook makes a nice package.
  • Reply 7 of 9
    wizard69 Posts: 13,377, member
    Quote:
    Originally Posted by Marvin


    Intel will introduce the 32nm dual-core Arrandale, probably in early 2010, and they are putting a die-shrunk GMA 4500 chip in there. That chip is quite a bit slower than the 9400M. Putting it inside the CPU is supposed to reduce bottlenecks, but it's the same chip, so it likely won't improve much.



    I don't have much faith in Intel's GPUs, so they will have a lot of people like me to convince that this arrangement will work. GPUs really have been that bad from Intel.



    As to the potential for an on-chip GPU, I'm certain the arrangement could improve things if implemented well. Ideally they would put the graphics memory buffer on-chip also. If the buffer isn't there, I can't see huge increases in performance over similar Intel GPUs.



    Quote:



    Intel's graphics chips are just not well supported in apps and generally perform very poorly. The only consolation I would get from the internal integrated chip is if Apple decide to put two chips inside every machine.



    Basically Intel GPUs suck.



    The dual-GPU approach does have its benefits, especially if the use of OpenCL really takes off. As has been noted, GPU processing can really load a system down.

    Quote:



    When you do GPU computing, your interface can get sluggish, so if they can run the interface off the CPU's graphics chip and do everything else off the other one - an Nvidia or ATI chip - then it will be OK.



    That depends, but I think you already know that. Bandwidth and data transfers become real issues; this could be significant if you lose a DRAM channel. To some, "it depends" sounds like a cop-out, but it really isn't, as utilization can vary widely with usage.
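
    As a rough back-of-the-envelope (assuming DDR3-1066 purely for illustration; actual memory configurations vary):

    Code:
    /* Why losing a DRAM channel hurts when an integrated GPU shares
       system memory. DDR3-1066 is an assumed, illustrative config. */
    #include <stdio.h>

    int main(void) {
        double per_channel = 1066e6 * 8 / 1e9; /* 1066 MT/s x 8 bytes ~ 8.5 GB/s */
        printf("One channel:  %.1f GB/s\n", per_channel);
        printf("Two channels: %.1f GB/s\n", 2 * per_channel);
        /* The CPU and the integrated GPU contend for this bandwidth,
           so dropping to one channel squeezes both at once. */
        return 0;
    }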

    Quote:

    If Apple only use the Arrandale CPU, sadly I think it will be a downgrade and I certainly wouldn't take one over a 9400M-based machine.



    I'd take a wait-and-see attitude.



    Generally I would agree it would be an extremely poor choice on Apple's part. But then again, I look at the last iMac update and really wonder what the hell Apple is up to.

    Quote:



    Apple have already done this though - the new iMacs switched to the 9400M and the old 2400XTs were slightly faster. The drop wasn't significant, but a drop nonetheless. Dropping to a 4500 Intel chip would be worse, as they have compatibility as well as performance problems.



    I can't see Apple going that far backwards if they have a reasonable choice. It does look like Intel is behaving very badly with respect to new CPU design, so maybe Apple won't have a choice.



    Actually, for some platforms Apple should look at AMD. If nothing else, it would help keep Intel honest.

    Dave
  • Reply 8 of 9
    Marvin Posts: 15,310, moderator
    Quote:
    Originally Posted by FuturePastNow


    It's reasonable to assume that more stream processors and other hardware will be added (X4500 has 10 SPs). Also, shrinking it from 65nm to 45nm and leveraging the CPU cooler will allow it to be clocked significantly higher, though probably more on the desktop than mobile.



    To equal the 9400M, Intel would have to take the shader clock from 533MHz to the 9400M's 1.1GHz and go from 10 SPs to the 9400M's 16 - more than double the clock speed plus over 50% more SPs. It's possible, but I'd imagine part of the die shrink is simply to fit it inside the CPU. Plus, we are comparing Intel's offering at least six months from now with Nvidia's current offering.
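
    As a quick sanity check, treating SP count times shader clock as a crude throughput proxy (it ignores architecture, drivers, and memory bandwidth, so take it loosely):

    Code:
    /* Crude shader-throughput comparison using the figures above. */
    #include <stdio.h>

    int main(void) {
        double x4500  = 10 * 0.533; /* GMA X4500: 10 SPs at 533 MHz */
        double nv9400 = 16 * 1.1;   /* GeForce 9400M: 16 SPs at 1.1 GHz */
        printf("X4500: %.2f SP-GHz, 9400M: %.2f SP-GHz\n", x4500, nv9400);
        printf("Intel needs ~%.1fx the raw shader throughput to catch up\n",
               nv9400 / x4500);
        return 0;
    }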



    As mentioned, Nvidia will have a new iGT209 chip out on 40nm in Q3 this year. The 9400M is only just able to play Crysis at the lowest settings. If the iGT209 matches the 9600M GT (basically double the SPs of the 9400M), it would give even Apple's lowest-end machines some pretty decent performance.



    Quote:
    Originally Posted by FuturePastNow


    It probably will not be an improvement over the GeForce 9400, but it should perform similarly, and it will likely be cheaper.



    It may be cheaper if they go for Intel chipsets, but I don't know how Nvidia's chipsets compare with Intel's on price. Plus, it's not going to affect the rest of the lineup, as only Arrandale has the GPU inside, not the higher quad cores. If they stick with Nvidia boards, it won't save any money; you'll just be getting an extra GPU chip for free.



    Quote:
    Originally Posted by Wizard69


    GPUs really have been that bad from Intel.



    That's really the issue. It's not that Intel could never better themselves, but their track record is appalling. They have consistently produced the lowest-performing chips, which cause all manner of graphics glitches and incompatibilities.



    By contrast, Nvidia's current chips are fully supported in all software packages and perform extremely well.



    I agree with the stance of not judging them before they have a chance to prove themselves, but similarly there's no reason to assume it will be different this time round. If they do something that matches the 9400M, I'll be impressed, but at the same time it will probably still be inferior to Nvidia's alternative.



    What disappoints me is that Nvidia will inevitably lose the race even though they've been consistently better over the years. Once graphics get inside the CPU, it's the beginning of the end for Nvidia. ATI/AMD will do the same.



    Once Intel can squeeze 100 x 1GHz SPs inside a CPU, Nvidia is finished. I'd say 3-5 years tops.
  • Reply 9 of 9
    benroethig Posts: 2,782, member
    If the integrated graphics available don't do the job, I could see Apple returning to a dedicated chip. I don't see them taking a step back again in that direction.