Intel at CES to show off next generation of Apple-bound Sandy Bridge processors


Comments

  • Reply 121 of 136
    Marvin Posts: 15,445 (moderator)
    Quote:
    Originally Posted by wizard69 View Post


    Yeah great; test the GPU at the lowest settings and then declare it awesome.



    A 10% boost might not be worth it for some people, especially if the GPU tanks at higher quality levels.



    They tested it at higher quality levels later in the benchmarks and it came out only slightly slower than the 320M. StarCraft II came out at 20 FPS vs 27 FPS on the 320M, though, and in that particular game the difference is between playable and unplayable before having to reduce settings.



    It is impressive that this GPU is inside the CPU and matches NVidia's fastest-ever IGP. They have had 12 months to catch up, of course, but it's a decent jump. Doubling it with Ivy Bridge will be amazing.



    Just imagine an entry-level quad-core CPU with a GPU that's as fast as a 9800M GS, plays Crysis on high quality and gets 1100 FPS in Quake 3. I remember when Quake 3 was the ultimate benchmark and computers struggled to hit 30 FPS; now, 10 years later, a mobile GPU can hit 1100 FPS.



    By 2013 it will double again, and at that point we're into the realm of 5750-class desktop cards, all inside a CPU die. Higher-end machines may still use dedicated cards, but developers don't have the time to build games with the level of detail required to push past a certain graphical level, and there is a constant drive toward lower power consumption and smaller form factors. I think dedicated cards will die down to the point where they are not economically sustainable within a few years.
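
    A rough sketch of the arithmetic behind those claims, using only the figures quoted in this post (30 FPS a decade ago, 1100 FPS now, and a 2x-per-generation projection). The numbers are illustrative, not new benchmark data:

    ```python
    import math

    # Quake 3 figures quoted above: ~30 FPS ten years ago vs ~1100 FPS on a
    # current mobile GPU. This is only the arithmetic, not new measurements.
    old_fps, new_fps, years = 30, 1100, 10

    speedup = new_fps / old_fps                 # ~37x overall
    doubling_time = years / math.log2(speedup)  # ~1.9 years per doubling

    print(f"Overall speedup: ~{speedup:.0f}x over {years} years")
    print(f"Implied doubling time: ~{doubling_time:.1f} years")

    # The 2x-per-generation projection described above (illustrative units,
    # where 1.0 = the Sandy Bridge IGP in 2011):
    perf = {2011: 1.0}
    perf[2012] = perf[2011] * 2  # "doubling it with Ivy Bridge"
    perf[2013] = perf[2012] * 2  # "by 2013 it will double again"
    for year, p in perf.items():
        print(f"{year}: ~{p:.0f}x the 2011 integrated GPU")
    ```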
  • Reply 122 of 136
    wizard69 Posts: 13,377 (member)
    Quote:
    Originally Posted by solipsism View Post


    I've never read an AnandTech article that made me think they were getting paid to speak favourably. Which previous reviews have they done that were fudged?



    It is an overall feeling I get when reading his articles. Plus, it is pretty obvious that he is highly favored by Intel, and I don't think that is accidental.



    In any event I do not see his articles as balanced at all.

    Quote:

    I don't get this, on two points. 1) An HTPC doesn't have to be a supercomputer, and the Mac Mini with the C2D + 320M is more than sufficient for streaming or decoding 1080p MKV files while pushing to an HDTV.



    There are a couple of things that cause me to say this. One is that SB ought to lower power consumption, which is always a good thing for an always-on device. The second is that an HTPC needs to be able to do more than play movies.



    At least in my case, I see an HTPC as more than a movie player.

    Quote:

    2) The Mac Mini as an HTPC is an expensive but poor option unless you're using a UI designed for it, which the Mac Mini does not come with. Front Row is not an HTPC UI; it's a 10-foot UI, but it barely has the parts needed and still requires a keyboard/mouse to operate half the time.



    You aren't forced to use Apple's software.
  • Reply 123 of 136
    nvidia2008 Posts: 9,262 (member)
    Well, the benchmarks are out at Anandtech (http://www.anandtech.com/show/4083/t...i3-2100-tested)



    As expected, Sandy Bridge has a great CPU, but the GPU is still...

    RUBBISH, PLAIN AND SIMPLE.



    No way even the lowest end MacBook would have Sandy Bridge-only graphics. End of story.



    (POST NO.002 For Reference)
  • Reply 124 of 136
    wizard69 Posts: 13,377 (member)
    Quote:
    Originally Posted by nvidia2008 View Post


    Well, the benchmarks are out at Anandtech (http://www.anandtech.com/show/4083/t...i3-2100-tested)



    As expected, Sandy Bridge has a great CPU, but the GPU is still...

    RUBBISH, PLAIN AND SIMPLE.



    No way even the lowest end MacBook would have Sandy Bridge-only graphics. End of story.



    (POST NO.002 For Reference)



    Imagine what it will be like with Apple drivers! Even with the negatives I can still see Apple implementing the chip as it will likely lead to a compelling lower cost design.
  • Reply 125 of 136
    Marvin Posts: 15,445 (moderator)
    Quote:
    Originally Posted by nvidia2008 View Post


    Well, the benchmarks are out at Anandtech (http://www.anandtech.com/show/4083/t...i3-2100-tested)



    As expected, Sandy Bridge has a great CPU, but the GPU is still...

    RUBBISH, PLAIN AND SIMPLE.



    No way even the lowest end MacBook would have Sandy Bridge-only graphics. End of story.



    (POST NO.002 For Reference)



    It did well in some benchmarks, but when they get to "Call of Duty: Black Ops is basically unplayable on Sandy Bridge integrated graphics", that's enough to kill it dead. The 320M can play Black Ops above minimum settings and it's playable, even on the MBA:



    http://www.youtube.com/watch?v=FWfpJNgrMo8



    Mafia II at 20 FPS on low quality is not good enough.



    I'd still say the HD 3000 is a decent attempt by Intel, but it falls short of NVidia's offerings from last year, and the HD 2000 is pointless to include at all, as Anand says.



    I still think it's possible for Apple to switch to SB-only graphics, but I hope they don't. Dedicated cards draw little enough power that they don't have to.



    One interesting thing at CES is that NVidia announced they are making ARM CPUs now. That will be an interesting move, along with Microsoft's ARM compatibility in the OS.
  • Reply 126 of 136
    Quote:
    Originally Posted by wizard69 View Post


    Imagine what it will be like with Apple drivers! Even with the negatives I can still see Apple implementing the chip as it will likely lead to a compelling lower cost design.



    Based on what I've been reading through these threads about SB and its crappy graphics, does this mean the upcoming MacBook Pro 15in/17in (not bothering with the 13in) will have crappy Intel graphics? Or will Apple still be able to implement amazing AMD/Nvidia graphics in it (along with the crappy Intel graphics for low video usage)?

    The CPU and GPU talk is really confusing me. I understand SB will be a lot faster, but will the graphics (again, the 15/17in ones) be an upgrade over the current ones, or will they be slower?
  • Reply 127 of 136
    wizard69 Posts: 13,377 (member)
    Quote:
    Originally Posted by blueeddie View Post


    Based on what I've been reading through these threads about SB and its crappy graphics, does this mean the upcoming MacBook Pro 15in/17in (not bothering with the 13in) will have crappy Intel graphics?



    I'd be shocked if all the MBPs used SB-only graphics. It isn't right for the markets Apple targets with those machines. However, I can see Apple coming out with an SB-only MacBook, possibly at a greatly reduced price.

    Quote:

    Or will Apple still be able to implement amazing AMD/Nvidia graphics in it (along with the crappy Intel graphics for low video usage)?



    Well, if they don't, MBP sales will tank. The only machine in question is the 13" MBP. They have many options for the 13", but I don't think SB is ready GPU-wise. Apple could actually beef up the Core 2 one more time in the 13", though I think re-jigging for SB and a discrete GPU would be the likely course of action.

    Quote:

    The CPU and GPU talk is really confusing me. I understand SB will be a lot faster, but will the graphics (again, the 15/17in ones) be an upgrade over the current ones?



    Only if they continue to implement a discrete GPU. I would have to say they have to for the 15" and 17" machines, as the GPU in SB is not suitable for them. I'm not sure where they will go with the 13" machine.

    Quote:

    Or will they be slower?



    It would be totally destructive to lower the video performance on the MBPs, especially when they are a bit sucky to begin with. I really don't think people would buy them; it has to do with the expectations of the "pro" market. In a MacBook, though, I could see an SB-only implementation, especially if it allows Apple to come out with a lower-cost machine.



    Then there is AMD, which honestly seems more aligned with Apple than Intel is with respect to a future hardware vision. In the end we have to wait for Apple to deliver.
  • Reply 128 of 136
    Quote:
    Originally Posted by wizard69 View Post


    I'd be shocked if all the MBPs used SB-only graphics. It isn't right for the markets Apple targets with those machines. However, I can see Apple coming out with an SB-only MacBook, possibly at a greatly reduced price.



    Well, if they don't, MBP sales will tank. The only machine in question is the 13" MBP. They have many options for the 13", but I don't think SB is ready GPU-wise. Apple could actually beef up the Core 2 one more time in the 13", though I think re-jigging for SB and a discrete GPU would be the likely course of action.



    Only if they continue to implement a discrete GPU. I would have to say they have to for the 15" and 17" machines, as the GPU in SB is not suitable for them. I'm not sure where they will go with the 13" machine.





    It would be totally destructive to lower the video performance on the MBPs, especially when they are a bit sucky to begin with. I really don't think people would buy them; it has to do with the expectations of the "pro" market. In a MacBook, though, I could see an SB-only implementation, especially if it allows Apple to come out with a lower-cost machine.



    Then there is AMD, which honestly seems more aligned with Apple than Intel is with respect to a future hardware vision. In the end we have to wait for Apple to deliver.



    Thanks a lot for the reply and answers. You definitely cleared up a lot of questions for me.
  • Reply 129 of 136
    wizard69 Posts: 13,377 (member)
    Quote:
    Originally Posted by blueeddie View Post


    Thanks a lot for the reply and answers. You definitely cleared up a lot of questions for me.



    Just remember this is speculation / educated guessing on my part. Hopefully we will know what is up by the end of the month. The longer the MBPs remain without an update, the greater the chance of a major overhaul, in my mind.
  • Reply 130 of 136
    john.b Posts: 2,742 (member)
    Pretty much what I said it would take to get an acceptable IGP on an Intel CPU:



    Ars: Intel/NVIDIA bombshell: look for NVIDIA GPU on Intel processor die



    Think this will kill demand for Sandy Bridge?
  • Reply 131 of 136
    backtomac Posts: 4,579 (member)
    Quote:
    Originally Posted by John.B View Post


    Pretty much what I said it would take to get an acceptable IGP on an Intel CPU:



    Ars: Intel/NVIDIA bombshell: look for NVIDIA GPU on Intel processor die



    Think this will kill demand for Sandy Bridge?



    Looks like a nice plan B for Intel. If AMD's Fusion is competitive and gains any traction in the market, then Intel can counter pretty quickly and effectively by grafting an NVIDIA-designed GPU onto their brawny CPUs.



    Makes me think that Fusion might be pretty good if and when it ever arrives. It also gives me serious doubts about Larrabee. I know Programmer has said he thinks Intel will still derive successful products from Larrabee, but based on this that seems in doubt, IMO.
  • Reply 132 of 136
    nht Posts: 4,522 (member)
    Quote:
    Originally Posted by John.B View Post


    Pretty much what I said it would take to get an acceptable IGP on an Intel CPU:



    Ars: Intel/NVIDIA bombshell: look for NVIDIA GPU on Intel processor die



    Think this will kill demand for Sandy Bridge?



    That is both awesome news and terrible news



    Awesome because Apple will use the new SB CPU with a solid nvidia IGP



    Terrible because Apple will use the new SB CPU with a solid nvidia IGP



    I'm thinking the quick refresh to SB for Apple's lower-end lineup just took a big hit. Sigh. And I really wanted a new mini soon.
  • Reply 133 of 136
    Quote:
    Originally Posted by John.B View Post


    Pretty much what I said it would take to get an acceptable IGP on an Intel CPU:



    Ars: Intel/NVIDIA bombshell: look for NVIDIA GPU on Intel processor die



    Think this will kill demand for Sandy Bridge?



    That's great, except that it's not true at all. Licensing Nvidia's IP couldn't be farther from using an Nvidia-designed GPU.
  • Reply 134 of 136
    Quote:
    Originally Posted by Marvin View Post


    One interesting thing at CES is how NVidia announced they are making ARM CPUs now. That will be an interesting move along with Microsoft's ARM compatibility in the OS.



    Look at how good the 1GHz handheld ARMs are and imagine them with an architecture designed for desktops, rather than low power, running at 3GHz. Zowie!
  • Reply 135 of 136
    wizard69 Posts: 13,377 (member)
    Quote:
    Originally Posted by randian View Post


    Look at how good the 1GHz handheld ARMs are and imagine them with an architecture designed for desktops, rather than low power, running at 3GHz. Zowie!



    ARM hardware is designed for very low power points commensurate with good performance. It doesn't even remotely play in desktop territory, and honestly isn't even 64-bit. Even at 3GHz, an ARM CPU would have serious trouble trying to compete against an x86 platform running at the same clock rate.



    That is, if you measure performance as the ability to get work done. If you measure performance by looking at power usage first, then of course ARM wins. Every time a discussion about ARM performance crops up, I get a little disturbed, as people seem to equate system performance with CPU performance, and frankly on many ARM systems it isn't. ARM works for Apple and others because they lean on dedicated hardware for video playback and on the GPU as much as possible, so that the CPU does little heavy lifting. When an ARM system is put in the position of having to exercise its CPU to deliver app functionality, it often falls flat on its face.
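
    To make those two yardsticks concrete, here's a tiny sketch comparing a ranking by raw throughput with a ranking by performance per watt. The figures are invented placeholders purely for illustration, not benchmark results:

    ```python
    # Two ways to rank CPUs, as discussed above: raw throughput vs perf/watt.
    # All figures below are made-up placeholders, not real measurements.
    chips = {
        "x86 laptop CPU":   {"work_per_sec": 100.0, "watts": 35.0},
        "ARM handheld CPU": {"work_per_sec": 15.0,  "watts": 2.0},
    }

    by_throughput = max(chips, key=lambda n: chips[n]["work_per_sec"])
    by_perf_per_watt = max(chips, key=lambda n: chips[n]["work_per_sec"] / chips[n]["watts"])

    print(f"Winner by raw throughput:       {by_throughput}")
    print(f"Winner by performance per watt: {by_perf_per_watt}")
    ```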
  • Reply 136 of 136
    Quote:
    Originally Posted by wizard69 View Post


    ARM hardware is designed for very low power points commensurate with good performance. It doesn't even remotely play in desktop territory, and honestly isn't even 64-bit.



    I'm pretty sure that's why I said "imagine them with an architecture designed for desktops, rather than low power".