Sandy Bridge CPU preview at Anandtech


  • Reply 21 of 63
    Quote:
    Originally Posted by wizard69 View Post


    The general user simply isn't going to understand or even want to hear about heterogeneous systems or the trade-offs between the two approaches. They will simply leave it up to the software vendor to tell them which is the better choice for the program they are running. That is, the software vendor will be expected to spec a minimal GPU for exceptional performance with their code.



    Sorry, wasn't very clear about who I meant by "people" in that context. Wasn't referring to the general user... was referring to technically oriented sorts, including developers.



    Quote:

    GPUs in general have outstripped general CPUs in the way they have gained performance in recent years. This is what makes GPUs so useful for certain classes of problems. The fact that they have far less baggage to maintain means that they will continue to advance at a very fast rate performance-wise.



    I don't think GPUs have had such a rush because of their lack of baggage. Lack of baggage didn't help PowerPC or IA-64 a whole lot in the face of x86/x64. GPUs have experienced a rapid rate of improvement mostly because they have been able to carefully constrain the problem they are solving -- doing more-or-less the same thing to lots of uniformly organized pieces of data.
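
    A concrete way to see that "same thing to lots of uniform data" point is a SAXPY-style loop; the function and sizes here are just an illustration, not anything from the article or the chips being discussed:

    ```c
    #include <stddef.h>

    /* SAXPY: y[i] = a*x[i] + y[i].  Every iteration applies the same
     * arithmetic to independent, uniformly laid-out elements, so a GPU can
     * hand each element (or a small block of them) to one of its thousands
     * of threads.  Branchy, pointer-chasing code has no such mapping, which
     * is part of the "baggage" general CPUs still have to carry. */
    void saxpy(size_t n, float a, const float *x, float *y)
    {
        for (size_t i = 0; i < n; ++i)
            y[i] = a * x[i] + y[i];
    }
    ```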



    Quote:

    I'm not willing to throw in the towel for AMD yet.



    If you read my previous post you'll see that I'm not either, but in the long run I don't have a very rosy prognosis for them. I'm also not so pessimistic about Intel's GPU work, nor so optimistic about ATI's legacy. Larrabee may not have succeeded (yet), but it did bring a lot of people into Intel who "get it" when it comes to graphics. They're still there.
  • Reply 22 of 63
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by Programmer View Post


    Sorry, wasn't very clear about who I meant by "people" in that context. Wasn't referring to the general user... was referring to technically oriented sorts, including developers.



    Exactly, and it is up to the developer to figure out how to structure the program for best performance. GPUs simply aren't useful for many problems, but where they can be leveraged, programmers should be expected to do so.

    Quote:

    I don't think GPUs have had such a rush because of their lack of baggage. Lack of baggage didn't help PowerPC or IA-64 a whole lot in the face of x86/x64.



    Yes, but there is one very significant difference: all computers need to have some sort of GPU anyway, at least computers used for common apps on the desk or lap. This is very important, as GPUs will never go away.

    Quote:

    GPUs have experienced a rapid rate of improvement mostly because they have been able to carefully constrain the problem they are solving -- doing more-or-less the same thing to lots of uniformly organized pieces of data.



    That is exactly what makes GPUs so powerful for certain classes of programs.

    Quote:







    If you read my previous post you'll see that I'm not either, but in the long run I don't have a very rosy prognosis for them. I'm also not so pessimistic about Intel's GPU work, nor so optimistic about ATI's legacy.



    My personal opinion is that we have a very long way to go before GPUs are powerful enough that development will cease. In fact, with the high-density displays that will be possible very soon now, I suspect the average GPU will be strained. Imagine a 27" iMac with a 250 to 300 dpi screen. GPU development won't slow down anytime soon.

    Quote:

    Larrabee may not have succeeded (yet), but it did bring a lot of people into Intel who "get it" when it comes to graphics. They're still there.



    Well, hopefully they are working in seclusion to surprise us all. Honestly, though, I'm not convinced, because it seems to be Intel's goal to supply minimal graphics systems. They simply want something to go after the bulk of the business, the "good enough" market.





    Dave
  • Reply 23 of 63
    mcarling Posts: 1,106 member
    Quote:
    Originally Posted by backtomac View Post


    Intel have become somewhat predictable in regard to performance improvements with the 'tick, tock' CPU upgrade cycles.



    The tocks (architectural changes) give about a 20% improvement. The ticks (die shrinks) give about a 5% clock-speed improvement, better battery life for the mobile parts, and a couple more cores on the high-end parts.



    For Macs it seems the best bet is to wait for the architectural changes.



    Each die shrink gives about double the performance per watt consumed. Since mobile performance is limited by heat dissipation and battery life, most of the advances for laptops come with the die shrinks, and the new micro-architectures offer very little. On the desktop it's a different story.
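
    A rough back-of-the-envelope sketch of that rule of thumb (the 2x-per-shrink figure is the poster's estimate, not an Intel spec, and the 35 W budget is just an assumed example):

    ```c
    #include <stdio.h>

    /* Toy model of the "double the perf/W per shrink" rule of thumb:
     * with a laptop pinned to a fixed thermal budget, achievable
     * performance scales with perf/W, so each shrink roughly doubles it;
     * equivalently, the same performance costs roughly half the power. */
    int main(void)
    {
        double perf = 1.0;   /* relative performance at the fixed TDP      */
        double watts = 35.0; /* power needed for the baseline performance  */

        for (int shrink = 0; shrink <= 3; ++shrink) {
            printf("shrink %d: %gx performance at full TDP, or %g W for baseline\n",
                   shrink, perf, watts);
            perf *= 2.0;     /* the assumed ~2x per-shrink gain */
            watts /= 2.0;
        }
        return 0;
    }
    ```
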
  • Reply 24 of 63
    backtomac Posts: 4,579 member
    Quote:
    Originally Posted by mcarling View Post


    Each die shrink gives about double the performance per watt consumed. Since mobile performance is limited by heat dissipation and battery life, most of the advances for laptops come with the die shrinks...



    Well then Intel are cheating us.



    Do you think you get double the performance or twice the battery life with each die shrink?
  • Reply 25 of 63
    Anand the shrimp is a shill for Intel, a boring marketing site full of PC trolls that climax over minor tech details.



    Intel is ...ed because they failed to develop even a barely comparable GPU to Nvidia and AMD. And GPUs are gaining increasing relevance in both laptops and desktops. Whatever minor x86 gains or die shrinks they can leverage via their large infrastructure, they can't mask their glaring inadequacy by pre-shipping some kid a new chip and having the shill chorus praise them. They also managed to screw Nvidia AND Apple over with the stunt they pulled in the i-something chips of including an unbearably crap and backward non-OpenGL igfx on the same die. A cheap, bullish way to shove an anachronistic product down everyone's throat and do away with Nvidia, all in one coup. It's the reason why the MacBooks are stuck with the cores and why the Air has taken aeons to be updated.



    I don't think Apple will take this lightly. Apple doesn't care what's inside, and Apple customers don't care about lists of specs either, as long as the end product is effective. And I think that, provided AMD overcome some hurdles, they pose a very interesting proposition for Apple for a tight, integrated and exclusive partnership that will give Apple the best graphics on the globe and some great APUs (CPUs plus GPUs) too, at very good prices. I think AMD will finally join Apple for their mutual benefit.
  • Reply 26 of 63
    I don't understand the hangup over OpenCL or GPUs in general under OS X; I know Apple wants OpenCL, but it hasn't gone anywhere. Additionally, Apple is behind the curve on OpenGL extensions anyhow.



    Apple had no problem using Intel GPUs before (GMA 950 and X3100) - they were crap, but that didn't stop them, at least not until OpenCL, which again, has gone nowhere. It seems more like a carrot and stick to me.



    Honestly, I could probably get by with Intel GMA HD graphics right now on my PC (using an AMD 5770), as it would still accelerate video playback and Flash, if not for the occasional game. That's probably all the average consumer would want as well, and if Intel could fit the bill, that's probably the ticket.
  • Reply 27 of 63
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by myapplelove View Post


    Anand the shrimp is a shill for Intel, a boring marketing site full of PC trolls that climax over minor tech details.



    Intel is ...ed because they failed to develop even a barely comparable GPU to Nvidia and AMD. And GPUs are gaining increasing relevance in both laptops and desktops. Whatever minor x86 gains or die shrinks they can leverage via their large infrastructure, they can't mask their glaring inadequacy by pre-shipping some kid a new chip and having the shill chorus praise them. They also managed to screw Nvidia AND Apple over with the stunt they pulled in the i-something chips of including an unbearably crap and backward non-OpenGL igfx on the same die. A cheap, bullish way to shove an anachronistic product down everyone's throat and do away with Nvidia, all in one coup. It's the reason why the MacBooks are stuck with the cores and why the Air has taken aeons to be updated.



    I don't think Apple will take this lightly. Apple doesn't care what's inside, and Apple customers don't care about lists of specs either, as long as the end product is effective. And I think that, provided AMD overcome some hurdles, they pose a very interesting proposition for Apple for a tight, integrated and exclusive partnership that will give Apple the best graphics on the globe and some great APUs (CPUs plus GPUs) too, at very good prices. I think AMD will finally join Apple for their mutual benefit.



    In this regard we share a common view. AMD has the potential to place a lot of their Bobcat-based Ontario chips, even in Apple products. If the rumors are true, they would be perfect for the Air line. Given a high enough clock, they might even power a Mini and some of the MacBooks. In the short term, Ontario is likely the only path to quad cores in low-end laptops.



    I'm really hoping AMD is successful here, as they could put a hurt on Intel. As indicated, Apple has to be a little bit pissed at Intel.





    Dave
  • Reply 28 of 63
    Marvin Posts: 15,580 moderator
    Quote:
    Originally Posted by guinness View Post


    I don't understand the hangup over OpenCL or GPUs in general under OS X; I know Apple wants OpenCL, but it hasn't gone anywhere. Additionally, Apple is behind the curve on OpenGL extensions anyhow.



    OpenCL development needs to ramp up but the driver support needs to be complete. OpenCL 1.1 was only ratified on June 14th this year:



    http://www.khronos.org/news/press/re...uting-standard



    It's going through the necessary cycle of testing, finding missing features and then adding them. Once that reaches a certain level, it can be used more. Ideally, OpenGL shouldn't need extensions at all. If you take the example of rendering engines like VRay, Mental Ray and Renderman, they are used to produce photorealistic 3D content and the "extensions" are basically just shader kernels written in a computational language. If you need a new lighting model such as ambient occlusion or subsurface scattering, you just write one in software and you can reuse it on any machine. Slower machines simply run it slower or you can set options in the shader functions to take fewer samples.
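
    Just to make the "a lighting model is just a shader kernel" idea concrete, here is a toy sketch in plain C, not VRay's or Renderman's actual shading API: a Lambert-style surface lit by a small area light, where the sample count is exactly the kind of quality knob described above.

    ```c
    #include <math.h>
    #include <stdlib.h>

    /* Toy shading kernel: Lambertian diffuse under a small area light,
     * approximated by averaging `samples` jittered light directions.
     * A fast machine runs it with many samples, a slow one with few;
     * the lighting model itself is just code, not a fixed extension.
     * Assumes the surface normal `n` is unit length. */
    static float shade_lambert(const float n[3], const float l[3],
                               float light_radius, int samples)
    {
        float sum = 0.0f;
        for (int s = 0; s < samples; ++s) {
            /* crude jitter of the light direction inside a small box */
            float jl[3] = {
                l[0] + light_radius * ((float)rand() / RAND_MAX - 0.5f),
                l[1] + light_radius * ((float)rand() / RAND_MAX - 0.5f),
                l[2] + light_radius * ((float)rand() / RAND_MAX - 0.5f),
            };
            float len = sqrtf(jl[0]*jl[0] + jl[1]*jl[1] + jl[2]*jl[2]);
            float ndotl = (n[0]*jl[0] + n[1]*jl[1] + n[2]*jl[2]) / len;
            sum += ndotl > 0.0f ? ndotl : 0.0f;  /* ignore back-facing light */
        }
        return sum / (float)samples;             /* averaged diffuse term */
    }
    ```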



    This is where GPUs need to get to and once they do, it will have a big impact on not only games but movie production.



    If Apple got Pixar to develop a Renderman-compliant engine using OpenCL that can integrate with physics engines like Havok etc and got it to work on the iPhone as well as in Motion/Final Cut, well that would be pretty neat.



    Quote:
    Originally Posted by guinness View Post


    Honestly, I could probably get by with Intel GMA HD graphics right now on my PC (using an AMD 5770), as it would still accelerate video playback and Flash, if not for the occasional game. That's probably all the average consumer would want as well, and if Intel could fit the bill, that's probably the ticket.



    It will get to that point. Despite Intel's slowness, it's a long game and once they hit that threshold of the GPU being adequate for most things then it won't matter one bit to the consumer. It's unfortunate that Intel might win in the end as they have always by far been the worst in this field and simply don't deserve to win.
  • Reply 29 of 63
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by guinness View Post


    I don't understand the hangup over OpenCL or GPUs in general under OS X; I know Apple wants OpenCL, but it hasn't gone anywhere. Additionally, Apple is behind the curve on OpenGL extensions anyhow.



    I'm not sure where you get your info or where your opinions come from, but I don't think they are supported by the facts. Apple has been very successful with OpenCL, with just about every GPU vendor on board with support, including Imagination. On the implementation side, I've seen much that indicates people are having success with OpenCL.



    OpenCL doesn't have anything to do with OpenGL extensions, other than the possibility of implementing those extensions in OpenCL. OpenCL's future rests in uses outside of the graphics world. That doesn't mean OpenCL won't be important in the graphics world, just that graphics is a limited world.

    Quote:



    Apple had no problem using Intel GPUs before (GMA 950 and X3100) - they were crap, but that didn't stop them, at least not until OpenCL, which again, has gone nowhere.



    Your grasp of history is wanting here. The first Intel machines had these GPUs because Apple teamed up with Intel to switch to x86, at which point consumers started to complain loudly. OpenCL had very little to do with the move away from Intel integrated graphics.



    In any event, where do you get this idea that OpenCL has gone nowhere? Apple uses it, as do third-party vendors. You wouldn't know whether a piece of software used OpenCL or not, because the recommendation is to fall back to other resources like the main CPU. The only potential indicator of OpenCL use is faster execution.
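
    For what it's worth, that fallback is visible right in the standard OpenCL host API: ask for a GPU device first and quietly take a CPU device if none is usable. A minimal sketch, with error handling mostly trimmed and the helper name pick_device just for illustration:

    ```c
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>
    #else
    #include <CL/cl.h>
    #endif

    /* Pick a GPU device if one is present, otherwise fall back to the CPU.
     * A user never sees which one was chosen; the only visible difference
     * is how fast the kernels run. */
    static cl_device_id pick_device(void)
    {
        cl_platform_id platform;
        cl_device_id device = NULL;

        if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS)
            return NULL;

        /* Prefer a GPU... */
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL)
                != CL_SUCCESS) {
            /* ...but fall back to the host CPU when no usable GPU exists. */
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);
        }
        return device;
    }
    ```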



    In any event, please tell us why you think OpenCL has gone nowhere.

    Quote:

    It seems more like a carrot and stick to me.



    Honestly, I could probably get by with Intel GMA HD graphics right now on my PC (using an AMD 5770), as it would still accelerate video playback and Flash,



    Maybe, maybe not; it depends upon your expectations and usage. For many, Intel's GPUs are so slow as to be worthless. Some will be happy, but then again, some people still drive VW buses built in the sixties.

    Quote:

    if not for the occasional game.



    This is the sort of garbage that really frustrates me. You do realize that a good GPU can be put to advantage for a lot more than games, right? Maybe the occasional game is the only stress you put on the GPU in your system. If so, great, but I offer up this: in the future you won't know exactly which parts of the OS or installed apps are using OpenCL code or the GPU in general.

    Quote:

    That's probably all the average consumer would want as well, and if Intel could fit the bill, that's probably the ticket.



    Sadly, the average consumer is not well informed. In part that is a marketing problem, one that Apple seems to have a good grasp on. They also have apps that might benefit dramatically from OpenCL. A little marketing polish on an app that demonstrates the advantages of OpenCL would keep people interested in buying a good GPU.



    Besides all of the above, people miss important elements in the discussion. Number one is that I don't believe demand for GPU performance will level off any time soon. If nothing else, higher-density displays will be a huge factor. The iPhone has the Retina display, which is a huge improvement. Now imagine a 27" screen at a third, two thirds, or the same density. The more pixels to compute, the more GPU power needed.
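
    Rough pixel math for that thought experiment, taking the iPhone 4's 326 ppi as the "Retina" reference and assuming a 16:9 27" panel (the fractions are the ones mentioned above):

    ```c
    #include <math.h>
    #include <stdio.h>

    /* Pixel counts for a 27" 16:9 panel at a third, two thirds, and the
     * full iPhone-4-style 326 ppi.  More pixels means more work per frame
     * for the GPU, which is the point being made above. */
    int main(void)
    {
        const double diag_in  = 27.0;
        const double width_in  = diag_in * 16.0 / sqrt(16.0*16.0 + 9.0*9.0);
        const double height_in = diag_in *  9.0 / sqrt(16.0*16.0 + 9.0*9.0);
        const double densities[] = { 326.0 / 3.0, 326.0 * 2.0 / 3.0, 326.0 };

        for (int i = 0; i < 3; ++i) {
            double w = width_in * densities[i], h = height_in * densities[i];
            printf("%6.0f ppi: %5.0f x %4.0f = %4.1f megapixels\n",
                   densities[i], w, h, w * h / 1e6);
        }
        return 0;
    }
    ```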





    Dave
  • Reply 30 of 63
    Quote:
    Originally Posted by backtomac View Post


    Well then Intel are cheating us.



    Do you think you get double the performance or twice the battery life with each die shrink?



    Amdahl's Law, pal. The processor is only one component in a very complex system.
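
    For anyone who hasn't run the numbers, Amdahl's Law is easy to sketch: if the component you speed up only accounts for a fraction p of the total time, the overall gain is 1 / ((1 - p) + p / s). A quick illustration in C; the 60% figure is just an example, not a measurement:

    ```c
    #include <stdio.h>

    /* Amdahl's Law: speedup = 1 / ((1 - p) + p / s), where p is the fraction
     * of total time covered by the component being improved and s is how
     * much faster that component gets. */
    static double amdahl(double p, double s)
    {
        return 1.0 / ((1.0 - p) + p / s);
    }

    int main(void)
    {
        /* If the CPU accounts for, say, 60% of the time the user waits on
         * (the rest being disk, memory, GPU, network...), doubling CPU
         * performance yields only ~1.43x overall -- not 2x performance,
         * and by the same logic not 2x battery life either. */
        printf("overall speedup: %.2fx\n", amdahl(0.60, 2.0));
        return 0;
    }
    ```
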
  • Reply 31 of 63
    From a consumer perspective, I tend to agree with guinness. Unfortunately I'm a professional, so Intel's "GPUs" are of no use to me.



    I do think that if we were to stretch out GPU timelines waaay into the future, Intel's GPU philosophy may have some merit. But in the here and now, and certainly the next 5 years AFAIK, Intel's GPUs are an added cost for me, and they don't perform in any meaningful sense of the word.



    OpenCL, I think, may be more useful to the Pro market, i.e. the Get Things Done Quickly With My Expensive Gear market: 3D rendering, audio DSP, protein folding, nuclear bomb simulations, etc. I do think tapping into latent teraflop performance is a good idea™.



    OpenCL for the consumer is nice, but certainly not a deal breaker for light web surfing and email checking.
  • Reply 32 of 63
    Quote:
    Originally Posted by Marvin View Post


    OpenCL development needs to ramp up but the driver support needs to be complete. OpenCL 1.1 was only ratified on June 14th this year:



    http://www.khronos.org/news/press/re...uting-standard



    I think guinness was more pointing out that Apple may have other, more immediate challenges in anything remotely GPU-related, rather than implying OpenCL and OpenGL are related. We aren't privy to Apple's priorities, but I would assume fixing their drivers and OpenGL is more important to a majority of their users than fringe OpenCL use cases.



    Quote:

    It's going through the necessary cycle of testing, finding missing features and then adding them. Once that reaches a certain level, it can be used more. Ideally, OpenGL shouldn't need extensions at all. If you take the example of rendering engines like VRay, Mental Ray and Renderman, they are used to produce photorealistic 3D content and the "extensions" are basically just shader kernels written in a computational language. If you need a new lighting model such as ambient occlusion or subsurface scattering, you just write one in software and you can reuse it on any machine. Slower machines simply run it slower or you can set options in the shader functions to take fewer samples.



    This is where GPUs need to get to and once they do, it will have a big impact on not only games but movie production.



    Don't fully programmable shader pipes already accomplish this? Isn't that why OpenCL came about?



    Quote:

    If Apple got Pixar to develop a Renderman-compliant engine using OpenCL that can integrate with physics engines like Havok etc and got it to work on the iPhone as well as in Motion/Final Cut, well that would be pretty neat.



    It will get to that point. Despite Intel's slowness, it's a long game and once they hit that threshold of the GPU being adequate for most things then it won't matter one bit to the consumer. It's unfortunate that Intel might win in the end as they have always by far been the worst in this field and simply don't deserve to win.



    Agreed. It's almost... monopolistic or something... Forcing a product that 'might not be the best available on the market*' onto everybody's machines to increase sales.



    The only upshot is it gives AMD a very clear window of opportunity to potentially crush Intel's performance lead in the market.





    *going for understatement of the year award.
  • Reply 33 of 63
    backtomac Posts: 4,579 member
    Quote:
    Originally Posted by 1337_5L4Xx0R View Post


    From a consumer perspective, I tend to agree with guinness....

    OpenCL for the consumer is nice, but certainly not a deal breaker for light web surfing and email checking.



    There is this belief that Intel's new IGPs are 'good enough' and that they are going to cannibalize sales of discrete GPUs. Do you need a GPU capable of a teraflop of computing performance to browse the web and check email?



    That may be true, at least initially. But what's to stop iPads and the like (Android tablets) from cannibalizing them? In other words, if you don't need a dedicated GPU to do your computer tasks, browse the web and check email, do you need a quad-core CPU? Why do people need a Sandy Bridge CPU to do these things?
  • Reply 34 of 63
    Agreed, and it just makes Intel's IGP 'fusion' strategy all the more baffling, except on Atoms, where they belong.
  • Reply 35 of 63
    Quote:
    Originally Posted by backtomac View Post


    There is this belief that Intel's new IGPs are 'good enough' and that they are going to cannibalize sales of discrete GPUs. Do you need a GPU capable of a teraflop of computing performance to browse the web and check email?



    That may be true, at least initially. But what's to stop iPads and the like (Android tablets) from cannibalizing them? In other words, if you don't need a dedicated GPU to do your computer tasks, browse the web and check email, do you need a quad-core CPU? Why do people need a Sandy Bridge CPU to do these things?



    You make a good point about browsing the web and checking email, but there are plenty of "professional" computer tasks that involve lots of number crunching without fancy graphics. A powerful CPU and "free" integrated graphics makes a lot of sense in a lot of places.
  • Reply 36 of 63
    Quote:
    Originally Posted by FuturePastNow View Post


    You make a good point about browsing the web and checking email, but there are plenty of "professional" computer tasks that involve lots of number crunching without fancy graphics. A powerful CPU and "free" integrated graphics makes a lot of sense in a lot of places.



    You make a good point.



    But that raises the question: would those apps that need to do number crunching benefit from OpenCL? If so, then they would benefit from a powerful GPU.
  • Reply 37 of 63
    Bingo. And in steps AMD in 2011.
  • Reply 38 of 63
    wizard69 Posts: 13,377 member
    Quote:
    Originally Posted by backtomac View Post


    There is this belief that Intel's new IGPs are 'good enough' and that they are going to cannibalize sales of discrete GPUs. Do you need a GPU capable of a teraflop of computing performance to browse the web and check email?



    Look at the old x86 Macs and the number one complaint there: it was all directed at the crappy IGP of those machines, and rightfully so. Even for browsing the web they were a bit underpowered. A slow GUI will leave people with the impression that the machine is slow.



    So obviously you don't need that high-performance GPU. On the other hand, most people didn't want the sluggishness of Intel's solution. It is notable that Nvidia's 9400M had a dramatic impact on Apple's hardware, spurring strong sales. In many ways the 9400M is a fairly pedestrian processor, yet it had the effect of giving Apple's low-end machines respectable performance when judged against the alternatives.

    Quote:

    That may be true, at least initially. But what's to stop iPads and the like (Android tablets) from cannibalizing them? In other words, if you don't need a dedicated GPU to do your computer tasks, browse the web and check email, do you need a quad-core CPU?



    Well, you never used to need a quad core to browse, but then people started using Flash! Your point is a good one, and it depends upon what your goals are. If you assume that the only interest in a desktop is to browse, then they do tend to look overpowered. Not everybody is like that though; many look at mail and browsing as just two of the many tasks they do.



    What I seem to be hearing here is that you think that the constant rush to better and better GPUs will go away with the coming tablets. Nothing could be further from the truth. Samsung just released a little info on their new Cortex-A9-based SoC; one bullet point is 4x better 3D graphics. The iPad is already suitable for mail and browsing, and yet one evolutionary path will give us much better graphics. Why go that route? Probably because almost everybody does more than email and browsing on their computers. Even browsing can demand a lot from the GPU to support newer features like SVG.

    Quote:

    Why do people need a Sandy Bridge CPU to do these things?



    If that is all they did, then they don't need Sandy Bridge, or for that matter a PC. I for one am very surprised at how much of my email is now done on an iPhone. Even at that, I'm lusting after the iPhone 4, as it has better overall performance due in part to the GPU. Which brings up another thing: we could soon see chips in an iPad that support OpenCL. I don't expect the iPad to ever be used for high-performance computing, but I have to admit that the iPad is far more powerful than most of the computers I've ever owned. And that is the current model; put a dual-core chip in an iPad and it will be the second SMP machine I've owned.



    It is rather incredible to look at what handheld devices can do these days. My little 3G iPhone outclasses many of the computers I've used over the years. The reality is that each process shrink gives engineers a lot more room to plug functionality into a chip or computer. A lot of that space goes to the GPU because we get a big payoff there.



    One last thing. We are talking about GPUs driving current display technology. What sort of GPU is required to drive a 4K display? Or let's say Apple goes a step further with their marketing and decides all their displays should meet the Retina standard, that is, a density that exceeds the ability of most people to resolve at working distances. How much GPU power will that take for Mail? How about 3D graphics? The good thing is that higher-performance GPUs permit the move to higher-resolution displays. The original video processors had trouble with video CRTs; now we drive displays with millions of colors and pixels. One day we will have GPUs capable of driving 3D holographic projection devices on our desktops. They will still be more than you need for email, but just imagine the porn.





    Dave
  • Reply 39 of 63
    backtomac Posts: 4,579 member
    Quote:
    Originally Posted by wizard69 View Post


    What I seem to be hearing here is that you think that the constant rush to better and better GPUs will go away with the coming tablets.

    Dave



    No. I think tablets will diminish the demand for machines with IGPs.
  • Reply 40 of 63
    Quote:
    Originally Posted by backtomac View Post


    You make a good point.



    But that raises the question: would those apps that need to do number crunching benefit from OpenCL? If so, then they would benefit from a powerful GPU.



    Maybe, but even with OpenCL, the percentage of code that can run on a GPU is very small.