I wrote about this at length a few months ago, and I was definitely bashing Intel heavily.
Larrabee was "postponed" several months ago, with Intel insisting it was still going forward somewhere, somehow. I called it vapourware at the time, and indeed, I'm just happy to see Larrabee roll off into faildom.
The only people making any real sense in "graphics" right now are AMD/ATI: cool-running, great performance, sensible architecture.
At the end of the day, though, with Sandy Bridge, Intel graphics will be standard for everyone. Most people just want to play some MMO stuff and watch HD video. OpenCL is still a few years off, and PC gaming, while not dead, is niche nowadays, especially with most people buying laptops.
Why would Intel even bother with Larrabee? With Nvidia locked out of the chipset market and AMD a distinct but not overwhelming competitor, Intel can keep up their Bundlegate ~ essentially dumping nonsense GPUs onto users.
As for OpenCL, despite the promise, a year or so on we haven't seen anything compelling. In fact, the market has devolved* toward the iPad and, my goodness, massive numbers of netbooks and cheap, cheaper, cheapest PC laptops.
For most users nowadays, all the compute power they need goes to Facebook and videos.
*I don't necessarily mean this in a derogatory way w.r.t. the iPad.
Comments

On OpenCL, despite the promise, a year or so on, we haven't seen anything compelling.
These things take time to come to fruition. Everyone these days is into instant gratification, but realize that development timelines are measured in years.
Fair enough, but maybe I bought into the OpenCL RDF a bit too much when Snow Leopard came out. At that time I kept thinking that even my 9400M would, after at least 1 year, have *some* use. I know it does, in various aspects of Mac OS and software... But, Bertrand Serlet had me dreaming big.
Keep the dream alive. Software that uses it will slowly show up, and future hardware will be more general-purpose, making it easier to use effectively. These things take time, and lots of it. The spec was only ratified last year, the first implementation shipped in September with Snow Leopard, and since then developers who need the compute power have been learning and experimenting with it. This isn't like switching to a new C++ compiler: it requires learning on the developers' part, then time to figure out how to apply it to their applications, plus time for the tech to mature, bugs to get hammered out, optimizations to be made, and so on. Not to mention figuring out what to use it for in the first place, which is less trivial than you might imagine.