Apple in advanced discussions to adopt AMD chips


Comments

  • Reply 181 of 395
    Quote:

    Originally Posted by melgross

    The only chips AMD offers that are good are 4-socket devices, which Apple can't use. Intel has them as well. All are too expensive and suck too much power.



    The rest are second rate. AMD has had two good years. Those good years were only because Intel went down the wrong path. That won't happen again.



    LOL. Do you remember the days when Intel were the Anti-Christ (or was that M$?)



    Who'da thunk we have Apple fan boys giving Intel the love?



    Lemon Bon Bon.
  • Reply 182 of 395
    Yes...pretty weird...considering almost everyone gave Apple earache at the time for not going AMD...



    Lemon Bon Bon.
  • Reply 183 of 395
    stevie Posts: 956, member
    Quote:
    Originally Posted by DocNo42 View Post


    I don't think the average Apple user would notice or care if it was AMD inside instead of Intel inside.





    And for the rest, Apple will convince them that they now use the bestest CPUs in the whole world. Apple PR and advertising are second to none.
  • Reply 184 of 395
    myapplelove Posts: 1,515, member
    Quote:
    Originally Posted by zeromeus View Post


    If Apple would like to use AMD to lower their price points, good for them. If Apple uses solely AMD, I'm dropping Apple.



    Sorry, no offence intended whatsoever, but this is an enormous garbage comment. In the next couple of years AMD will again outpace Intel by all accounts, because of the merger of CPU and GPU: Intel outpaces AMD by about 20% in CPU, say, but in GPU AMD absolutely kills both Intel and the vapourware (turned software framework) that was Larrabee. Plus, from a purely technical standpoint, AMD will have a better CPU and GPU fusion, while Intel is still struggling to make this work.



    AMD and Apple will be a marriage made in heaven in, say, a couple of years. I had always believed this would eventually take place, and I've been saying it for a long time, in view of the fusion of CPU and GPU of course. My only concern is that AMD has now gone fabless, or in any case spun off its fab business, and the viability of AMD in that interim. So if they can stay afloat and focused they will deliver, but they still have to get through the hard times. Apple could easily buy AMD too, but I don't think Apple would like to scale up that much.
  • Reply 185 of 395
    stevie Posts: 956, member
    Quote:
    Originally Posted by TBell View Post


    Apple isn't going to do anything that hurts performance.



    But for Apple, performance != speed. Instead, performance = battery life.
  • Reply 186 of 395
    bugsnw Posts: 717, member
    Even though some of the debate centers on current CPU technology, I'm going out on a limb to say that AMD has shown SJ some fancy stuff coming down the pipe that has piqued his interest, especially with regard to mobile technology.



    Let's champion this potential broadening of partners if only because it will hone all three companies to produce even better technology as well as help drive prices down further.
  • Reply 187 of 395
    solipsism Posts: 25,726, member
    Quote:
    Originally Posted by Stevie View Post


    But for Apple, performance != speed. Instead, performance = battery life.



    With Apple, performance doesn't ONLY equal speed; it includes many metrics, battery life among them.





    PS: 9 posts since creating account yesterday and an inability to format a quote correctly. I'm thinking iGenius is back again.
  • Reply 188 of 395
    Quote:
    Originally Posted by Marvin View Post


    4 cores and a GPU inside a package that consumes 2.5-25W.



    The package doesn't consume 2.5-25 W, one of the cores does.
  • Reply 189 of 395
    myapplelove Posts: 1,515, member
    Quote:
    Originally Posted by iMacmatician View Post


    The package doesn't consume 2.5-25 W, one of the cores does.



    I think you are right, but you might be in the wrong forums; shouldn't you be at forums.macrumors.com?



    Here's some more recent information about where AMD is headed:



    http://www.xbitlabs.com/news/cpu/dis...Late_2010.html



    If they have the extra funding in time to ramp up production, implement the new CPU cores in the next iteration of Llano, and come down to 22 nm, they have nothing to fear from Intel. Intel, on the other hand, had better pull some amazing GPU tricks out of their sleeves if they want to keep the best-in-class position in processors, and something tells me they won't.



    There's also the dark horse of NVidia, rumored to be bringing along their own x86 lineup with integrated GPUs. But I think they can only hope for also-ran status, nothing more.
  • Reply 190 of 395
    Quote:
    Originally Posted by myapplelove View Post


    I think you are right, but you might be in the wrong forums, shouldn't you be at forums.macrumors.com?



    I have an account here too…



    Quote:
    Originally Posted by myapplelove View Post


    If they have the extra funding in time to ramp up production, implement the new CPU cores in the next iteration of Llano, and come down to 22 nm, they have nothing to fear from Intel. Intel, on the other hand, had better pull some amazing GPU tricks out of their sleeves if they want to keep the best-in-class position in processors, and something tells me they won't.



    I will say that AMD seems to be in the best position it's been in for the last 4 years, while Intel doesn't look too good. That situation was reversed in 2006/2007.



    Intel has to get Larrabee working well and released otherwise they will have nowhere to go (I think that's why the project was started). The latest roadmaps and rumors say that Larrabee is scheduled for 2011 and a CPU with integrated Larrabee cores for 2015 (which is a delay from ~2008 and ~2012 respectively), that is, unless Larrabee gets canceled or delayed again (again). So AMD could have a GPU advantage for the next half decade or so.



    22 nm for AMD won't be here until late 2012 / early 2013, but Llano's only the start of Fusion. Like you say, I expect Llano's (2012?) successor to have Bulldozer cores. The modular approach to Bulldozer means that it's possible to do things like replacing one module with a bunch of GPU cores. That could be AMD's road to tighter CPU-GPU integration. There's speculation that future Bulldozer modules could have integrated GPU cores to perform FP calculations. Whatever way it is, I have a feeling AMD chips are going to see large FP gains every year. As for Intel, they may be able to keep up (AVX) with AMD in terms of CPU cores to CPU cores, but AMD has GPU cores, while Intel looks to only have GMA for the next few years (at least on-die). Haswell in 2013 may have on-package vector coprocessors so I'm anxious to see what those are.



    Intel's trying to go the whole way with Larrabee; AMD's going one step at a time with Fusion.
  • Reply 191 of 395
    Marvin Posts: 15,324, moderator
    Quote:
    Originally Posted by iMacmatician View Post


    The package doesn't consume 2.5-25 W, one of the cores does.



    In that case, no quads on the low-end, as that's 4 x (2.5-25 W) + the GPU (probably 10 W). From the link just posted, it seems dual-core with the GPU will manage 30 W, but quad with the GPU can be as high as 59 W, and the cores adjust to fit.



    Clearly the integrated package makes sense for the low-end, but with Intel it was a no-go because their IGP is so poor. Apple can't ever upgrade the low-end CPUs beyond the Core 2 series until Intel back down and let other chipsets work.



    This leaves AMD as the only option, unless Intel buy NVidia and use their IGPs inside their processors. Some sites say Llano has a GPU with 480 stream processors. AMD typically use more SPs than NVidia, but that seems excessive in a GPU for this purpose. That's the same as the 40 nm DX11 5650 (15-19 W):



    http://www.notebookcheck.net/ATI-Mob...0.23697.0.html



    The Llano goes to 32 nm, and I imagine they will lower the clock speeds; even if they cut the speeds in half and get the power draw under 10 W, that'll match the 320M. Still dual core, but they'll get closer to Apple's projected 10-hour battery life, and the high-end will definitely get the better-performing AMD GPUs. It would be more of a side-step than an upgrade, but it sends Intel a clear message.
  • Reply 192 of 395
    Quote:
    Originally Posted by Marvin View Post


    In that case, no quads on the low-end, as that's 4 x (2.5-25 W) + the GPU (probably 10 W). From the link just posted, it seems dual-core with the GPU will manage 30 W, but quad with the GPU can be as high as 59 W, and the cores adjust to fit.



    It says "mainstream chips with two or four x86 cores will fit into a 30 W thermal envelope." That would be attained by low clock speeds.
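    The back-of-the-envelope arithmetic behind this exchange can be made explicit. A quick sketch, using only the rumored figures quoted in the thread (the per-core and GPU wattages are speculation, not official AMD specs):

```python
# TDP arithmetic for the rumored Llano configurations discussed above.
# All numbers are the thread's speculation, not official AMD figures.

def package_tdp(cores, per_core_w, gpu_w):
    """Naive upper bound: every core at its maximum draw, plus the GPU."""
    return cores * per_core_w + gpu_w

# Marvin's worst case: four cores at 25 W each plus a ~10 W GPU.
print(package_tdp(4, 25, 10))  # 110 -- far above a 30 W mobile envelope

# To fit four cores and a ~10 W GPU into the rumored 59 W quad envelope,
# each core must average much less than 25 W, i.e. lowered clocks:
budget, gpu_w, cores = 59, 10, 4
print((budget - gpu_w) / cores)  # 12.25 W per core
```

    Which is exactly the point about low clock speeds: the quad parts only fit the envelope if the cores run well below their maximum draw.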
  • Reply 193 of 395
    hmurchison Posts: 12,425, member
    Infraction for Lemon Bon Bon for mentioning "Cheap" and "Apple" in the same sentence



    lol just kidding buddy



    Look, the move to Intel wasn't so long ago that I have forgotten my rage and childish venting on these very boards. At that time I was not impressed with Intel processors (NetBurst) and couldn't understand why Apple was switching to them. I then began to do a bit of research and found out about the new Core and its design and what that was going to do for Intel. The rest, as they say, is history.



    I'm once again getting that same feeling. AMD's lineup TODAY isn't good in performance or performance-per-watt. However, I think they know this and have already made changes. The ship should be back on course with Thuban coming out very soon, and clearly they've bet the company on the Bulldozer and Bobcat cores.



    If AMD is discussing providing hardware for Macs, you can best believe that Bulldozer and Bobcat are the carrots being dangled. Every high-volume PC vendor carries AMD- and Intel-based computers in their lineup. Apple is getting a bit too big to be Intel-only.



    Frankly, I'd love to see an Opteron-based Mac Pro. Yes, Intel has the performance crown today, but I think the battle will be much tighter in 2011, and Intel refuses to beat AMD on pricing.
  • Reply 194 of 395
    Quote:
    Originally Posted by Marvin View Post


    Clearly the integrated package makes sense for the low-end but with Intel, it was a no-go because their IGP is so poor. Apple can't upgrade the CPUs ever again in the low-end beyond the Core 2 series until Intel back down and let other chipsets work.



    Personally, it's hard to see NVidia IGP chipsets being particularly relevant or scalable even if Intel gave NVidia a DMI license. DMI is meant as a low-bandwidth link to a southbridge and only offers something like PCIe x4 performance. You can't expect to hang a modern GPU off DMI, especially if OpenCL and synergy between CPU and GPU increase the need to pass data back and forth.



    What's more, IGPs work so well without dedicated memory because they generally have the shared system memory controller on die, so they get higher memory bandwidth and lower memory latency than the CPU. Any NVidia IGP chipset would have to go over the already bandwidth-constrained DMI link to access memory, which seems unworkable. If an NVidia IGP integrates its own memory controller, then you are duplicating functionality, increasing motherboard space for dedicated VRAM, and increasing power consumption, all of which goes against Apple's reasoning for wanting a good IGP. Perhaps you could have an NVidia IGP connect through both DMI and the PCIe x16 links for bandwidth, but that would certainly require complicated driver support, might break the ability to attach a discrete GPU in higher-end models, and would increase motherboard complexity and cost, having to route all those traces.



    The current on-die northbridge architecture, with built-in memory controller, PCIe controller, and IGP, just isn't suited to supporting third-party IGP chipsets. But this isn't really Intel being malicious to NVidia; since ATI is doing the same thing, this is where the industry is heading. NVidia IGP chipsets are not just a licensing issue but a technical one as well, and NVidia is simply, naturally, being squeezed out of the market. The writing's been on the wall for years, which is why I found it somewhat surprising that Apple pushed so hard with the 9400M publicity, since it was basically a dead end.
  • Reply 195 of 395
    Quote:

    Infraction for Lemon Bon Bon for mentioning "Cheap" and "Apple" in the same sentence



    lol just kidding buddy



    LOL. Heh...



    Looking forward to this year's round of iMac and Mac Pro updates, Hmurchison?



    I'm at least curious...



    ...who knows...there may be AMDs in them... (tongue in cheek.)



    Lemon Bon Bon.
  • Reply 196 of 395
    Quote:

    Look, the move to Intel wasn't so long ago that I have forgotten my rage and childish venting on these very boards. At that time I was not impressed with Intel processors (NetBurst) and couldn't understand why Apple was switching to them. I then began to do a bit of research and found out about the new Core and its design and what that was going to do for Intel. The rest, as they say, is history.



    I'm once again getting that same feeling. AMD's lineup TODAY isn't good in performance or performance-per-watt. However, I think they know this and have already made changes. The ship should be back on course with Thuban coming out very soon, and clearly they've bet the company on the Bulldozer and Bobcat cores.



    If AMD is discussing providing hardware for Macs, you can best believe that Bulldozer and Bobcat are the carrots being dangled. Every high-volume PC vendor carries AMD- and Intel-based computers in their lineup. Apple is getting a bit too big to be Intel-only.



    Frankly, I'd love to see an Opteron-based Mac Pro. Yes, Intel has the performance crown today, but I think the battle will be much tighter in 2011, and Intel refuses to beat AMD on pricing.



    A sensible post. I agree with it totally. You get the feeling some updates have been hanging around waiting for Intel. We've been here before with Moto and IBM. AMD gives Apple options. And as Apple are on the cusp of hitting 4 million computers per quarter at some point this year...it would be prudent to explore their options.



    Who knows what tricks AMD have up their sleeve. 4 million CPUs is a lot of business. That isn't pocket change. And AMD need the money. And they don't have the 'chipset' vendetta Intel have. They are bang-per-buck competitive.



    I've historically shouted for them in the fight with Intel. They're the underdog, I guess. Can't wait to see if anything comes of this. I bet we're a year or so from anything happening in the Mac range, though?



    *nods.



    Lemon Bon Bon.
  • Reply 197 of 395
    troberts Posts: 702, member
    Quote:
    Originally Posted by igxqrrl View Post


    As a computer engineer, you must surely know that compilers can *and do* optimize for a particular target, even within the same ISA. Also note that while both AMD and Intel support x86-64, they each also support their own extensions. Their implementations of virtualization and vector extensions have diverged.



    You must also know that the highest performing x86 compiler is generally acknowledged to be 'icc', and is provided by Intel. It is reasonable to assume that it is well tuned for Intel's products, and less well tuned for AMD's products.



    I believe Apple has already laid the groundwork for this type of scenario with:



    1 - Accelerate.framework, so developers can concentrate on their application while Apple deals with the specifics of each processor.



    2 - LLVM/Clang, because it is BSD-licensed, will allow Apple to optimize their compiler without having to share code that would reveal inside information about AMD CPUs and GPUs that Apple is privy to through NDAs. Apple will still share "generic" code relating to LLVM/Clang so it can continue to be improved, but the "secret sauce" will be absent.
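    The idea behind point 1, picking the best implementation for the processor you land on so developers don't have to care which vendor's chip is inside, can be sketched as runtime dispatch. This is only an illustrative sketch of the general technique, not Apple's actual mechanism; the function names and the `CPU_FEATURES` table are hypothetical, and real dispatchers query CPUID rather than a hard-coded set:

```python
# Illustrative per-CPU dispatch, the pattern a fat vector library uses to
# hide Intel-vs-AMD differences behind one API. Names are hypothetical.

def vadd_generic(a, b):
    """Portable fallback: plain element-wise add."""
    return [x + y for x, y in zip(a, b)]

def vadd_tuned(a, b):
    """Stand-in for a variant tuned for one CPU family (same results;
    in real code this would use that family's vector instructions)."""
    return [x + y for x, y in zip(a, b)]

# Hypothetical feature set; a real dispatcher would query CPUID at startup.
CPU_FEATURES = {"x86-64", "sse4"}

# Bind the public name once, based on what the host CPU supports.
vadd = vadd_tuned if "sse4" in CPU_FEATURES else vadd_generic

print(vadd([1, 2, 3], [4, 5, 6]))  # [5, 7, 9]
```

    Callers only ever see `vadd`; which body runs is decided per machine, which is exactly why application code wouldn't need to change if the CPU vendor did.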
  • Reply 198 of 395
    Quote:
    Originally Posted by melgross View Post


    BS. No matter what, AMD is considered to be a second tier supplier.



    Sure, if you want a $400 piece of junk, get an AMD machine.



    But Apple will sell a $600-$800 crap desktop with a slow, small HDD, low-end onboard video, and low RAM, plus a LAPTOP CPU, at a price where you can get a good desktop CPU + a good video card.
  • Reply 199 of 395
    solipsism Posts: 25,726, member
    Quote:
    Originally Posted by Joe The Dragon View Post


    But Apple will sell a $600-$800 crap desktop with a slow, small HDD, low-end onboard video, and low RAM, plus a LAPTOP CPU, at a price where you can get a good desktop CPU + a good video card.



    It's funny: your comment implies that notebook-grade components should be less expensive than their desktop-grade counterparts. Note that in computing, when you go smaller, denser, and more power-efficient, you end up paying a premium for parts. This is why no laptop is ever both less expensive and more powerful than a desktop.
  • Reply 200 of 395
    hmurchison Posts: 12,425, member
    Quote:
    Originally Posted by Joe The Dragon View Post


    But Apple will sell a $600-$800 crap desktop with a slow, small HDD, low-end onboard video, and low RAM, plus a LAPTOP CPU, at a price where you can get a good desktop CPU + a good video card.



    But not running OS X, Joe. You continue to be awash in PC ideology without regard for, or understanding of, why people buy computers in the first place. For task completion, software is the most important asset to the bottom line, which is why Apple is valued more highly than every other computer hardware manufacturer in the US. The lone tech company ahead of them rose to power with software.