CONFIRMED: Apple will NOT use AMD in the near future


Comments

  • Reply 81 of 103
    amorph Posts: 7,112 member
    [quote]Originally posted by MacLuv:

    <strong>



    Uh, yeah. Since you like bashing me for no reason just do me a favor and read my posts before slamming what I say, as I did clearly state I was talking about CONFIRMED.</strong><hr></blockquote>



    Just so you know: In Future Hardware, 'CONFIRMED' has a long history of being used sarcastically, or as an intentional exaggeration. You'll make much more sense of the posts here if you put a mental wink after every occurrence of the word, especially if it's in all caps, and double-especially if it's followed by exclamation points.



    I agree that the legal obligation argument is overstated. Companies deny mergers and buyouts as a matter of course, right up until the press release announcing the buyout. The same companies usually make a raftload of promises about what will and what will not change about the purchased company, and most of these promises never materialize. Look at the MS buyout of Bungie for a particularly vivid example.



    The escape hatch all companies have is that anything that isn't a printed document with someone's John Hancock on it is a "forward-looking statement," and as such it's implicit that the company could change tack abruptly in response to market conditions (a necessary ability in the fast-paced, competitive world every CEO likes to invoke, natch). This gives companies the ability to make promises they have no intention of keeping and blame unforeseen complications and developments for the shift to the strategy that, in fact, they had planned from the get-go. Plausible deniability is a wonderful thing.



    [ 12-05-2002: Message edited by: Amorph ]
  • Reply 82 of 103
    [quote]Originally posted by Amorph:

    <strong>

    Look at the MS buyout of Bungie for a particularly vivid example.

    </strong><hr></blockquote>

    Halo? Halo...? Bueller?

    [quote]Originally posted by Amorph:

    <strong>

    Plausible deniability is a wonderful thing.

    </strong><hr></blockquote>

    "I never had sexual relations with her..." Yes, but sometimes it gets taken to an extreme. Plausible deniability works in this case, but it's still a bit of a stretch - Ruiz is a CEO, and with all the recent hubbub about CEOs knowing what's going on in their companies, it's hard to believe he could claim not to know whether Apple is formally looking into using AMD products. Still, if Apple contacted someone lower on the chain (a VP of engineering, or something) about the feasibility of getting AMDs in Macs, it's "plausible" that Ruiz hasn't been "officially" informed about any deals between the two companies, and thus he can reasonably say he doesn't know anything about Apple looking into using AMD CPUs. But if an internal memorandum or email pops up showing otherwise, he could get himself in trouble.



    [Edit: changed "deny it" to read "reasonably say he doesn't know anything about Apple looking into using AMD CPUs." That should convey the meaning more directly.]



    [ 12-05-2002: Message edited by: CryhavocQ ]
  • Reply 83 of 103
    macluv Posts: 261 member
    As far as plausible deniability goes:



    Ruiz hasn't denied anything.



    As for "Confirmed":



    If everything is an inside joke to the regulars at AI, why are we debating anything at all?
  • Reply 84 of 103
    [quote]Originally posted by MacLuv:

    <strong>As far as plausible deniability goes:



    Ruiz hasn't denied anything.

    </strong><hr></blockquote>



    Ok, sure, he hasn't denied anything because no one has said that Apple and AMD are working together (a denial has to be sparked). However, he did answer a question by saying that he has no indication that Apple is thinking about using an AMD CPU in their [Apple's] product:

    [quote]Ruiz: <strong>First of all, I have no indication that Apple is even considering what we make.</strong><hr></blockquote>



    Plausible deniability does not necessarily imply that Ruiz is denying any collaboration. PD could mean that he does in fact know whether or not Apple is looking into using AMD products, but that he is removed enough from the process to "plausibly" claim not to know about it (or in this case, not to have any "indication," whatever that means). It doesn't mean he has to deny anything.



    Let me ask you straight out: In your opinion, what does Ruiz's statement actually mean?



    I think you might be saying that Ruiz's statement means absolutely nothing, and that no meaning or intention can be interpreted from it... Is that correct?
  • Reply 85 of 103
    snoopy Posts: 1,901 member
    A couple of these threads seem to prove what a strong hold marketing can have over the human mind. I'm sure many are persuaded that a processor's core performance is the most important factor, especially clock rate - it's no longer the MHz race, just a GHz race. Well, it has produced a lot of good replies and information, and caused me to consider a couple of things.



    There really is a limit to how fast a single core can go, and to how much performance can be squeezed out with techniques like SMT. It looks like Intel is determined to discover just where that limit is and get as close to it as possible. That's one approach. It appears that nothing else will satisfy those who are completely sold on the current marketing hype.



    However, it looks like IBM has a different approach. Single core performance is just one factor in computer performance. Why push the core beyond the point of diminishing returns, where single core performance begins to cost more and more for less and less improvement? IBM appears to be saying it is better to get good performance from a core, while keeping cost and power at lower levels. In this approach, increased computer performance comes from having more cores.



    If we wish to argue, let it be about which limit will be reached first: will single core performance hit the wall before adding more cores becomes impractical, or will it be the other way around? Which approach will be most economical for high-end computers? I do believe some of this has been addressed already, but it will never satisfy those who are completely sold on core specs.



    Edit: Excuse my double posting to another thread, where it seems more appropriate. Also to clear up one issue -- to have more than one core, I also include dual, quad and more single core processors. I am not just referring to multi-core chips.



    [ 12-05-2002: Message edited by: snoopy ]
  • Reply 86 of 103
    nevyn Posts: 360 member
    [quote]Originally posted by snoopy:

    <strong>If we wish to argue, let it be about which limit will be reached first? Will single core performance hit the wall before adding more cores becomes impractical, or will it be the other way around?</strong><hr></blockquote>



    I'm not sure the 'limits' will be hardware limits so much as compiler limits. The Itanic's VLIW appears tough to target from typical x86-oriented code. The Itanium runs fine on code designed specifically for it, but it isn't clear how much hair-pulling went into the optimization.



    On the dual (or higher) core path, you are running independent threads on the two cores - each can be optimized without worrying about anything else. I have no idea what OmniWeb is doing in the 42 threads it has going right now - but with 16 cores/CPUs, or wherever we end up down that path, I don't need to. The work of separating things into threads was done well in advance of the code hitting the cores, and balancing between them can happen relatively quickly. Hyperthreading is essentially pretending to have more 'cores', too....



    The 'multiple cores' business is a bit wild in the first place. The 970 'single core' has 2 FPUs... A 970 dual-core would have 4 FPUs... how different is that, really, from simply saying 'this here chip is single core and it has 4 FPUs'? The only real external difference seems to be that one 'core' can be defective and the chip can still be sold. A 'single core CPU with 5 defective logic units' probably doesn't market as well.
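    The "independent threads, balanced across however many cores you have" idea above can be sketched in a few lines of Python. This is just an illustrative toy (the worker-pool shape and the word-counting task are made up for the example); the point is that the program is split into threads up front, and the OS scheduler is free to place them on 2, 4, or 16 cores without the code changing:

```python
import threading
from queue import Queue

def worker(tasks: Queue, results: Queue) -> None:
    # Each worker runs independently of the others; the OS may schedule
    # workers on separate cores (or on one hyperthreaded core) as it sees fit.
    while True:
        item = tasks.get()
        if item is None:          # sentinel: no more work for this worker
            break
        results.put((item, len(item.split())))  # toy task: count words

tasks, results = Queue(), Queue()

# The thread count is independent of the core count - the same code
# runs unchanged whether the machine has 1 core or 16.
threads = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(4)]
for t in threads:
    t.start()

for line in ["plausible deniability", "more cores not more GHz"]:
    tasks.put(line)
for _ in threads:
    tasks.put(None)               # one sentinel per worker
for t in threads:
    t.join()

counts = dict(results.queue)
```

    The balancing nevyn mentions falls out of the shared queue: whichever worker is idle grabs the next task, so no thread needs to know how many cores exist.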
  • Reply 87 of 103
    brendon Posts: 642 member
    [quote]Originally posted by MacLuv:

    <strong>Programmer--



    There's something I want to point out so I snipped this part of your post:



    Originally posted by Programmer:

    Do people really need faster PCs? For a great many of them the answer is no.





    This is definitely refutable. It depends on who you're talking to. And from a marketing perspective, speed is very important.



    Idealism: If everybody just realized that we don't really need this performance increase then everybody would be happy.



    Reality: If Apple doesn't offer faster performance from its machines, Apple will lose sales to those who want faster performance.



    Who wants faster performance: Anyone who wants faster performance.



    Why is this important: Duh.</strong><hr></blockquote>



    Is English your first language??



    Please read the snippet again SLOWLY; if you can see the error of your ways then you can understand English.
  • Reply 88 of 103
    macluv Posts: 261 member
    [quote]Originally posted by Brendon:

    <strong>



    Is English your first language??



    Please read the snippet again SLOWLY; if you can see the error of your ways then you can understand English.</strong><hr></blockquote>



    Should I read it as slowly as they do in Indiana?





    [Laughing]
  • Reply 89 of 103
    matsu Posts: 6,558 member
    There are too many good reasons why any platform switch would not be to AMD processors - if we're even still talking about that. If not, IBL.
  • Reply 90 of 103
    [quote]Originally posted by snoopy:

    <strong>

    If we wish to argue, let it be about which limit will be reached first? Will single core performance hit the wall before adding more cores becomes impractical, or will it be the other way around? Which approach will be most economical for getting high end computers? I do believe some of this has been addressed already, but it will never satisfy those who are completely sold on core specs.

    </strong><hr></blockquote>



    Well, personally, I think the future will bring neural net processors, or maybe bio-neural gel-packs. [Laughing]



    Seriously, though, given the current trend in cognitive science whereby the mind-brain is being ever more closely modelled by massively parallel distributed processing, I'd say that's where we'll be going. But not 4-, 16-, 32-, or even 64-way computing. I'm talking massively parallel with ridiculous redundancies.



    The human brain isn't particularly fast. It seems to be so good at what it does because each neuron tends to act as a separate "processing unit" (not exactly a CPU) that handles its own specific threads - that's sort of like handing off specific tasks to DSPs....
  • Reply 91 of 103
    eskimo Posts: 474 member
    [quote]Darwin/x86 only runs on Intel processors, not AMDs. And more tellingly, it came out in the last round of the antitrust trial that Ruiz had basically agreed to testify on behalf of Microsoft in exchange for Gates allowing Windows to run on the new Athlons.<hr></blockquote>



    Sorry to come into this thread late, but Amorph, you are completely mistaken on this point. The AthlonXP was and is 100% compatible with the x86 ISA, as well as with OSes such as Windows XP. Jerry Sanders (not Hector Ruiz) agreed to testify on behalf of Microsoft because it is AMD's position that, given the incredible amount of development it takes to make a complex microprocessor compatible with an OS, it would benefit the industry and consumers to have a single desktop standard such as Microsoft Windows. Rumor sites have purported that Jerry did this in return for special consideration by Microsoft; however, that has never been proven. AMD didn't want Microsoft to make their OS compatible; what they wanted was for Microsoft to allow them to market the AthlonXP and Windows XP together and to include certain optimizations in the OS that let AMD state truthfully that Windows XP was optimized to run faster on AthlonXP processor based computer systems.



    As for Hector's statement, it should be noted that the context of his comment was a direct response to Apple's consideration of using AMD's 64-bit solution. Apple maintains a healthy relationship with AMD and uses several AMD chips inside its various product lines.
  • Reply 92 of 103
    amorph Posts: 7,112 member
    I was wondering when you'd wander into this thread.



    [quote]Originally posted by Eskimo:

    <strong>



    Sorry to come into this thread late, but Amorph, you are completely mistaken on this point. The AthlonXP was and is 100% compatible with the x86 ISA, as well as with OSes such as Windows XP. Jerry Sanders (not Hector Ruiz) agreed to testify on behalf of Microsoft because it is AMD's position that, given the incredible amount of development it takes to make a complex microprocessor compatible with an OS, it would benefit the industry and consumers to have a single desktop standard such as Microsoft Windows. Rumor sites have purported that Jerry did this in return for special consideration by Microsoft; however, that has never been proven. AMD didn't want Microsoft to make their OS compatible; what they wanted was for Microsoft to allow them to market the AthlonXP and Windows XP together and to include certain optimizations in the OS that let AMD state truthfully that Windows XP was optimized to run faster on AthlonXP processor based computer systems.</strong><hr></blockquote>



    Thanks for the detailed correction. However, there's not much in the way of rumor here: It's a matter of record that Gates called, and in the same conversation that contained the request for testimony, he mentioned Windows working (optimally, I suppose) with the Athlon XP. This isn't an obvious tit-for-tat, but the possibility is certainly there. Especially considering Gates.



    If AMD's position is actually that a single OS would make chip development easier, they must really hate all the Linux people running their chips - can't have people demonstrating that the OS can be tailored for the processor, as well as the other way 'round.



    [quote]<strong>As for Hector's statement, it should be noted that the context of his comment was a direct response to Apple's consideration of using AMD's 64-bit solution. Apple maintains a healthy relationship with AMD and uses several AMD chips inside its various product lines.</strong><hr></blockquote>



    I was aware of this, but the confirmation is welcome.
  • Reply 93 of 103
    rickag Posts: 1,626 member
    [quote]Originally posted by Eskimo:

    <strong>

    Apple maintains a healthy relationship with AMD and uses several AMD chips inside its various product lines.</strong><hr></blockquote>



    MacLuv, now that that's settled, and Apple is already using AMD chips, you should be satisfied.







    Let's move on to more entertaining rumours.