
AMD chief says Apple will eventually use AMD chips - Page 3

post #81 of 160
Quote:
Originally Posted by melgross

Yup. The Core, and Core 2 series of chips are wildly clockable. So clockable that many are wondering why Intel isn't running them at higher rates as standard.

Simple. Think about our new buzzwords for this year: Performance-per-watt. And performance-per-watt-per-dollar.

Intel has spec'd the Yonah, Merom, Conroe, Allendale and Woodcrest all within specified thermal envelopes. They've settled on working to CPU multipliers off the base 266MHz bus speed -- so they've got this bunch of CPUs with their clock speeds chosen for their performance-per-watt-dominating quality.

As it happens, we're lucky that with the Core 2 Conroe/Allendale, the 65nm headroom and the general nature of the CPU add up to huge overclockability. The thing is that Intel can't clock the existing lineup much higher as it is, because once you increase the CPU frequency, the power consumption clearly goes up.

Quote:
Originally Posted by melgross

I can only think that Intel will release them in higher clocked versions by the end of the year, when AMD has their newer kit out in fair numbers. That would damp AMD's introductions of their 65nm versions.

Higher clocked versions, or more powerful in benchmarks, at same or lower thermal envelopes. It's the performance-per-watt game...
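To put rough numbers on the buzzwords, here's a trivial sketch of how you'd rank parts on those two metrics -- all the figures below are made up, purely for illustration:

Code:
# Hypothetical parts: (name, benchmark score, TDP in watts, price in USD).
# None of these numbers correspond to real chips.
parts = [
    ("Chip A", 1000, 65, 300),
    ("Chip B", 1400, 95, 350),
    ("Chip C", 900, 35, 250),
]

for name, score, watts, dollars in parts:
    ppw = score / watts          # performance per watt
    ppwpd = ppw / dollars        # performance per watt per dollar
    print(f"{name}: {ppw:.1f} pts/W, {ppwpd:.4f} pts/W/$")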
post #82 of 160
Quote:
Originally Posted by sunilraman

Simple. Think about our new buzzwords for this year: Performance-per-watt. And performance-per-watt-per-dollar.

Intel has spec'd the Yonah, Merom, Conroe, Allendale and Woodcrest all within specified thermal envelopes. They've settled on working to CPU multipliers off the base 266MHz bus speed -- so they've got this bunch of CPUs with their clock speeds chosen for their performance-per-watt-dominating quality.

As it happens, we're lucky that with the Core 2 Conroe/Allendale, the 65nm headroom and the general nature of the CPU add up to huge overclockability. The thing is that Intel can't clock the existing lineup much higher as it is, because once you increase the CPU frequency, the power consumption clearly goes up.



Higher clocked versions, or more powerful in benchmarks, at same or lower thermal envelopes. It's the performance-per-watt game...

The power goes up rapidly on all chips when the freq. is raised. The voltage has to be raised, and the current rises faster. Can't help that.

But, watch, Intel will raise the clock when AMD comes out with their 65nm versions near the end of the year.
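As a first-order sketch of why (dynamic power only, leakage ignored, and all numbers made up for illustration):

Code:
# Dynamic power goes roughly as P = C * V^2 * f, and raising f usually
# means raising V as well, so power climbs much faster than clock speed.
C = 1.0e-9                 # effective switched capacitance, arbitrary units
V0, F0 = 1.2, 2.4e9        # baseline voltage (V) and clock (Hz), illustrative

def dyn_power(volts, freq):
    return C * volts * volts * freq

base = dyn_power(V0, F0)
overclock = dyn_power(V0 * 1.10, F0 * 1.25)   # +25% clock needing ~10% more volts
print(f"clock +25% -> power +{(overclock / base - 1) * 100:.0f}%")   # roughly +51%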
post #83 of 160
Quote:
Originally Posted by sunilraman

PPFT. English schoolboys are brats, plain and simple. And they don't grow up until they're 40+. I'm just biased about the Brits. But I need to make a journey to the MotherLand one day. God save da Queen yo.

Well, being over here myself I reckon you're just about right.

If you do make the trip someday, expect to pay twice as much for everything (because we're all dolts and just do without complaint) and keep the Ali G impression for the most absurd possible moment. That stuff gets a lot of laughs around here, what with the high population of grown up schoolboys.
post #84 of 160
Quote:
Originally Posted by sunilraman

Just to throw in a bit of the PC Gamer perspective (I know, they're a small market segment) they're all wetting themselves looking to get Intel Conroes on solid overclockable, stable motherboards. PC Gamers are definitely in strong transition now from AMD to Intel. Only the die hard AMD fanboys are holding fast to the AMD FX's.

Good point. Something I've always wondered is which is larger: the pc gamer segment or the Mac market?

Gamers by and large spend a lot on software (games) and hardware (mobos, cpus and gpus) to keep their gamer karma in good stead.

Mac users tend to spend a fair bit on software (Mac apps sell quite strongly compared to their Windows equivalents as many Mac devs know) and hardware (all our credit card are belong to Apple, plus the odd penny for Crucial etc.) to ... maintain our Mac karma and more to the point get our work done too!

Both are minorities, for sure. But both are well catered for because hardware conscious gamers and the sort of alpha Mac users like us who attend such sites as these, simply spend more per head than the average consumers out there and actually take an interest in our machines.

My conclusion is that both segments are piling into Intel now. The former from AMD and the Athlon 64, the latter from PPC. Intel deserves some cred for offering the best option for both groups simultaneously. (Super clocking for gamers and sweet, sweet silence for us on the Mac.) I'm sure Otellini is lying back behind his super villain spread of Cinema Displays stroking a cat and plotting with a grin on his face.

I wish all the best for AMD in the long term, as their competition with Intel is what has led the x86 platform far beyond IBM and Moto's lacklustre stewardship of PPC, which in theory should have been the leader given good care. Competition is what will carry our Macs along with the rest of the industry forward. So don't write AMD off. At least, I guess, until the present roadmaps have panned out and Intel hits some snag with its currently amazing architecture... AMD have the power to design great chips from the ground up. They really need to do that again now to handle the Core family going into the future.

PS: Apple would have some rebranding to do re: AMD chips. In the murky future, they're going to have to stop calling x86 "Intel" all the damn time. Unless AMD pull out a trademark and we all decide it's become generic. If Creative invented the iPod, I can't see the problem with this.
post #85 of 160
Quote:
Originally Posted by melgross

The power goes up rapidly on all chips when the freq. is raised. The voltage has to be raised, and the current rises faster. Can't help that.

But, watch, Intel will raise the clock when AMD comes out with their 65nm versions near the end of the year.

Yup, exactly. It seems that they had specific performance targets within thermal envelope targets. It happens that there's all this enormous headroom if you're not obsessed with power draw and decibels.

Agreed, when AMD brings out their 65nm stuff, Intel will have higher-clocked gear still within low thermal envelopes. I can't imagine clock speeds going DOWN from this point onwards; I really think the next stage (Core 3, say) will be 3GHz and upwards. But maybe I'm still locked into the MHz race, especially due to influence from visiting all those overclocking websites.
post #86 of 160
Quote:
Originally Posted by fuyutsuki

Good point. Something I've always wondered is which is larger: the pc gamer segment or the Mac market?

Heh 8) Very interesting. I think the PC gamer market is somewhat bigger, say 6-8%, versus our Macintosh's 4.x%.

Quote:
Originally Posted by fuyutsuki

...My conclusion is that both segments [Gamers and Mac] are piling into Intel now. The former from AMD and the Athlon 64, the latter from PPC. Intel deserves some cred for offering the best option for both groups simultaneously. (Super clocking for gamers and sweet, sweet silence for us on the Mac.) I'm sure Otellini is lying back behind his super villain spread of Cinema Displays stroking a cat and plotting with a grin on his face.

Heh... great imagery re: super villain spread of CinemaDisplays ... Like the super bad guy in Inspector Gadget... with the cat. Muahha hahhaha h a.

Quote:
Originally Posted by fuyutsuki

I wish all the best for AMD in the long term, as their competition with Intel is what has led the x86 platform far beyond IBM and Moto's lacklustre stewardship of PPC, which in theory should have been the leader given good care. Competition is what will carry our Macs along with the rest of the industry forward. So don't write AMD off. At least, I guess, until the present roadmaps have panned out and Intel hits some snag with its currently amazing architecture... AMD have the power to design great chips from the ground up. They really need to do that again now to handle the Core family going into the future.

Despite my burning desire to get a Conroe and overclock the f*k out of it, I still acknowledge and admire what AMD has done up to this point. Certainly they're on the back foot for a year or more going forward, but isn't that sometimes exactly when come-from-behind victories happen? Heh.

The key to Intel's success in regaining the CPU crown is going down to 65nm and 45nm. That gave them the jump on everyone. "Hitting the wall at 90nm" was a pretty catastrophic scenario in CPU-land. IBM/Moto couldn't hack it; AMD managed to, and still do, produce some nice stuff at 90nm with decent clocks and thermal envelopes in their current range.

It was clear for a few years IBM/Freescale would not be able to pull 65nm in any reasonable amount of time to save Apple.

Aside from the Core Microarchitecture and other chip designer-y stuff, isn't it the case that they said the way to keep Moore's Law going is to go down to 65nm and onwards to 45nm?

Beyond 45nm, I wonder what's on the horizon. And WTF happened to the promise of optical computing? Shuffling photons around could be much cooler (literally and figuratively).

[Side Geek Note] Apparently somewhere in Star Trek: Next Gen they talk about the computers: imagine that instead of electrons flying about you have photons or subatomic particles or something moving about, and not only in real space (not fast enough) -- it moves through "subspace" (the standard term for anything faster-than-light in the Star Trek universe).
post #87 of 160
Quote:
Originally Posted by melgross

AMD had some pretty bad chips out there for years. They weren't shunned simply because Intel didn't want anyone using them.

I'm not sure what you mean. Some of the older Athlons ran pretty hot, but my understanding is that that was nothing in comparison to Prescott & Nocona. The K6-II and K6-III were pokey, but they weren't unreliable. I even had an AMD 386 that worked fine. All the problems I had with AMDs were when they were mated to boards with VIA or SiS chipsets. Granted, I haven't owned many AMDs; since '98 I've been mostly using second-hand or refurbished workstations, be they Alpha, Xeon or PMG5.
post #88 of 160
Quote:
Originally Posted by melgross

IF, and that is a big if, Apple ever moves to four socket, or higher, servers, it might be worth considering. But Apple has shown no inclination to do so.

Didn't Tulsa beat those 8xx Opterons pretty good? Granted, most of it seems to be thanks to the 64 MB cache on the IBM chipset, but still. What could these chips do with HT and the integrated memory controller... NetBurst must have been severely handicapped by latencies and bandwidth.
post #89 of 160
Quote:
Originally Posted by fuyutsuki

Well, being over here myself I reckon you're just about right.

If you do make the trip someday, expect to pay twice as much for everything (because we're all dolts and just do without complaint) and keep the Ali G impression for the most absurd possible moment. That stuff gets a lot of laughs around here, what with the high population of grown up schoolboys.

Heh. I'll keep it in mind for when I eventually, if ever, make the trip to the UK. 8)
post #90 of 160
Quote:
Originally Posted by melgross

Heh. What we don't know, we don't, well, you know.

I was aware of every bit of it. Inaccurate statements are the source of many truly false claims.
What's the frequency, Kenneth?
post #91 of 160
Quote:
Originally Posted by gregmightdothat

How so?

Let's recap...

When Windows 95 came out, it was miles ahead of Mac OS (System 7.1 at the time). Protected memory and cooperative multitasking [edit: should be preemptive multitasking] to name a couple important features lacking in Mac OS at the time. I won't try to argue about which one was more stable (that's like comparing rotten apples to rotten oranges -- they both make you sick).

Over on the hardware side, PCs were offering much more for much less than the price of a Mac. Hence the reason why Apple toyed with the idea of "clones" briefly (since it had worked so well on the IBM PC side).

Apple didn't have anything in the pipeline (at the time) which would bring them back on par with the Wintel world (very much like the position AMD is in now). If it weren't for the deep coffers built up in the 80s, they'd have been dead in the water.

Of course, we all know how the next chapter of the story has turned out for Apple, which is why I'd never count out AMD.
 
post #92 of 160
Quote:
Originally Posted by auxio

Let's recap...

When Windows 95 came out, it was miles ahead of Mac OS (System 7.1 at the time).

Windows 95 came out in late 1995; System 7.5 came out in late 1994.

Quote:
Protected memory and cooperative multitasking to name a couple important features lacking in Mac OS at the time.

You mean preemptive multitasking. Mac OS had cooperative multitasking.

This is somewhat true, but the implementation in Windows wasn't exactly great, and Microsoft wanted to move to NT to make it better. They failed multiple times. It wasn't until XP that they succeeded in merging the consumer and NT lines.
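To make the cooperative vs. preemptive distinction concrete, here's a toy scheduler sketch -- purely illustrative, and nothing to do with how either OS was actually implemented:

Code:
# Cooperative multitasking: each task runs until it *chooses* to yield.
# A task that never yields starves everything else; a preemptive kernel
# would interrupt it on a timer tick instead of waiting politely.
def dialer():
    for step in ["init modem", "dial", "negotiate PPP"]:
        print("dialer:", step)
        yield                      # cooperates: hands control back

def greedy_app():
    while True:                    # spins forever without yielding, so the
        pass                       # scheduler below would never run again
    yield                          # (unreachable; just makes this a generator)

tasks = [dialer()]                 # put greedy_app() in here and the dialer starves
while tasks:
    task = tasks.pop(0)
    try:
        next(task)                 # run the task until its next yield
        tasks.append(task)
    except StopIteration:
        pass                       # task finished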

Quote:
Over on the hardware side, PCs were offering much more for much less than the price of a Mac.

That's a pointless assertion. You can't prove it either way.
post #93 of 160
Quote:
Originally Posted by auxio

Let's recap...

When Windows 95 came out, it was miles ahead of Mac OS (System 7.1 at the time). Protected memory and cooperative multitasking to name a couple important features lacking in Mac OS at the time. I won't try to argue about which one was more stable (that's like comparing rotten apples to rotten oranges -- they both make you sick).

Over on the hardware side, PCs were offering much more for much less than the price of a Mac. Hence the reason why Apple toyed with the idea of "clones" briefly (since it had worked so well on the IBM PC side).

Apple didn't have anything in the pipeline (at the time) which would bring them back on par with the Wintel world (very much like the position AMD is in now). If it weren't for the deep coffers built up in the 80s, they'd have been dead in the water.

Of course, we all know how the next chapter of the story has turned out for Apple, which is why I'd never count out AMD.

Oh ok, I had no idea where you were going with that analogy at first :P
post #94 of 160
Quote:
Originally Posted by sunilraman

Oh, and Paul Otellini is so in Steve's pocket -- Paul's virtually giving Intel chips away to Apple. Paul's like, here, take it man, take it alll..... Frack those Euro bastards* AMD..!!!

*AMD is European right? Or worse, now, European-Canadian with AMD-ATI

And Otellini is such an American sounding last name. Comes to mind right beside Smith and Jones...

And I can't remember where Intel CPUs are fabbed again... could you check yours for me?
 
post #95 of 160
Quote:
Originally Posted by Chucker

Windows 95 came out in late 1995; System 7.5 came out in late 1994.

I guess the timeline on Wikipedia is wrong then... maybe you should submit a correction to the article. I wasn't using a Mac at the time, so I don't know the exact dates. Regardless, Mac OS was still a historical footnote in any operating systems textbook at the time.

Quote:
Originally Posted by Chucker

You mean preemptive multitasking. Mac OS had cooperative multitasking.

Right, sorry, you got me. That's what happens when I skim through a Wikipedia article and post too fast....

Quote:
Originally Posted by Chucker

This is somewhat true, but the implementation in Windows wasn't exactly great, and Microsoft wanted to move to NT to make it better. They failed multiple times. It wasn't until XP that they succeeded in merging the consumer and NT lines.

Sure. But at least it was there and working to some extent (ie. running more than one major app at a time wasn't quite as much of a gamble on the PC side).

Quote:
Originally Posted by Chucker

That's a pointless assertion. You can't prove it either way.

I remember shopping for a computer at the time and I found that Macs were at least $1000 more than an equivalent PC. Sure, you can't directly compare the Motorola CPUs to the Intel CPUs (the slippery argument Apple fanboys love to use), but you also can't argue that the PC I got for $1000 less couldn't do everything I needed to do at the time as well as the Mac could. Sure it wouldn't play Marathon, but it did play Leisure Suit Larry pretty well.
 
post #96 of 160
Quote:
Originally Posted by auxio

re: preemptive vs cooperative multitasking:
Sure. But at least it was there and working to some extent.

I think Windows 95 worked better in this regard, but I still switched to NT before too long.

I remember going to a Mac user's home around that time. When the computer was dialing in to the Internet, he told me not to click anywhere else or else the dialing might fail. That didn't leave a good impression on me.
post #97 of 160
Quote:
Originally Posted by JeffDM

I think Windows 95 worked better in this regard, but I still switched to NT before too long.

Sure, but how many games could you play under NT at the time?

If you want to move out of the realm of consumer OSes, then Linux also had a very good preemptive multitasking implementation at the time. As did many other UNIX-based OSes (NT was based on the Mach microkernel, which had been used in flavors of BSD UNIX since the late 80s).

Quote:
Originally Posted by JeffDM

I remember going to a Mac user's home around that time. When the computer was dialing in to the Internet, he told me not to click anywhere else or else the dialing might fail. That didn't leave a good impression on me.

I helped my wife (girlfriend at the time) set up her Mac around 1997. After experiencing a lot of crashes, we finally learned that running too many programs at once was simply a no-no, and that you had to actually set the maximum amount of memory large applications could use (either that or buy ludicrous amounts of RAM). And remember to "rebuild your desktop" every so often -- oh, and if all else fails, try "zapping the PRAM".
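For anyone who never fought with it: every classic Mac app got a fixed memory partition (the number you set in Get Info), so one app could run out of memory while the machine still had plenty free. Conceptually it behaved something like this -- not real APIs, just the idea:

Code:
# Fixed per-app partitions: allocations beyond an app's partition fail even
# though the machine as a whole still has free RAM. Numbers are illustrative.
partition = {"Photoshop": 8192, "Word": 4096}   # per-app limits in KB (Get Info)
used = {"Photoshop": 0, "Word": 0}

def app_alloc(app, kb):
    if used[app] + kb > partition[app]:
        raise MemoryError(f"{app}: out of memory (partition full)")
    used[app] += kb

app_alloc("Photoshop", 8000)
try:
    app_alloc("Photoshop", 500)     # fails: the partition is exhausted...
except MemoryError as err:
    print(err)                      # ...even with RAM free elsewhere in the machine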
 
post #98 of 160
Quote:
Originally Posted by auxio

I guess the timeline on Wikipedia is wrong then... maybe you should submit a correction to the article. I wasn't using a Mac at the time, so I don't know the exact dates. Regardless, Mac OS was still a historical footnote in any operating systems textbook at the time.

The System 7 version history section on Wikipedia says that 7.1.2P was released in July 1994, and 7.5.1 in March 1995. That puts 7.5 somewhere in between; IIRC, it was released in late '94, but maybe it slipped until January or something. For whatever reason, Wikipedia does not give a date, or even a month, for it.

Wikipedia's Windows 95 page gives August 24, 1995, which sounds right to me. That definitely puts it after 7.5.

Quote:
Sure. But at least it was there and working to some extent (ie. running more than one major app at a time wasn't quite as much of a gamble on the PC side).

Yes, that's true, but in my experience, BSoDs were a much larger problem on the Windows side than system crashes were on the Mac side.

YMMV, of course.

Neither OS's foundation was particularly future-proof. Neither company was happy with the architecture. Apple experimented with A/UX, NuKernel, Copland, Pink and other projects at the time, and Microsoft with NT, and Apple even briefly considered moving to NT as well.

Quote:
I remember shopping for a computer at the time and I found that Macs were at least $1000 more than an equivalent PC.

What does "equivalent" even mean?

Quote:
Sure, you can't directly compare the Motorola CPUs to the Intel CPUs (the slippery argument Apple fanboys love to use), but you also can't argue that the PC I got for $1000 less couldn't do everything I needed to do at the time as well as the Mac could. Sure it wouldn't play Marathon, but it did play Leisure Suit Larry pretty well.

But a $100 computer can also "do everything you needed to do at the time". The difference is whether it's comfortable to work with.
post #99 of 160
Quote:
Originally Posted by auxio

Sure, but how many games could you play under NT at the time?

If you want to move out of the realm of consumer OSes, then Linux also had a very good preemptive multitasking implementation at the time. As did many other UNIX-based OSes (NT was based on the Mach microkernel, which had been used in flavors of BSD UNIX since the late 80s).

Some games worked, some didn't. It really wasn't a concern. It helped me cut back on the time wasted playing games, and the money spent constantly upgrading to play them. I think the much-improved stability and reliability was more than worth the trade-off. The difference compared to the UNIX variants was that I could run all of my productivity software. Most of the standard programs didn't have an adequate counterpart on Linux/BSD/etc., and I really didn't think they made very good desktop OSes. Other than games, NT made a fine desktop OS at the time, even if it was more of a server/workstation OS.
post #100 of 160
Quote:
Originally Posted by auxio

Apple didn't have anything in the pipeline (at the time) which would bring them back on par with the Wintel world (very much like the position AMD is in now). If it weren't for the deep coffers built up in the 80s, they'd have been dead in the water.

Of course, we all know how the next chapter of the story has turned out for Apple, which is why I'd never count out AMD.

Nicely put. Back in the period in question I was an AMD fanboy through and through, and that kind of tale certainly pulls the heartstrings. The TEXAN chip maker (they're as German as Apple are Taiwanese) then finally pulled off a tour de force with the Athlon at the end of the '90s, and I still have my original one of those, which I blew a good iMac's worth on at the time! So yes, Apple, AMD, and in fact every company worth its salt are good things for the competition they drive and the increase in the number of labs and engineers who craft tomorrow's finest mouthwatering new kit. That said, I'm still totally geared up for an Intel Mac once the financial gods remember me!

Motorola G4 all the way until then. It's not that bad really.
post #101 of 160
Quote:
Originally Posted by Chucker

The System 7 version history section on Wikipedia says that 7.1.2P was released in July 1994, and 7.5.1 in March 1995. That puts 7.5 somewhere in between; IIRC, it was released in late '94, but maybe it slipped until January or something. For whatever reason, Wikipedia does not give a date, or even a month, for it.

Wikipedia's Windows 95 page gives August 24, 1995, which sounds right to me. That definitely puts it after 7.5.

And 7.5 somehow made Mac OS leap ahead into the current state of OS technology? That was my real argument. Mac OS was 1980s technology until Mac OS X came out. Windows 95 was still head and shoulders better in many ways, even if it was still pretty bad.


Quote:
Yes, that's true, but in my experience, BSoDs were a much larger problem on the Windows side than system crashes were on the Mac side.

I can't remember how many times I saw the bomb/unhappy Mac on my wife's old Mac. It certainly felt like a lot more than I ever saw the BSoD when using Windows 95.

Quote:
Neither OS's foundation was particularly future-proof. Neither company was happy with the architecture. Apple experimented with A/UX, NuKernel, Copland, Pink and other projects at the time, and Microsoft with NT, and Apple even briefly considered moving to NT as well.

At least NT was a reality -- and there were plans in the pipeline to combine it with 95 (which essentially happened with Windows 2000 -- which I used happily for a few years before XP finally came out). Apple had nothing but vaporware in its pipeline at the time.

Quote:
But a $100 computer can also "do everything you needed to do at the time". The difference is whether it's comfortable to work with.

It worked fine for me when I was using Linux on it to learn software development at the time. I'd flip over to Win 95 to edit reports and such with MS Word and play a couple of games.

You can't argue that Macs were any easier to deal with. Maybe the GUI was a bit simpler to get the hang of, but you still had extension conflicts, rebuilding the desktop, zapping the PRAM, and other Applisms that users had to deal with (but had no idea what they were doing).

Perhaps some of the components were better than PCs (like the audio system and the Apple monitors), but I didn't need that at the time. I needed a computer to learn software development, telnet into my computer labs, email, and write a couple of reports in Word with. The PC I bought happily filled those needs and I learned to use it pretty well. I learned all the quirks of Windows 95 and Linux rather than paying $1000 more and learning all of the quirks of Mac OS.
 
post #102 of 160
Quote:
Originally Posted by auxio

I see the AMD vs Intel battle as being similar in a lot of ways to Apple vs Microsoft.

Except that Intel does a lot more research, has more advanced technology, and a more elegant roadmap.
Cat: the other white meat
post #103 of 160
Quote:
Originally Posted by auxio

And 7.5 somehow made Mac OS leap ahead into the current state of OS technology? That was my real argument. Mac OS was 1980s technology until Mac OS X came out. Windows 95 was still head and shoulders better in many ways, even if it was still pretty bad.

Windows 95 was way behind Mac OS in other areas, so what are you getting at?

Quote:
I can't remember how many times I saw the bomb/unhappy Mac on my wife's old Mac. It certainly felt like a lot more than I ever saw the BSoD when using Windows 95.

That's too bad. I had the complete opposite experience.

Quote:
At least NT was a reality -- and there were plans in the pipeline to combine it with 95 (which essentially happened with Windows 2000 -- which I used happily for a few years before XP finally came out). Apple had nothing but vaporware in its pipeline at the time.

No, it didn't happen until XP. 2000 was not a consumer OS. You can whine all you want about how you could use it at home, but it was never meant that way.

Quote:
Perhaps some of the components were better than PCs (like the audio system and the Apple monitors), but I didn't need that at the time.

You're twisting reality. Example? At the time, it wasn't unusual for a PC not to have a sound chip at all; you had to have a sound card on PCI. Built-in audio didn't become common until a few years later.

It also wasn't until Windows 98 that it supported multiple monitors, and even then only in a very buggy manner.

Quote:
I needed a computer to learn software development, telnet into my computer labs, email, and write a couple of reports in Word with. The PC I bought happily filled those needs and I learned to use it pretty well. I learned all the quirks of Windows 95 and Linux rather than paying $1000 more and learning all of the quirks of Mac OS.

Yeah, yeah.
post #104 of 160
Quote:
Originally Posted by Splinemodel

Except that Intel does a lot more research, has more advanced technology, and a more elegant roadmap.

True. I'm rather concerned about whether AMD can afford to follow Intel to 32nm, and Intel still has the NUMA, CSI and IMC cards to pull out if they can't do anything else. Though sometimes Intel just seems to go ahead on raw power, as in the case of Kentsfield and Clovertown. I can't see anything elegant in those implementations of a four-way core.

I still wonder how effective NetBurst could have been if one had eliminated the latencies and bandwidth obstructions. I'm still amazed at how Intel could increase the pipeline length of the Northwood by circa 50% and still retain the same performance at the same clock speeds. Those engineers are good.
post #105 of 160
Quote:
Originally Posted by jamezog

Perhaps... I can't see Apple offering both AMD and Intel at the same time - that's way too much like Dell's "customize it all" business model. I could definitely see Apple jumping ship, though, or at least threatening to jump ship in order to squeeze Intel a bit.

Meh... it's still good press (rumor-mongering) for Apple fiends.

Proud AAPL stock owner.

GOA
post #106 of 160
Quote:
Originally Posted by sunilraman

Yup, exactly. It seems that they had specific performance targets within thermal envelope targets. It happens that there's all this enormous headroom if you're not obsessed with power draw and decibels.

Agreed, when AMD brings out their 65nm stuff, Intel will have higher-clocked gear still within low thermal envelopes. I can't imagine clock speeds going DOWN from this point onwards; I really think the next stage (Core 3, say) will be 3GHz and upwards. But maybe I'm still locked into the MHz race, especially due to influence from visiting all those overclocking websites.

The MHz race, now the GHz race, is quite valid, as long as one is comparing oranges to oranges.

AMD K8 designs have better performance with higher clocked chips. So do Core 2 chips.

But one speed on a K8 doesn't compare directly to the same speed on a Core 2.

AMD used to name its chips after the clock rate an Intel chip would have needed to match their performance.

Of course, with the new chips, that doesn't work anymore. There are too many complexities in the new designs.
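Roughly, those old ratings worked like this -- the ratio below is made up for illustration, since the real scheme was marketing, not a published formula:

Code:
# A chip doing more work per clock gets "named" at the clock an Intel part
# would supposedly need to match it. The 1.45 ratio is illustrative only.
def performance_rating(actual_mhz, relative_work_per_clock):
    return actual_mhz * relative_work_per_clock

print(performance_rating(2200, 1.45))   # a 2200 MHz part sold as roughly a "3200+"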
post #107 of 160
It will be good to use AMD's next generation chipsets in the Mac. HyperTransport access to secondary devices, including a co-processor, will greatly boost processing power for specific tasks. If Apple were to put in a special processor/board for improving the performance of their multimedia/whatever by a large magnitude, it would benefit everyone. Intel is trying to come up with a similar offering, but AMD is ahead in this game.

http://www.anandtech.com/cpuchipsets...oc.aspx?i=2768
Most of us employ the Internet not to seek the best information, but rather to select information that confirms our prejudices. - Nicholas D. Kristof
post #108 of 160
Quote:
Originally Posted by melgross

The MHz race, now the GHz race, is quite valid, as long as one is comparing oranges to oranges.

AMD K8 designs have better performance with higher clocked chips. So do Core 2 chips.

But one speed on a K8 doesn't compare directly to the same speed on a Core 2.

AMD used to name its chips after the clock rate an Intel chip would have needed to match their performance.

Of course, with the new chips, that doesn't work anymore. There are too many complexities in the new designs.

Fair enough. But personally I want to see a 45nm 5GHz 50W TDP Core-based quad-core from Intel by the end of 2007.
post #109 of 160
Quote:
Originally Posted by talksense101

It will be good to use AMD's next generation chipsets in the Mac. .. Intel is trying to come up with a similar offering, but AMD is ahead in this game.

Sorry, but where have you been this past year? Intel's Core and Core2 on mobile and desktop platforms now clearly edge out AMD.

Intel's "similar offering" is here, the Core MicroArchitecture and other goodies now starting to be realised on Core2 and going forward...

AMD's next-generation 65nm stuff will compete with Intel's next-generation 45nm stuff in a year's time and going forward.
post #110 of 160
Quote:
Originally Posted by sunilraman

The key to Intel's success in regaining the CPU crown is going down to 65nm and 45nm. That gave them the jump on everyone. "Hitting the wall at 90nm" was a pretty catastrophic scenario in CPU-land. IBM/Moto couldn't hack it; AMD managed to, and still do, produce some nice stuff at 90nm with decent clocks and thermal envelopes in their current range.

None of them "hacked" it. AMD least of all. AMD was so far behind everyone else in moving to 90nm ( they just finished a little while ago), that they were able to take advantage of the solutions that both IBM and Intel had found.

AMD's thermals are pretty bad right now -- up to 125 watts. Right there with the old Intel chips, and well above any of the new Core and Core 2 designs.

Quote:
It was clear for a few years IBM/Freescale would not be able to pull 65nm in any reasonable amount of time to save Apple.

Both companies could have, if they wanted to. But neither did.

Quote:
Aside from the Core Microarchitecture and other chip designer-y stuff, isn't it the case that they said the way to keep Moore's Law going is to go down to 65nm and onwards to 45nm?

Beyond 45nm, I wonder what's on the horizon. And WTF happened to the promise of optical computing? Shuffling photons around could be much cooler (literally and figuratively).


Now we're getting into some VERY interesting stuff.

Each time they move to a smaller die shrink, they are going to encounter even greater thermal problems. It's a matter of physics. Intel is working on vertical transistors as a way of getting the same (or fairly close to the same) number of atoms into the gates. This is a tough road to travel. Other technologies are being worked on.

45nm will be attainable. But, after that it's a crapshoot. The best figuring at this time is that the smallest they can go with current thinking is somewhere between 32 and 20 nm. That's even with better materials and designs.

They used to think it was about 15 to 10 nm, but those thermal problems hadn't been considered. It was thought that the difficulty would be confined to being able to make masks at that size, and that would be the smallest they could go, even with the theoretical x-ray beam equipment that they had no idea how to produce.

But, now they know otherwise.

The leakage, and other problems which are even more daunting, increase geometrically, as the square (height x width) of the lines matters more than the width alone at these sizes.

There have been some major breakthroughs in optical computing. Just recently, a chip was produced in the lab that contains hundreds of lasers. This has been done before, but this is the first time it has been done on silicon. The holy grail.

But it will take some time before this can be used for actual computing purposes, and it is more useful for transmitting information between chips than for doing actual computations.

For example, it will decrease the cost of bringing optical fiber the last mile, and the last hundred feet.

It will also be instrumental in allowing supercomputers to increase their performance, and in providing an extremely high-speed link between them.

Shades of the Forbin Project with Colossus.

Quote:
[Side Geek Note] Apparently somewhere in Star Trek: Next Gen they talk about the computers: imagine that instead of electrons flying about you have photons or subatomic particles or something moving about, and not only in real space (not fast enough) -- it moves through "subspace" (the standard term for anything faster-than-light in the Star Trek universe).

Star Trek was cute, wasn't it?
post #111 of 160
Quote:
Originally Posted by Zandros

Didn't Tulsa beat those 8xx Opterons pretty good? Granted, most of it seems to be thanks to the 64 MB cache on the IBM chipset, but still. What could these chips do with HT and the integrated memory controller... NetBurst must have been severely handicapped by latencies and bandwidth.

According to more recent tests of the dual-core models, memory bandwidth becomes much more important once one goes to 4 and 8 sockets.
post #112 of 160
Quote:
Originally Posted by auxio

I guess the timeline on Wikipedia is wrong then... maybe you should submit a correction to the article. I wasn't using a Mac at the time, so I don't know the exact dates. Regardless, Mac OS was still a historical footnote in any operating systems textbook at the time.

Right, sorry, you got me. That's what happens when I skim through a Wikipedia article and post too fast....

Sure. But at least it was there and working to some extent (ie. running more than one major app at a time wasn't quite as much of a gamble on the PC side).

I remember shopping for a computer at the time and I found that Macs were at least $1000 more than an equivalent PC. Sure, you can't directly compare the Motorola CPUs to the Intel CPUs (the slippery argument Apple fanboys love to use), but you also can't argue that the PC I got for $1000 less couldn't do everything I needed to do at the time as well as the Mac could. Sure it wouldn't play Marathon, but it did play Leisure Suit Larry pretty well.

Windows 95 had a theoretical advantage over the Mac OS at the time, but did Wiki bother to tell you that it didn't have much practical advantage?

It crashed at least as much. If you ran a 16 bit program along with a "protected" 32 bit one, neither was protected.

Because the ISA bus lived on inside the machines, "Plug N Play" didn't really work. Neither did USB. It was based on DOS, even though MS, at the time, denied it.

There were so many problems that it's hardly worth mentioning all of them. By the time Apple came out with System 9, both 95 and 98 were outdated, and a total mess.

Then MS came out with "me".

Oh, the tales we can tell...
post #113 of 160
Quote:
Originally Posted by Zandros

True. I'm rather concerned about whether AMD can afford to follow Intel to 32nm, and Intel still has the NUMA, CSI and IMC cards to pull out if they can't do anything else. Though sometimes Intel just seems to go ahead on raw power, as in the case of Kentsfield and Clovertown. I can't see anything elegant in those implementations of a four-way core.

I still wonder how effective NetBurst could have been if one had eliminated the latencies and bandwidth obstructions. I'm still amazed at how Intel could increase the pipeline length of the Northwood by circa 50% and still retain the same performance at the same clock speeds. Those engineers are good.

Those two chips are simply an intermediary between the current two-chip designs and new ones with built-in memory controllers.

They aren't as bad as all that either. The cache on Intel chips is not only larger, but is more sophisticated. Each core can use the entire shared cache if it needs it, and the two share it otherwise. AMD can't do that. Also, each of the two cores on a die can go to memory separately. That mitigates some of the advantage of the on-die controller.
post #114 of 160
Quote:
Originally Posted by talksense101

It will be good to use AMD's next generation chipsets in the Mac. HyperTransport access to secondary devices, including a co-processor, will greatly boost processing power for specific tasks. If Apple were to put in a special processor/board for improving the performance of their multimedia/whatever by a large magnitude, it would benefit everyone. Intel is trying to come up with a similar offering, but AMD is ahead in this game.

http://www.anandtech.com/cpuchipsets...oc.aspx?i=2768

There is nothing impressive here.

By AMD's own timeline, they are at least 18 months to two years behind Intel on moving over to 65nm completely, which Intel had already managed. They haven't started yet. Intel will be moving to 45nm around the end of 2007 to the beginning of 2008, so AMD will be two years behind that as well.

The K8L has already proven to be a disappointment to everyone, as the tests on systems have shown. About 0 to 5% improvement over the current K8 line doesn't give them much to brag about. With high end K8L designs using 125 watts, they are well behind there as well.

As far as some of the other technologies go, they have to catch up in cache technology. At best, what they have shown will come close.

The fact that they are still going to use 3 arithmetic units, where Intel now uses 4, will continue to dog them for quite a while. Going to 4 will just consume more power.

Intel will be moving to on-die memory controllers at 45nm.

I could go on, but it would be pointless.
post #115 of 160
Quote:
Originally Posted by sunilraman

Fair enough. But personally I want to see a 45nm 5GHz 50W TDP Core-based quad-core from Intel by the end of 2007.

As do we all.
post #116 of 160
Quote:
Originally Posted by melgross

They aren't as bad as all that either. The cache on Intel chips is not only larger, but is more sophisticated. Each core can use the entire shared cache if it needs it, and the two share it otherwise. AMD can't do that.

I know, but I feel that whenever there's a problem to solve, the answer now is "throw more cache at it", as opposed to the previous "throw some more clock cycles at it". Same with the four-way chip: "throw more cores at it". That's what I mean by not very elegant. It's a very easy thing to do, and it's the same reason I'm impressed by how Intel's engineers handled the Prescott.

Quote:
Also, each of the two cores on a die can go to memory separately. That mitigates some of the advantage of the on-die controller.

They can? I thought only Xeon chips supported dual FSBs. The problem is that they still have to go through the MCH, which, as far as I know, will be a bottleneck.

Are we really sure that each process shrink draws more power? Intel always insists on almost doubling the transistor count with each shrink, so I think that plays a large part. If you just kept the same chip, you would most likely get a lower thermal envelope and higher clock speeds as a result. Transistor leakage is a problem that increases with each shrink, though -- but how much of an effect does it have?
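A back-of-envelope on that, using textbook scaling assumptions and made-up numbers:

Code:
# Per-transistor dynamic power ~ C * V^2 * f. Under ideal scaling a shrink
# cuts C and V enough that you can double the transistor count at roughly
# the same total power. Once V stops scaling (and leakage grows), doubling
# the transistors costs real power instead. All numbers are illustrative.
def chip_power(transistors, cap, volts, freq):
    return transistors * cap * volts ** 2 * freq

old_node = chip_power(100e6, 1.0, 1.3, 2.0e9)
ideal_shrink = chip_power(200e6, 0.7, 1.3 * 0.85, 2.0e9)   # C and V both scale down
v_stuck = chip_power(200e6, 0.7, 1.3, 2.0e9)               # V no longer scales
print(ideal_shrink / old_node)   # ~1.0: twice the transistors, same power budget
print(v_stuck / old_node)        # ~1.4: now the extra transistors cost power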
post #117 of 160
Quote:
Originally Posted by Zandros

...Transistor leakage is a problem that increases with each shrink, though -- but how much of an effect does it have?

I think it is a major technical challenge in going to 45nm and 32nm or lower. AFAIK.
STUPID ELECTRONS!!! We need to find another subatomic particle to use in CPUs.
post #118 of 160
Quote:
Originally Posted by sunilraman

I think it is a major technical challenge in going to 45nm and 32nm or lower. AFAIK.
STUPID ELECTRONS!!! We need to find another subatomic particle to use in CPUs.

My vote is for the bovine-related particles, moo-ons, found only in the rare Upsidaisium element, but then we would need Moose and Squirrel for that... which could give AMD their chance.
"I used to be disgusted, but now I try to be amused."
Macbook Pro 2.2
Reply
"I used to be disgusted, but now I try to be amused."
Macbook Pro 2.2
Reply
post #119 of 160
Quote:
Originally Posted by Gambit

Marijuana is not an hallucinogen.*


True, but laced with just the right kind of special spices, it makes for one heck of a delivery system.
Apple Fanboy: Anyone who started liking Apple before I did!
post #120 of 160
"Marijuanna is not an hallucinogen"

Why does everybody always correct me? This is the most PC, anal-retentive frikin' MB I have ever been on... humor doesn't have to be accurate, it just has to be silly, ironic or humorous. People who are fans of George Carlin will get that one... he would say that often during his early standups -- oh, excuse me, "stage performances". I'm gonna sneak over to a few of you people's houses and wipe my ass with your toothbrushes...

This message board just is not what it used to be. Now go ahead and bash me for my spelling and caps or what-the-fuck-ever...

You're all so wonderfully renowned for knowledge about all the wonderful things that don't make a damn, or won't even two years from now... Every frikin' post I make is torn apart, and for the negative. Crap, man, I've been on the internet since the early box modems and green-and-black screens; I think I know how to make a somewhat cohesive post.

BLAH BLAH APPLE SUCKS I DONT LIKE ITV CAPS HURT MY EYES BICKER BICKER BICKER PC PC PC AND THEN MORE BICKERING AND THEN WHEN IS APPLE GOING TO MAKE A CUBE POSTS BLAH BLAH BLU-RAY IS BETTER

"NO... DEATH RAY IS BETTER" DIE DIE DIE

DID I MENTION THAT THIS BOARD NEEDS AN ENEMA?