
The G5 and what it means for future Macs - Page 5

post #161 of 357
[quote]Originally posted by Transcendental Octothorpe:
Yeah, I've always longed for some Creedence Clearwater Revival on a per-instruction basis.
[/quote]

Heh. I always thought the band's name stood for "Condition Code Register".

:eek:
Providing grist for the rumour mill since 2001.
post #162 of 357
Um... wow. DDR-333 on that bus means 10.6 GBps!!! That kind of bandwidth in the consumer sector is only seen on ultra-high-end graphics cards, such as the GeForce 4 Ti 4600. This kind of bandwidth, coupled with PowerPC's efficient memory architecture, would create some serious competition for the PC.

Hope it's true!
post #163 of 357
I saw PC3200 on Pricewatch. What is that, like DDR400? That's insane. Put that in the G5s!
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #164 of 357
That DDR400 must be overclocked DDR333 memory... Kinda like PC150 is just overclocked PC133 memory. IIRC, Micron & the rest of the DDR developers have stated that DDR333 was going to be the end of the line before moving to DDRII.

Besides, DDR333 hasn't been out (at least not in a reliable form) for more than a few months... Given that there's damn near a year between releases of new memory standards, it seems odd to see DDR400 so soon.
post #165 of 357
The problem with real PC400 in a DIMM package is that you can only reliably have 2 banks of memory (1 DIMM) working on the motherboard. A computer with only one DIMM is kind of limited as a professional workstation.
post #166 of 357
[quote]Originally posted by Rasputin:
Um... wow. DDR-333 on that bus means 10.6 GBps!!! That kind of bandwidth in the consumer sector is only seen on ultra-high-end graphics cards, such as the GeForce 4 Ti 4600. This kind of bandwidth, coupled with PowerPC's efficient memory architecture, would create some serious competition for the PC.

Hope it's true!
[/quote]


This thread has gotten pretty long, and I'm not sure which bus you are referring to. Certainly none of the ones discussed approach 10 GBytes/sec, though most exceed 10 Gbits/sec. The RapidIO bus discussed will be in the 2-4 GBytes/sec range. HyperTransport can theoretically reach 12 GBytes/sec, so that could be it... but the speed of the bus doesn't change the speed of DDR333 memory. A 64-bit wide DDR333 implementation will have a throughput of roughly 2.5 GBytes/sec. A 128-bit implementation could double that to roughly 5 GBytes/sec. I'd be surprised to see anything wider (pleasantly surprised, mind you). These are very good speeds.
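
If you want to check the arithmetic yourself, here's the back-of-the-envelope calculation I'm using (a rough Python sketch; the clock figures are just the nominal ones from above):

[code]
# Peak DDR throughput: bus width in bytes x clock in MHz x 2
# transfers per cycle (the "double" in Double Data Rate).
def ddr_peak_gbytes(bus_bits, clock_mhz):
    return (bus_bits / 8) * clock_mhz * 2 / 1000.0   # GBytes/sec

print(ddr_peak_gbytes(64, 166))    # DDR333, 64-bit:  ~2.7 GBytes/sec
print(ddr_peak_gbytes(128, 166))   # DDR333, 128-bit: ~5.3 GBytes/sec
print(ddr_peak_gbytes(64, 133))    # DDR266, 64-bit:  ~2.1 GBytes/sec
[/code]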

There is no DDR400 spec, and DDR333 is rare enough that I don't expect to see it in an Apple machine. DDR266 is what I'm hoping to see, which ought to deliver about 2 GBytes/sec (or close to it). DDR-II is still a ways off yet. I suppose there is an outside chance Apple could use RAMBus, but I wouldn't be too happy about that... although it would be fast.

The nice thing about the new busses discussed here is that they are faster than the available memory, which means there is some room for growth.

[ 04-06-2002: Message edited by: Programmer ]
Providing grist for the rumour mill since 2001.
post #167 of 357
Well the new Macs will probably have 333 DDR because Apple would have learned from its mistakes (with some luck) and taken steps to ensure that the Macs are at least on par with future PCs. This is because Apple is targeting professional markets where PCs are far more technically advanced.
Abhor the Stereotype, respect the Individual.
1.33Ghz 15" Powerbook: 80GB HD, 1GB RAM, OSX.4.7, Soundsticks II, 320GB LaCie FW800 EXT HD, iPod 20GB 4G
post #168 of 357
[quote]Originally posted by mattyj:
Well the new Macs will probably have 333 DDR because Apple would have learned from its mistakes (with some luck) and taken steps to ensure that the Macs are at least on par with future PCs. This is because Apple is targeting professional markets where PCs are far more technically advanced.
[/quote]

Heh, you're optimistic. Actually, there are good reasons why Apple might stop short of a DDR333 implementation. It's more expensive, less common, and there have been some technical problems with PCs using it. The theoretical gain over DDR266 is only about 25%, and if the 100% gain of DDR266 over SDRAM only turns into 10-15% in the benchmarks, DDR333 would yield a realized improvement of only 2.5-3%; it may not be worth it for Apple to pay the price and take the risks. It would be a better idea to spend the extra money to increase the base RAM configuration so that MacOSX runs faster.
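
To spell the arithmetic out (a toy calculation only, using the ballpark figures above):

[code]
# DDR266 doubled theoretical bandwidth over SDRAM (+100%) yet only
# showed a 10-15% benchmark gain, so assume roughly 10-15% of any
# theoretical bandwidth gain is actually realized.
ddr333_theoretical_gain = 0.25    # DDR333 vs DDR266 peak bandwidth
for realized_fraction in (0.10, 0.15):
    print(f"realized gain: {ddr333_theoretical_gain * realized_fraction:.1%}")
# prints 2.5% and 3.8% -- in line with the 2.5-3% figure above
[/code]
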
Providing grist for the rumour mill since 2001.
post #169 of 357
[quote]Originally posted by Programmer:
This thread has gotten pretty long, and I'm not sure which bus you are referring to. Certainly none of the ones discussed approach 10 GBytes/sec, though most exceed 10 Gbits/sec. The RapidIO bus discussed will be in the 2-4 GBytes/sec range. HyperTransport can theoretically reach 12 GBytes/sec, so that could be it... but the speed of the bus doesn't change the speed of DDR333 memory. A 64-bit wide DDR333 implementation will have a throughput of roughly 2.5 GBytes/sec. A 128-bit implementation could double that to roughly 5 GBytes/sec. I'd be surprised to see anything wider (pleasantly surprised, mind you). These are very good speeds.
[/quote]

Well, I sort of extrapolated the system for calculating bandwidth from graphics card articles I've seen. That is, the hertz rating times sixteen. I'm sure you know it better than I do.

It must be gigabits per second, I suppose.
post #170 of 357
[quote]Originally posted by Rasputin:
Well, I sort of extrapolated the system for calculating bandwidth from graphics card articles I've seen. That is, the hertz rating times sixteen. I'm sure you know it better than I do.

It must be gigabits per second, I suppose.
[/quote]

No, gigabytes is correct... here's why:

The x16 is because many graphics boards use a 128-bit bus. Some of them may even be using 256-bit buses... although the memory and bus usually have a fairly radical organization, which is possible due to the nature of graphics (very highly parallel and very deeply pipelined). The memory is also tightly coupled to the graphics chip and not expandable. Think of it as 4 x 64-bit busses rather than one 256-bit bus. nVidia's latest highest-end graphics cards manage a little over 10 GBytes/sec (as you mentioned) by using this mechanism.

This kind of configuration doesn't really work for the main CPU, although with the advent of on-chip memory controllers and per-CPU memory some of this will move to the motherboard. If Dorsal's description of the system is accurate, then a dual G5 machine could be built where each processor has its own 128-bit bus to its own private memory, effectively giving a 2 x 128-bit memory bus. If the memory is kept very close to the processor(s) then it might be possible for the bus to be wider and/or faster than 128-bit 133 MHz double-pumped, but I won't speculate on that.

Adding more processors effectively widens the bus from a system viewpoint, although not from a single processor's viewpoint. This organization has a significant impact on the operating system, however, and it will be interesting to see how MacOSX deals with it. Processors accessing each other's memory would see a severe speed penalty -- on both processors. This would mean that either processes need to be bound to processors (limiting), and/or a sophisticated paging method for transferring whole virtual memory pages between processors is required... doing it at the cache level would involve a great deal of overhead.
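
To illustrate why binding matters, here's a toy model (the latency numbers are pure invention, just to show the shape of the problem):

[code]
# Toy NUMA model: each CPU reaches its own memory quickly and the
# other CPU's memory slowly. Figures are hypothetical.
LOCAL_NS, REMOTE_NS = 60, 240    # invented access latencies

def avg_latency_ns(local_fraction):
    return local_fraction * LOCAL_NS + (1 - local_fraction) * REMOTE_NS

for frac in (1.0, 0.9, 0.5):
    print(f"{frac:.0%} local accesses -> {avg_latency_ns(frac):.0f} ns average")
# Binding a process to one CPU keeps the local fraction near 100%;
# migrating whole VM pages is one way to restore locality after a move.
[/code]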

If Dorsal is accurate and Apple gets the OS right, these machines will be very fast -- especially in multiprocessor configurations.

I should also add that they could build these machines with two processors on the same daughtercard, sharing the same memory. This is probably more likely, but without knowing more about the on-chip memory controller it's hard to say how it would work.

[ 04-07-2002: Message edited by: Programmer ]
Providing grist for the rumour mill since 2001.
post #171 of 357
As long as the G5s are fast compared to PCs, nothing else matters, because that is the main reason Apple isn't doing well in some professional markets.
Abhor the Stereotype, respect the Individual.
1.33Ghz 15" Powerbook: 80GB HD, 1GB RAM, OSX.4.7, Soundsticks II, 320GB LaCie FW800 EXT HD, iPod 20GB 4G
post #172 of 357
[quote]Originally posted by mattyj:
As long as the G5s are fast compared to PCs, nothing else matters, because that is the main reason Apple isn't doing well in some professional markets.
[/quote]

No computer company is doing well at this time.
I heard that geeks are a dime a dozen, I just want to find out who's been passin' out the dimes
----- Fred Blassie 1964
post #173 of 357
[quote]Originally posted by mattyj:
As long as the G5s are fast compared to PCs, nothing else matters, because that is the main reason Apple isn't doing well in some professional markets.
[/quote]

Is there any actual evidence (e.g. user surveys) for this oft-repeated assertion?
post #174 of 357
[quote]Originally posted by Nonsuch:
Is there any actual evidence (e.g. user surveys) for this oft-repeated assertion?
[/quote]

Considering that PowerMac sales are half what they were in 1999, it is a valid assertion.
post #175 of 357
If there is no such thing as DDR 400MHz, could someone tell me what the 500MHz DDR L3 cache in the Dual Tower is? I really don't know.
post #176 of 357
I think the main difference between having 500MHz DDR as a cache and having it as main memory is that memory at that speed is not reliable enough to be used in large amounts.

Someone correct me if I'm wrong, but it is only the very basics I understand. [Hmmm]
Abhor the Stereotype, respect the Individual.
1.33Ghz 15" Powerbook: 80GB HD, 1GB RAM, OSX.4.7, Soundsticks II, 320GB LaCie FW800 EXT HD, iPod 20GB 4G
post #177 of 357
It's not so much the amount as it is the trace distances on the motherboard. Also, RAM going through a DIMM slot will limit the speed. That's one reason why graphics RAM is generally faster than main RAM.
post #178 of 357
[quote]If there is no such thing as DDR 400MHz, could someone tell me what the 500MHz DDR L3 cache in the Dual Tower is? I really don't know.[/quote]

The cache is SRAM, main memory would be DDR SDRAM. Different memory design, basically. (SRAM is much, MUCH more expensive than DDR SDRAM...)
post #179 of 357
[quote]Originally posted by mattyj:
Well the new Macs will probably have 333 DDR because Apple would have learned from its mistakes
[/quote]

I hope Apple has really learned from its mistakes and will never again use memory that's not common in the market!
post #180 of 357
Nostradamus writes:
"Considering that PowerMac sales are half what they were in 1999, it is a valid assertion."

No, actually, it's not a valid assertion, considering all PC sales have been hit incredibly hard since 1999. Since then there's been an industry-wide slowdown, as everyone knows. It isn't limited to PowerMacs, or due to the fact that some think they are underpowered.
"We're not gonna stop."
- Steve Jobs
post #181 of 357
I think Apple has done remarkably well, and MUCH better than most PC makers.
post #182 of 357
[quote]Originally posted by Gambit:
I think Apple has done remarkably well, and MUCH better than most PC makers.
[/quote]

Yes, and if you consider the kind of crappy chips they received from Mot, they are true geniuses.
post #183 of 357
[quote]Originally posted by Gamblor:
That DDR400 must be overclocked DDR333 memory... Kinda like PC150 is just overclocked PC133 memory. IIRC, Micron & the rest of the DDR developers have stated that DDR333 was going to be the end of the line before moving to DDRII.
[/quote]

Well, in a way it is overclocked PC2700; that is, there is no JEDEC standard for a PC3200 module yet. JEDEC is the standards organization that determines the specifications for industry-standard memory. They only recently (last week?) ratified the final design for PC2700 modules. And actually, Micron and Samsung are the biggest proponents, and as yet the only companies which have stated they will produce and sell DDR I 400MHz memory. The others are waiting for DDR II.

The complication, as others have stated, is not in getting the silicon to run at 400MHz+ but in interfacing that silicon with the system. The reason there is almost no performance advantage for PC2700 and DDR400 over PC2100 is that the latency is being increased to achieve the signal tolerance necessary for the modules to be certified. Thus while the memory runs faster, it takes more time for the signals to travel, rendering the advantage nearly null.
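
A quick sketch of that trade-off (the CAS figures here are hypothetical, picked only to show the effect):

[code]
# First-word access time = CAS latency (cycles) / memory clock (MHz).
def cas_ns(cas_cycles, clock_mhz):
    return cas_cycles / clock_mhz * 1000.0

print(cas_ns(2.0, 133))   # PC2100 @ CL2:   ~15.0 ns
print(cas_ns(2.5, 166))   # PC2700 @ CL2.5: ~15.1 ns
print(cas_ns(3.0, 200))   # DDR400 @ CL3:   ~15.0 ns
# The clock rises but the timings loosen, so real access time stays
# roughly flat -- hence the nearly null advantage.
[/code]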

[ 04-08-2002: Message edited by: Eskimo ]
post #184 of 357
[quote]Originally posted by Tarbash:
Nostradamus writes:
"Considering that PowerMac sales are half what they were in 1999, it is a valid assertion."

No, actually, it's not a valid assertion, considering all PC sales have been hit incredibly hard since 1999. Since then there's been an industry-wide slowdown, as everyone knows. It isn't limited to PowerMacs, or due to the fact that some think they are underpowered.
[/quote]

Yes, but Apple's market share has been steadily declining in both the professional and consumer markets.
post #186 of 357
Oh I wanna jump in!
Since Microsoft's market share is 90% of the OS market, Apple is dead anyway. But wait a minute: more supply, less demand! Ha, M$ is screwed (like the logic used in this post).
"Its a good thing theres no law against a company having a monopoly of good ideas. Otherwise Apple would be in deep yogurt..."
-Apple Press Release
post #187 of 357
Oh! And the G5 will be demoed at WWDC and shipped in August!
"Its a good thing theres no law against a company having a monopoly of good ideas. Otherwise Apple would be in deep yogurt..."
-Apple Press Release
post #188 of 357
Seriously though, today on MacCentral an Apple representative said this WWDC will be the best ever. For me the best was when the G4 was demoed. Maybe the G5 will be demoed at this one?
post #189 of 357
[quote]Originally posted by Outsider:
Seriously though, today on MacCentral an Apple representative said this WWDC will be the best ever. For me the best was when the G4 was demoed. Maybe the G5 will be demoed at this one?
[/quote]

Just like MWSF 2002 was "Beyond the Rumor Sites. Way Beyond."
post #190 of 357
Well, with 'Ten' they've got the platform now to do some interesting stuff.

Bug fixes aside, it'll be interesting to see where Apple takes the 'Ten' OS in terms of pushing the OS envelope. I'm looking forward to seeing how the Quartz engine can be further explored in the Ten interface... and used in apps in general.

I'd love to see the G5 demoed. But maybe it will be behind closed doors?

Perhaps for another few years the focus will continue to be developers and 'X'.

It's nice to hear talk of Unix and Java developers coming to 'our side' in droves.

With graphical parity with PCs in terms of nVidia cards and OpenGL adoption in the OS... the Mac just needs a good CPU hike to set the platform on fire, alongside a suitable motherboard revamp.

After being stuck with the G4 for years now... I can hardly wait for the G5. Still, whether a few months or 8 months, I guess it ain't that long to wait for the 'ultimate' Mac.

:cool:

Lemon Bon Bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #191 of 357
[quote]Originally posted by Lemon Bon Bon:
With graphical parity with PCs in terms of nVidia cards and OpenGL adoption in the OS...
[/quote]

Apple may have reached parity in terms of nVidia & ATI hardware, but there are still some capabilities of that hardware which are not exposed to developers yet. They're working on at least some of it, but I wish they were more open to specific extensions like the hardware guys are free to provide on the PC. The OpenGL ARB is just too slow to react; I have a hard time believing that they can keep up with DirectX going forward -- they'll always be a few steps behind.
Providing grist for the rumour mill since 2001.
post #192 of 357
"The OpenGL ARB is just too slow to react, I have a hard time believing that they can keep up with DirectX going forward -- they'll always be a few steps behind."

An interesting point that has given me cause for concern, Programmer.

However, is it me...or is the graphical fidelity of Open Gl streets ahead of Direct X's 'cardboardy cut out' appearance..?

Particularly noticed in Unreal Tourney.

Direct X is aggressive on it's development curve.

But I don't see any games that are doing something Open Gl can't do? Mind you. I don't play that many of the latest games so...

Do Apple sit on the Open Gl development board like they do Firewire, Hypertransport consortiums etc? Maybe they could influence pace of Open Gl development?

Lemon Bon Bon :cool:
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #193 of 357
[quote]Originally posted by Lemon Bon Bon:
"The OpenGL ARB is just too slow to react; I have a hard time believing that they can keep up with DirectX going forward -- they'll always be a few steps behind."

An interesting point that has given me cause for concern, Programmer.

However, is it me... or is the graphical fidelity of OpenGL streets ahead of DirectX's 'cardboardy cut-out' appearance?

Particularly noticeable in Unreal Tourney.

DirectX is aggressive in its development curve.

But I don't see any games doing something OpenGL can't do? Mind you, I don't play that many of the latest games, so...

Does Apple sit on the OpenGL development board like it does the FireWire and HyperTransport consortiums, etc.? Maybe it could influence the pace of OpenGL development?
[/quote]

The visual quality of DirectX and OpenGL is essentially the same because they run on the same hardware. Any perceived differences will generally come down to specific driver, application, or art issues.

Apple is on the OpenGL ARB, and they are currently in the process of pushing a couple of new standard extensions through the approval process... but it is taking a heck of a long time. The graphics business is moving really fast, and the ARB has a history of not keeping up.

Now that nVidia & ATI are starting to get some serious market penetration with the advanced cards (GeForce3, GeForce4, Radeon 8500, XBox) you will start seeing more games which take advantage of these features. By the end of this year there will be reasonably priced cards out that'll blow your mind. Apple should be front and center, leading the charge... instead they are still rendering Aqua on the CPU. [No]
Providing grist for the rumour mill since 2001.
post #194 of 357
"The visual quality of DirectX and OpenGL is essentially the same because they run on the same hardware. Any perceived differences you see will generally be specific driver, application, or art issues.

Hmmmm.

Apple is on the OpenGL ARB, and they are currently in the process of pushing a couple of new standard extensions through the approval process... but it is taking a heck of a long time. The graphics business is moving really fast, and the ARB has a history of not keeping up.

Here's hoping Apple can don the Steel toe cap enroute to ARB BUTT!

Now that nVidia & ATI are starting to get some serious market penetration with the advanced cards (geForce3, geForce4, Radeon8500, XBox) you will start seeing more games which take advantage of these features. By the end of this year there will be reasonably priced cards out that'll blow your mind. Apple should be front and center, leading the charge... instead they are still rendering Aqua on the CPU. "

Yeah. Inevitable. These features will become standard. It's the first clutch of cards that can do the 'features' without showing a crawling framerate. The next range of cards will consolidate that and then some.

Yeesh. GeForce 5, anyone? That's what I hope is going to power my G5 purchase come the end of the year/next year.



Lemon Bon Bon

'Leading the charge'. Yes, I hope so. But their CPUs and motherboards are behind the curve, not in front of it. It's been that way for a couple of years now. They're about due something to shake the PC market up. I just hope the G5 is everything the rumours say it is. That with an nVidia '5' should equal a perfect '10'? (Okay, shoot me now... groan...)
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #195 of 357
Say, where did Dorsal go? Not that we weren't having enough fun without him/her...



C'mon, Dorsal. Spill the beans on performance.

And these cases. Fancy or not?

Show us the money.



Lemon Bon Bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #196 of 357
[quote]C'mon, Dorsal. Spill the beans on performance.

And these cases. Fancy or not?

Show us the money.[/quote]

C'mon Apple! Demo the G5 in May. I'm ready to show you my money.

[Hmmm]
post #197 of 357
"C'mon Apple! Demo the G5 in May. I'm ready to show you my money."

Amen, brother.



Lemon Bon Bon
We do it because Steve Jobs is the supreme defender of the Macintosh faith, someone who led Apple back from the brink of extinction just four years ago. And we do it because his annual keynote is...
post #198 of 357
[quote]Originally posted by Lemon Bon Bon:
"C'mon Apple! Demo the G5 in May. I'm ready to show you my money."

Amen, brother.

Lemon Bon Bon
[/quote]

Yeah, c'mon Apple. We want to replace our suite of 15 G4 400 AGPs. Give us the G5 NOW (and make sure it's so fast that everyone thinks it's a cruel hoax).
Greatly Insane
post #199 of 357
If they demo the G5, what will that do to PowerMac sales?

Should just about kill 'em, don'tcha think? The further away from actual release, the more damage done.
post #200 of 357
[quote]Originally posted by justinKaisse:
If they demo the G5, what will that do to PowerMac sales?

Should just about kill 'em, don'tcha think? The further away from actual release, the more damage done.
[/quote]

Absolutely. Why would they bother to preview it if it weren't for sale immediately?