AppleInsider › Forums › Mac Hardware › Future Apple Hardware › Intel's 64bit 4GHz Quad processor vs. IBM future CPU in Mac's.

Intel's 64bit 4GHz Quad processor vs. IBM future CPU in Mac's. - Page 2

post #41 of 73
EVERYONE NEEDS TO WAKE UP TOMORROW and somehow imagine what it would be like to run LINUX on their x86 machine (Debian, Red Hat, Mandrake, etc.).

They might get out their two- or three-year-old computer and try it out. Go to Barnes and Noble or Borders and get a book, and start by installing the thing. Upgrade your hard drive if it's below the minimum requirements... do the nasty and just do it.

They need to ask themselves: does it work for me?
WILLYWALLOO'S: MostlyMacly: Rumors. Read about the timeline beyond our time.
PENFIFTEENPRODUCTIONS: We like what we do.
post #42 of 73
Quote:
Originally posted by wizard69
What I find perplexing is that we did not see such negativity when the industry changed from 16 bit to 32 bit hardware.

A minor point perhaps, but this is simply not so. I have been researching some computing history for my senior thesis, and spent a while reading old PC Weekly mags. To put it mildly, the 16-32 bit shift was hugely controversial, especially concerning compatibility. A lot of people were screaming bloody murder in the letters pages when it emerged that they would be left behind.

And remember, there were far better reasons for that switch, at least in terms of computing power. There aren't any such reasons yet for the 32-64 bit switch, but that sure as heck isn't gonna stop the whining.....
post #43 of 73
Quote:
Originally posted by wizard69
As far as consumer applications go I think it is only a matter of developers seeing enough 64 bit hardware in the wild for them to target it. Games are one item that could make immediate use of the extended address space 64 bits offers. Media editing programs are not far behind. I would have to say that there are actually a number of potential consumer applications that could take advantage of 64 bits.

The only hold-up we have is the adoption of 64 bit hardware. There has to be enough hardware out there to enable profitable sales of the software. This is where Apple has the potential to lead if it can transition its consumer lines quickly to 64 bit.

Few (if any) games need a 64-bit address space, and the trade-off of having 8 byte pointers would typically make this a poor design choice. There might be a few data structures that would benefit from a sparse 64-bit address space, but in most cases there are better alternatives that work more efficiently in a 32-bit space. The same is true for most consumer applications. Media editing programs (at least for video) are an exception that could benefit from the huge address space, but solutions are already in place to support data streaming so the payoff isn't too dramatic.
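Programmer's pointer-size trade-off is easy to put numbers on. A rough sketch in Python (the two-pointers-plus-payload node is hypothetical, assuming fields are padded to the pointer's natural alignment):

```python
# Rough sketch of how pointer width inflates a pointer-heavy structure.
# Hypothetical binary-tree node: two child pointers plus a 4-byte payload,
# rounded up to the pointer's natural alignment.
def node_size(pointer_bytes, payload_bytes=4):
    raw = 2 * pointer_bytes + payload_bytes
    align = pointer_bytes
    return (raw + align - 1) // align * align  # round up to alignment

size32 = node_size(4)  # 12 bytes with 4-byte pointers
size64 = node_size(8)  # 24 bytes with 8-byte pointers
print(size32, size64)
```

Twice the bytes per node means half as many nodes fit in each cache line, which is where the performance loss on identical hardware comes from.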
Providing grist for the rumour mill since 2001.
post #44 of 73
Quote:
Originally posted by onlooker
You must have missed the presentation.

Notice that "32/64 bit" mealy-mouthedness. Prescott is a 32-bit processor PERIOD. It may have some small 64-bit aspects, but they're nothing new.

Intel is spreading FUD about 64-bit like a thick layer of manure.
post #45 of 73
We're talking about Nocona, not Prescott, in case you still haven't noticed.
Matyoroy!
post #46 of 73
Thread Starter 
Quote:
Originally posted by G-News
We're talking about Nocona, not Prescott, in case you still haven't noticed.

What he said.
onlooker
http://www.apple.com/feedback/macpro.html
post #47 of 73
Quote:
Originally posted by cubist
Notice that "32/64 bit" mealy-mouthedness. Prescott is a 32-bit processor PERIOD. It may have some small 64-bit aspects, but they're nothing new.

Intel is spreading FUD about 64-bit like a thick layer of manure.

You really don't know what you're talking about. Prescott, using Socket 775 from memory, is a full 64 bit chip with 32 bit backward compatibility, just like the Opteron. The current Prescott has the same functionality; it is just unusable, like HT was in early P4s. If you've read Intel's programmer notes you will even notice some very distinct similarities to AMD's notes for the Opterons, so if Intel is producing 32 bit chips then so is AMD. If you think that, you need to go back and read up a bit.

Quote:
Originally posted by G-News
We're talking about Nocona, not Prescott, in case you still haven't noticed.

Nocona is the same core as Prescott. The only real differences are MP support, bus speeds and cache sizes.
"When I was a kid, my favourite relative was Uncle Caveman. After school, we'd all go play in his cave, and every once in a while, he'd eat one of us. It wasn't until later that I discovered Uncle...
post #48 of 73
Quote:
Originally posted by Programmer
Few (if any) games need a 64-bit address space, and the trade-off of having 8 byte pointers would typically make this a poor design choice. There might be a few data structures that would benefit from a sparse 64-bit address space, but in most cases there are better alternatives that work more efficiently in a 32-bit space. The same is true for most consumer applications. Media editing programs (at least for video) are an exception that could benefit from the huge address space, but solutions are already in place to support data streaming so the payoff isn't too dramatic.

Actually, the 32/64 split in consumer desktops has only one function... to sell more computers. Let's be honest: if you have a fairly recent vintage computer (about 3-4 years old now) and you go online, play solitaire, send email, do spreadsheets and play MP3s, any computer will have more than enough power for you. The tech industry is just making up excuses to get you to buy a new one. Seriously, in those applications you probably wouldn't even notice the speed increase from a 1 GHz Pentium to a 3.5 GHz Pentium.

Convincing people to edit video on their computer and play more complex games is the only way to get them to buy a new computer. The computer industry can't live on $500 computers alone.
post #49 of 73
Thread Starter 
Quote:
Originally posted by jade
Actually, the 32/64 split in consumer desktops has only one function... to sell more computers. Let's be honest: if you have a fairly recent vintage computer (about 3-4 years old now) and you go online, play solitaire, send email, do spreadsheets and play MP3s, any computer will have more than enough power for you. The tech industry is just making up excuses to get you to buy a new one. Seriously, in those applications you probably wouldn't even notice the speed increase from a 1 GHz Pentium to a 3.5 GHz Pentium.

Convincing people to edit video on their computer and play more complex games is the only way to get them to buy a new computer. The computer industry can't live on $500 computers alone.

The same argument was used when desktops transitioned from 16 to 32 bit. Look how that turned out.
onlooker
http://www.apple.com/feedback/macpro.html
post #50 of 73
you have to be honest enough to admit that 16bit is more than enough for word processing.
Matyoroy!
post #51 of 73
Quote:
Originally posted by G-News
you have to be honest enough to admit that 16bit is more than enough for word processing.

Not really. I have a document that's 1100+ pages long and has 400 000+ words, or around 2 million characters. That in fact isn't even the full document. I have that again in a separate file. A 16 bit computer simply couldn't handle that.
"When I was a kid, my favourite relative was Uncle Caveman. After school, we'd all go play in his cave, and every once in a while, he'd eat one of us. It wasn't until later that I discovered Uncle...
post #52 of 73
why not?
Matyoroy!
post #53 of 73
Yeah? Well I got a document that's 16 billion pages long, so I really need a 64-bit word processor.
post #54 of 73
Quote:
Originally posted by onlooker
The same argument was used when desktops transitioned from 16 to 32 bit. Look how that turned out.

People keep bringing that up, but what isn't understood is that the change in magnitude of the 16->32 vs 32->64 shifts is vastly different. Yes there are good uses for 64-bit address spaces, but most software doesn't need it at all and will actually lose some performance on identical hardware by using 8-byte pointers. Not to mention losing the ability to run on 32-bit hardware.

Furthermore, the ability to do 32-bit math efficiently was an enormous boon when moving from 16-bit machines. Thanks to the ubiquitous floating point hardware, with support for 32-bit and 64-bit float types, most of the math needs are already covered. Yes, some software will benefit from native 64-bit integer math... but the amount of software where that is actually the performance bottleneck is tiny.
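To make the comparison concrete, here is a sketch of what a 32-bit machine has to do to add two 64-bit integers: split each operand into halves and propagate the carry by hand (the function name and masks are illustrative, not any particular compiler's output):

```python
# Emulating a 64-bit add with 32-bit operations: two adds plus a carry,
# roughly what a compiler emits for 64-bit ints on a 32-bit CPU.
MASK32 = 0xFFFFFFFF

def add64_on_32bit(a, b):
    lo = (a & MASK32) + (b & MASK32)  # add the low halves
    carry = lo >> 32                  # did the low add overflow?
    hi = ((a >> 32) + (b >> 32) + carry) & MASK32
    return (hi << 32) | (lo & MASK32)

assert add64_on_32bit(0xFFFFFFFF, 1) == 0x100000000  # carry crosses halves
```

Native 64-bit hardware does this in one instruction, which is the benefit only a small amount of software is actually bottlenecked on.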

For most users the reality of the arrival of 64-bit hardware is something of a yawner. The boys in marketing will try to dance around that, of course.
Providing grist for the rumour mill since 2001.
post #55 of 73
Quote:
Originally posted by Programmer

For most users the reality of the arrival of 64-bit hardware is something of a yawner. The boys in marketing will try to dance around that, of course.

Yup, the only people who think their computers are too slow are editing video, have computers from 1995, want more fps in video games, or are using dial-up.

The average user really has no use for 64 bit.

And I will fully admit I have no use for 64 bit, but that will not stop me from getting a G5 PowerBook.
post #56 of 73
Thread Starter 
64 bits can also help 3D rendering, movie editing, and, if implemented correctly, most apps that people complain about being too slow. I don't see it as a yawner. More like an opportunity.
onlooker
http://www.apple.com/feedback/macpro.html
post #57 of 73
You know, 64 bit still doesn't fix the fact that most systems have one or more bottlenecks, that software is poorly written, or that hard drives happen to be slow, which seem more like the three prime reasons for things being slow, if you ask me.
Matyoroy!
post #58 of 73
Quote:
Originally posted by onlooker
64 bits can also help 3D rendering, movie editing, and, if implemented correctly, most apps that people complain about being too slow. I don't see it as a yawner. More like an opportunity.


How are you thinking that it helps...

3D rendering? Since most of the calculations are in 64-bit floating point, 64-bit integers and 64-bit addressing aren't required. If you mean that the data sets are 2+ GB in size, then 64-bit addressing will certainly help, but the number of people doing that is tiny and the performance of processing that much data is still going to be painfully slow (or could be handled by streaming the data instead).

Movie editing? Since individual frames are much much less than 4 GB and streaming solutions are already in place the benefit of 64-bits here is more due to programmer laziness than any real performance improvement... unless you actually have >2 GB of RAM. For the pro video editors in the crowd 64-bit addressing becomes a big improvement, but this is a pretty small set of users. Important and vocal, but certainly not most of Apple's users. With a data set of this size, performance isn't going to be stellar and probably not much different than using a streamed mechanism.
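The streaming alternative mentioned above can be sketched in a few lines: process a fixed-size window of the media at a time, so the resident footprint stays constant no matter how large the source is (the frame size and flat byte-stream framing here are hypothetical simplifications):

```python
import io

def stream_frames(stream, frame_bytes):
    """Yield fixed-size frames one at a time; only one frame is resident."""
    while True:
        frame = stream.read(frame_bytes)
        if len(frame) < frame_bytes:
            break  # end of stream (trailing partial frame ignored)
        yield frame

# A 35-byte "clip" with 10-byte frames yields 3 whole frames.
clip = io.BytesIO(b"x" * 35)
print(sum(1 for _ in stream_frames(clip, 10)))  # 3
```

This is why a 32-bit editor can already handle files far larger than its address space: the working set, not the file, is what has to fit.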

64-bit hardware gives you very little in terms of speed. It does give new capability, but the number of users and software that need this capability is very small. There are definitely some places where 64-bit can make a big difference, but (I repeat) they are not common and they are not what most users will experience. The big ones for Apple are probably in scientific computing since that is an important area for Apple to gain mindshare and leverage their Unix basis.
Providing grist for the rumour mill since 2001.
post #59 of 73
Quote:
Originally posted by Programmer
How are you thinking that it helps...

64-bit hardware gives you very little in terms of speed.

I disagree. The majority of current 64 bit systems happen to be mid-to-high-end servers.
These machines are generally used for high-end apps that couldn't run efficiently on a 32 bit server.
Most of these machines require 4 GB of memory per CPU and have more than 4 CPUs.
Most of these machines will have database functionality, with databases larger than 4 GB.
Imagine Google's performance on 32-bit machines.
A web server that loads static content into memory has major gains over those loading from hard drives.
Encryption has huge performance advantages, especially with 256, 512 or 1024 bit encryption.

Using a 64 bit system for surfing the net, reading e-mail and doing basic Word/Excel stuff is a waste of time. In saying this, you could argue it is just as valid for a 32 bit system.

Just use the right tool for the right job.

64 bit is old hat and Intel just has its marketing cap on.

Dobby.
post #60 of 73
I predict Apple will need 64 bit integer and 4+ Gbyte of memory for the next generation "Finder".
post #61 of 73
Quote:
Originally posted by dobby
I disagree. The majority of current 64 bit systems happen to be mid-to-high-end servers.
These machines are generally used for high-end apps that couldn't run efficiently on a 32 bit server.
Most of these machines require 4 GB of memory per CPU and have more than 4 CPUs.
Most of these machines will have database functionality, with databases larger than 4 GB.
Imagine Google's performance on 32-bit machines.
A web server that loads static content into memory has major gains over those loading from hard drives.
Encryption has huge performance advantages, especially with 256, 512 or 1024 bit encryption.

Using a 64 bit system for surfing the net, reading e-mail and doing basic Word/Excel stuff is a waste of time. In saying this, you could argue it is just as valid for a 32 bit system.

Just use the right tool for the right job.

64 bit is old hat and Intel just has its marketing cap on.

Dobby.

Uhh... okay, but when was I talking about servers? I was saying most users have no need for 64-bit's capabilities.

If you are talking about servers running huge databases then absolutely 64-bit systems are a tremendous benefit. The main win is the massive address space so that the entire database can live in RAM (or at least large portions of it).

For encryption SIMD units are usually a bigger win than 64-bit integer math.
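The one place wide integer registers clearly do help crypto is multi-precision arithmetic: a big operand is stored as an array of machine-word "limbs", and doubling the limb width halves the array. A sketch of the arithmetic only (limb counts, no actual cipher):

```python
# Limb counts for multi-precision integers: a 1024-bit operand needs
# half as many limbs on a 64-bit machine, and schoolbook multiplication
# costs roughly limbs**2 single-word products.
def limbs(value_bits, limb_bits):
    return (value_bits + limb_bits - 1) // limb_bits  # ceiling division

assert limbs(1024, 32) == 32
assert limbs(1024, 64) == 16  # half the limbs...
assert limbs(1024, 32) ** 2 == 4 * limbs(1024, 64) ** 2  # ...~1/4 the products
```

Even so, well-tuned SIMD implementations usually buy more in practice than the wider scalar unit does.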
Providing grist for the rumour mill since 2001.
post #62 of 73
Sorry, I jumped the gun.
I can't actually think of any current benefit of a 64 bit system for the average PC user.
I also think anything more than 1 GHz is a waste for the average PC user.
When 64 bit becomes standard on the desktop (2010) then we will perhaps see new applications appear that utilise the extra addressing.

The benefit would still probably only occur when you have 4 GB or more of memory, and while prices are cheaper, I can't afford 4 GB for a home PC (or a work one).

Dobby.
post #63 of 73
Quote:
Originally posted by dobby
while prices are cheaper I can't afford 4GB for a home PC (or work one).

Don't worry, in a few years, you will be able to afford a consumer PC with 4 GBytes of RAM. Just by doubling every year from a typical 512 MB today, there will be PCs for sale with 4 GBytes of RAM in 3 years. You may or may not need it, but it is necessary for PC vendors to do it to maintain the upgrade cycle.
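The projection in that post is just a doubling series, easy to check:

```python
# RAM projection from the post: start at a typical 512 MB and double
# once a year; year 3 lands exactly on 4 GB.
base_mb = 512
timeline = [base_mb * 2 ** year for year in range(4)]
print(timeline)  # [512, 1024, 2048, 4096]
```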
post #64 of 73
Hi Programmer;

I have to disagree with you completely with respect to the issue of games driving 64 bit adoption at the consumer level. You would have to admit that games have been driving computer hardware performance for some time, and in fact gamers are the ones that purchase the high-end systems equipped with a large amount of memory and advanced GPUs. The reason is the reality of increased performance. That performance can be measured by the speed at which the machine produces new frames or by the quality of the rendered result.

64 bits in the gaming market will take off when the next hot game comes out with 64 bit support, and is only hot when that 64 bit support is enabled. It may take a year or two, but once that game is out the market will quickly shift completely to 64 bit technology. It is a matter of being able to compete; 64 bit offers too many advantages for developers to ignore once the cat is out of the bag.

Being able to address all of that extra memory above 2 GB offers room for more data and more complex algorithms. So 64 bit games will be able to offer a richer AI experience, and a much richer visual experience. Around the time 64 bits becomes significant, we are very likely to see remarkably enhanced displays at reasonable price points. Instead of the 1280 x 1024 displays of today, affordable displays of more than 3000 x 2000 pixels are not far off.

The advantages don't rest with just the game developers themselves. You now have much more room for traditional OS activities such as caching slower devices like CD-ROM drives. Larger memory systems mean far less paging. Many of the 64 bit systems support a full 32 bit address range for 32 bit processes, which is a significant improvement; in many cases doubling the memory available to an application.

Take all of these things together, along with other things happening in the gaming market, and the reality of 64 bits sinks in pretty fast. Today's games will be as humorous as Pong is now. But these are just arguments for games and 64 bit hardware; this does not mean that other software for the consumer space would not also benefit.

How much more efficient a program designed for 32 bit systems is over a 64 bit implementation is entirely dependent on the problem domain. If you end up doing a great deal of 64 bit math on a 32 bit system, then you could end up wasting as much space on additional program code as you would on 64 bit pointers and simplified logic. Then we also have to mention the additional maintenance issues. Even if the application itself does not benefit from 64 bit addressing and does not use it, the program still benefits from 64 bit technology. The most direct benefit is access to the full 32 bit address space instead of 1.5, 2 or 3 GB of memory.

On today's 32 bit hardware one can start to stress a machine with photo-editing software, which is much less demanding than multimedia software. Sure, the individual data elements are not that large, but there is a huge benefit to not having them paged out to backing store when editing. It is a matter of what sort of responsiveness is important to you and what you are willing to pay for. People in business to make money from these sorts of artistic applications are often willing to pay for the hardware that gives them real advantages.

Thanks
Dave



Quote:
Originally posted by Programmer
Few (if any) games need a 64-bit address space, and the trade-off of having 8 byte pointers would typically make this a poor design choice. There might be a few data structures that would benefit from a sparse 64-bit address space, but in most cases there are better alternatives that work more efficiently in a 32-bit space. The same is true for most consumer applications. Media editing programs (at least for video) are an exception that could benefit from the huge address space, but solutions are already in place to support data streaming so the payoff isn't too dramatic.
post #65 of 73
Again I have to disagree; the move to 64 bits is the next natural evolution of the microprocessor. It is, fortunately, one that will serve us for a longer period of time than the move from 16 bits to 32 bits did.

Frankly, though, I have to wonder how anybody can maintain credibility after stating that you wouldn't notice a speed increase in a move from 1 GHz to 3.5 GHz. How could you not notice the speed difference? Maybe in a move from 3.2 to 3.4 GHz, but the speed jump you're talking about is huge; even if you only single-tasked the machine you would notice the difference.

You also have the industry a little backwards: it is software that drives the demand for faster computing hardware. A faster computer, considered as an entity, is useless without software. It doesn't matter if the software is a bloated OS or the latest and greatest shoot-'em-up; software drives the need for faster hardware.

Dave


Quote:
Originally posted by jade
Actually, the 32/64 split in consumer desktops has only one function... to sell more computers. Let's be honest: if you have a fairly recent vintage computer (about 3-4 years old now) and you go online, play solitaire, send email, do spreadsheets and play MP3s, any computer will have more than enough power for you. The tech industry is just making up excuses to get you to buy a new one. Seriously, in those applications you probably wouldn't even notice the speed increase from a 1 GHz Pentium to a 3.5 GHz Pentium.

Convincing people to edit video on their computer and play more complex games is the only way to get them to buy a new computer. The computer industry can't live on $500 computers alone.
post #66 of 73
That all depends on your word processing demands, doesn't it? For just about everything I do, vim works out fine; for many others it would be a joke. Go the "What You See Is What You Get" route and I don't think you can show that any 16 bit application would be successful. Give the user a GUI to operate in and you're pretty much completely out of the 16 bit generation.

Thanks
Dave


Quote:
Originally posted by G-News
you have to be honest enough to admit that 16bit is more than enough for word processing.
post #67 of 73
It is brought up because it is significant. Even if the major advantage of 64 bits is to enable 32 bit applications to address more memory, then it is an advantage. It is also an advantage to be able to have more 32 bit applications in memory at the same time. So this is one advantage that can help the current generation of applications.

The advantages to the applications that can make use of 64 bit hardware far outweigh the disadvantages to software that might not make good use of it. Considering that we are still seeing regular performance increases in microprocessors, the impact on 32 bit software is not even worth considering. What is worth considering is that whole classes of existing applications now have the potential to run much more efficiently on a desktop machine. We live in a world where 1 GB of memory is now the norm and it is fairly easy to expand beyond that. Within less than 6 months, 2 GB should be the norm for a desktop computer, with expansion beyond 4 GB economical. It would be silly to have this potential and not use it because of an aversion to 64 bit technology.

Outside of the existing application base we have the reality of new applications, such as the multimedia technology you have already mentioned. Beyond that are the applications whose developers are just waiting for the right hardware. Look at 64 bits (low-cost 64 bits) as an enabling technology. This is no different than the advent of 32 bit processors and the GUI and other computing advancements they enabled.

We are a long way from a HoloDeck computer or even a HAL. There are many applications between today's and the fantasies of science fiction that have yet to be developed. Many of those simply won't happen on 32 bit technology. The really good news is that the advent of 64 bit technology should last a while before something different comes along. Probably much longer than the run of 16 bit and 32 bit technology.

thanks
Dave


Quote:
Originally posted by Programmer
People keep bringing that up, but what isn't understood is that the change in magnitude of the 16->32 vs 32->64 shifts is vastly different. Yes there are good uses for 64-bit address spaces, but most software doesn't need it at all and will actually lose some performance on identical hardware by using 8-byte pointers. Not to mention losing the ability to run on 32-bit hardware.

Furthermore, the ability to do 32-bit math efficiently was an enormous boon when moving from 16-bit machines. Thanks to the ubiquitous floating point hardware, with support for 32-bit and 64-bit float types, most of the math needs are already covered. Yes, some software will benefit from native 64-bit integer math... but the amount of software where that is actually the performance bottleneck is tiny.

For most users the reality of the arrival of 64-bit hardware is something of a yawner. The boys in marketing will try to dance around that, of course.
post #68 of 73
You have some valid points, but they mostly only apply to pro users.

As for your idea of gamers buying 64 bit systems because it's cool, I happen to disagree.
I've been around gamers for a long time now, and the number of "top notch always" users is very small compared to the mid-range and "most bang for the buck" user class. By very small I mean 5% or less.

As for the gamer market adopting hardware capabilities:
Here you are wrong again. Developers will start to use technology several months or even years AFTER it is first available.
When Epic makes a 64 bit version of their dedicated UT2k4 server, then THEY are adopting technology in order to be hip, and because they are interested in whether there is any benefit.
But they only started after the first desktop chips with 64 bit integers came onto the market, although 64 bit servers have been a reality for decades; and these are the machines that dedicated servers will be run on, too. Nobody cared about 64 bit then.
John Carmack designed the Doom 3 engine with the capabilities of the FIRST GeForce in mind, later also making use of more modern GPUs. Most other games to date still don't use all the capabilities of GPUs that are 2 or 3 years old by now.

You see, the market reacts to itself in both directions:
Developers start to exploit hardware capabilities, and consumers start to buy advanced hardware.
But both of these things have a considerable lag to them, maybe 2 years on average. During this time only early adopters and advocates will start to do either of the above.

Right now, 64 bit offers obvious and dramatic improvements for things that require huge amounts of data actively stored in RAM. For pretty much everything else, the benefit will not justify the price for another year on average.

People who buy a 2.2 GHz Athlon FX-51 now will think they hit the ultimate machine spec. In reality, the only thing they observe is a very fast 32 bit processor with a dual-channel onboard DDR RAM controller. Once 64 bit applications that really benefit from the 64-bitness, not the extra RAM or the faster access to it, start to appear, the 2.2 GHz will be the absolute bottom line. Similar to the original GeForce:
People who bought it saw a speed increase in frames per second because of the faster architecture, RAM and clock speed.
They will never see the added graphical richness the card's capabilities offer, because frankly, they have already moved on to a newer card. A card they will have replaced with a newer one by the time that card's features are fully used, too.

In short: 64 bit is the future, for all markets eventually.
But for most people, the bottom line now and within the next 1-2 years will be that a move to a new 64 bit system is unnecessary.

That is what I think Intel sees as well, and is why it was trying to push the high-GHz 32 bit scheme for now, before jumping to even more complex and slower-clocked 64 bit systems, especially after seeing generation after generation of its Itanium chip fail.

So, instead of trying to prove that 64 bit is necessary NOW, try to see things with regard to the 4th dimension, time, too.
Matyoroy!
post #69 of 73
Quote:
Give the user a GUI to operate in and you're pretty much completely out of the 16 bit generation.

You've never used System 7 or earlier, have you?
Matyoroy!
post #70 of 73
Quote:
You've never used System 7 or earlier, have you?

Macs have always had processors with 32 bit words (even the 68000 in the 128K is a 32 bit CPU). Earlier 68K machines were only capable of 24 bit addressing; the Memory control panel had a 24/32 bit addressing option for compatibility reasons.
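The 24/32 bit situation Stoo describes comes down to a mask: the 68000 had 32-bit registers but only 24 external address lines, so the top byte of every pointer was ignored (the helper below is illustrative, not actual Mac OS code):

```python
# 24-bit addressing on early 68K Macs: only the low 24 bits of a pointer
# reached the bus, so the top byte was ignored -- and some software
# infamously stashed flags there, breaking under 32-bit addressing.
ADDR_MASK_24 = 0x00FFFFFF

def physical_address(pointer):
    return pointer & ADDR_MASK_24  # top 8 bits dropped, as on a 68000

assert physical_address(0xAB012345) == 0x012345  # tag byte 0xAB ignored
assert ADDR_MASK_24 + 1 == 16 * 1024 * 1024      # 16 MB addressable
```

Which is exactly why the Memory control panel needed a compatibility switch: programs that relied on the ignored byte broke once all 32 bits became significant.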
Stoo
post #71 of 73
Indeed, my bad then.
Must have been confused by the fact that they only had 16 bit data paths.
Matyoroy!
post #72 of 73
Quote:
Originally posted by wizard69
64 bits in the gaming market will take off when the next hot game comes out with 64 bit support, and is only hot when that 64 bit support is enabled. It may take a year or two, but once that game is out the market will quickly shift completely to 64 bit technology. It is a matter of being able to compete; 64 bit offers too many advantages for developers to ignore once the cat is out of the bag.

Being able to address all of that extra memory above 2 GB offers room for more data and more complex algorithms. So 64 bit games will be able to offer a richer AI experience, and a much richer visual experience. Around the time 64 bits becomes significant, we are very likely to see remarkably enhanced displays at reasonable price points. Instead of the 1280 x 1024 displays of today, affordable displays of more than 3000 x 2000 pixels are not far off.

Using the amount of memory you are talking about would cripple game performance. Most games these days have relatively small footprints in memory (< 100 MB), and much of that isn't actually used on a continual basis; it is just loaded all at once as a convenience to the developer. On the game consoles, where they don't have so much RAM to throw around, games which require more data implement streaming systems which load data just before it is needed (much as a streaming movie player like QuickTime does). The same thing can be done on a PC-class machine but often isn't, because it is simpler to rely on the OS virtual memory system... even though doing so can actually hurt performance unless it is done carefully. This means that games need to get 20-30 times "larger" before a 64-bit address space (and physical memory) is needed. This will take longer than you might think, because the cost of creating that much content for a game is already quite high, and increasing it by such a degree is prohibitive.

As for display sizes there is the problem of achieving decent framerates on huge displays, plus the resolution of the display doesn't correlate directly to the amount of memory required by the game. The game uses the GPU to render the image from geometry and textures which are largely independent of the display resolution. The trend is to make individual pixels more complex, rather than increase the number of pixels dramatically. This is partly because of the memory speed issue again... it is easier to make the GPU do more per memory unit than it is to increase the number of memory units it can process.


I don't disagree that 64-bit is coming and will likely be the norm eventually (many years), but the benefit derived from it for most software is far less than you seem to think. And don't forget that there is a cost for using 64-bit address spaces, compared to using 32-bit address spaces. I stand by my assertion that the arrival of 64-bit hardware & software in the desktop market is going to be mainly a marketing blitz. Do not expect to be wow'd by any 64-bit software that could not have been implemented as 32-bit. The large number of 32-bit machines will ensure that most developers are likely to stick with developing 32-bit software for quite a while yet.


This isn't just idle chatter for me. For several years now I've been trying to figure out what a 64-bit machine will buy us (as a game developer). The answer is "very little", and it probably isn't worth the cost. There are much more interesting parts of the hardware: FPUs, GPUs, SIMD, MP, SMT. Having 64-bit integer math doesn't help the PS2 much at all, and we're so far from needing >2 GB of memory that we wouldn't dream of wasting all that space making our pointers twice as big.
Providing grist for the rumour mill since 2001.
post #73 of 73
Quote:
Originally posted by Programmer
Using the amount of memory you are talking about would cripple game performance.

Excellent point. With 64 bit systems being able to address so much memory, you can have too much memory as well.
The OS has to monitor used and free memory, and less performant CPUs (UltraSPARC IIi, or Itanic the first) bog down just monitoring the memory. Our E4800 with 12 CPUs and 48 GB of memory takes ages (5-10 mins) to recover from dropping one of our three 10 GB databases from memory. We spent ages trying to speed it up, but our Sun engineers said it's just what happens (Solaris 10 should be a bit quicker though).

Dobby.