Simulating the entire universe is impossible. Kurt Gödel proved that no system can be entirely self-referential and self-contained; this is better known as the incompleteness theorem.
It's important to note the motivations for moving to higher-bit architectures. It is elementary to conceive of information sets with more members than can be addressed by 8 or 16 bits. The jump to 32 bits is absolutely huge. 32-bit numbers offer enough fidelity that nearly every easily quantifiable thing in your everyday life can be tracked. It is quite unlikely that you will ever have more than 2^32 distinct coins or bills come in contact with your hands during an entire lifetime. 32 bits would also be capable of managing stats on every heartbeat in an entire lifetime. (I think; someone do the math)
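The heartbeat claim is easy to check. A quick sanity check in Python, assuming roughly 70 beats per minute over an 80-year lifetime (both figures are assumptions, not from the post):

```python
# Do 32 bits suffice to count every heartbeat in a lifetime?
BEATS_PER_MINUTE = 70            # assumed average heart rate
YEARS = 80                       # assumed lifetime
MINUTES_PER_YEAR = 60 * 24 * 365

lifetime_beats = BEATS_PER_MINUTE * MINUTES_PER_YEAR * YEARS
print(f"lifetime heartbeats: {lifetime_beats:,}")  # 2,943,360,000
print(f"2^32:                {2**32:,}")           # 4,294,967,296
print("fits in 32 bits:", lifetime_beats < 2**32)  # True
```

So the claim holds, though with only about 30% headroom.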
Comments
Originally posted by TJM
IMO, it won't take long before the G5s, Opterons, and Itaniums are at their limits and people start screaming for 128-bit capabilities. My guess is that 128-bit chips will debut before 2020.
I bet we see quantum computers before we ever see a 128-bit computer (in the sense that a G4 is 32-bit and a G5 is 64-bit).
[/absurd pedantism]
Originally posted by TJM
Personally, I have great faith in software engineers' ability to build software that maxes out the capabilities of any system available. Today's desktops run programs that were in the realm of multimillion-dollar supercomputers a decade or two ago.
But that's _EXACTLY_ the point!
Today's _supercomputers_ aren't 128-bit yet. Check here: Top 500 Supercomputers. And every computer on that list _combined_ doesn't use 2^64 bytes of RAM.
Note that 1) none of them even report "bitness", 2) they _all_ report "Number of processors".
They've all been 64-bit so long it isn't worth mentioning, and research in supercomputers is dedicated more towards "How do we break this problem up to be more parallel" instead of "How soon can we go to 128 bits"... because it just isn't _that_ useful. And increased parallelism/#processors _is_.
On the other hand, _GPUs_ and Altivec are "already there". Specialized NON-general-purpose coprocessors. Game consoles claim various things too. But this is again one unit of a CPU, or a completely separate coproc -> not a complete shift across all logic units. There's just too much that is better served, and will always be better served, by moving fewer bits around.
And if Steve came up with something crazy-cool that did require more oomph and commissioned a brand new chip, we'd be better off with 1) extensions to the existing Altivec, and 2) more Altivec units. But the core units we have now (both 64-bit integer and 64-bit FP) aren't going away anytime soon. Though there might be a LOT more of them.
Because, precisely as you said: "Today's desktops run programs that were in the realm of multimillion-dollar supercomputers a decade or two ago." And supercomputers ain't anywhere near desiring 128-bitness yet.
Originally posted by TJM
. . . IMO, it won't take long before the G5s, Opterons, and Itaniums are at their limits and people start screaming for 128-bit capabilities. My guess is that 128-bit chips will debut before 2020.
I don't think so. The driving force for 64 bits is to address more memory. Let's look at the history of memory usage, roughly. Say it took about four years to go from 16 bits to 32 bits. Let's assume 8 bit memory was at its limit at the beginning of 16 bit CPUs and 16 bit memory was at its limit at the end of 16 bit CPUs. So memory usage doubled 8 times in those four years, or doubling every six months. This was a rapid increase in need for memory.
We are just getting to 64 bits now, so let's apply the same logic. Memory usage doubled 16 times in 20 years, or doubling every 15 months. The rate of memory growth has slowed down by a factor of two and a half. If the next memory growth has another factor of two and a half, it will double every 36 months, or three years. However, this time it must double 32 times before we need 128 bit memory.
Doing the math, 32 times three years is 96 years before we need 128-bit CPUs. This is a rough and crude estimate based on just a few data points, but it is better than nothing.
Now I guess they should have banned me rather than just shut off posting priviledges, because kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.
Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
Originally posted by AirSluf
If I remember correctly, there aren't enough atoms in the earth to provide full memory for a 128-bit computer's address space, does that put it in perspective?
128-bit addressing might be useful as soon as you need anything over 64 bits. That need is still a long way off, but if you wanted to address 32 exabytes rather than a mere 16 exabytes -- requiring a 65-bit address -- you might want to go straight to 128-bit address registers, even if most of those bits won't be used.
I did a hypothetical calculation of the physical size of a full 2^128 byte (2^131 bit) block of memory, considering each addressable bit as an iron atom, and came up with a cube roughly 3.2 kilometers, or 2 miles, on each side. Impractically huge, but not enough to consume all the Earth's atoms by far, even if you allow for many more than one atom per addressable bit. (I don't think we'll reach even the limits of 64-bit memory without developing molecular- and atomic-scale memory and processing components.)
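That cube figure checks out. A sketch of the arithmetic in Python, using standard handbook values for iron (density 7.87 g/cm³, molar mass 55.85 g/mol):

```python
# Volume of 2^131 iron atoms: one atom per addressable bit of a 2^128-byte memory.
AVOGADRO = 6.022e23    # atoms per mole
FE_DENSITY = 7.87      # g/cm^3
FE_MOLAR_MASS = 55.85  # g/mol

atoms = 2 ** 131                                     # bits in 2^128 bytes
atoms_per_cm3 = FE_DENSITY / FE_MOLAR_MASS * AVOGADRO
volume_m3 = atoms / atoms_per_cm3 / 1e6              # cm^3 -> m^3
side_km = volume_m3 ** (1 / 3) / 1000
print(f"cube side: {side_km:.1f} km")                # cube side: 3.2 km
```

About 3.2 km on a side, as stated -- enormous, but nowhere near "all the atoms in the Earth".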
Originally posted by TJM
Personally, I have great faith in software engineers' ability to build software that maxes out the capabilities of any system available. Today's desktops run programs that were in the realm of multimillion-dollar supercomputers a decade or two ago. If consumer-grade computers can run it, somebody will figure out a way to make climate modeling or nuclear explosion simulation code somehow useful on the desktop in some sort of application (probably games, first). . .
Welcome back by the way. I definitely agree with this first part here, and it makes me think of the possibilities. If some bright software engineers can develop a 'must-have' 64-bit application, it might put Intel into a bit of a pickle. So far, Itanium is their only 64-bit strategy and Itanium is not going to be a desktop chip for a long time. Such an application would be a big boost for the Mac and of course AMD. You mention games; maybe games will be the first popular applications to exploit 64-bit advantages other than memory addressing.
Originally posted by snoopy
Welcome back by the way. I definitely agree with this first part here, and it makes me think of the possibilities. If some bright software engineers can develop a 'must-have' 64-bit application, it might put Intel into a bit of a pickle. So far, Itanium is their only 64-bit strategy and Itanium is not going to be a desktop chip for a long time. Such an application would be a big boost for the Mac and of course AMD. You mention games; maybe games will be the first popular applications to exploit 64-bit advantages other than memory addressing.
Memory addressing is where it's at when it comes to exploiting a 64-bit machine. Pretty much everything else can be done better with either the FPU or SIMD. 64-bit integers just aren't that interesting, especially since 32-bit processors can handle them, just somewhat less efficiently. If there is going to be a 64-bit "killer app", it will be something which takes advantage of the 64-bit address space.
Note that leveraging the 64-bit address space does not require having more than 4 GB of RAM in your machine.
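One way to see that point: a 64-bit process can map a file far larger than 4 GB without having that much RAM. A sketch in Python (the 5 GiB size is arbitrary, and the near-zero disk usage assumes a filesystem with sparse-file support):

```python
import mmap
import os
import tempfile

# A 64-bit process maps a 5 GiB file without 5 GiB of RAM or disk.
SIZE = 5 * 2**30  # needs 33-bit offsets: impossible to map fully on 32-bit

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.seek(SIZE - 1)  # seek far past the 4 GiB mark...
    f.write(b"\0")    # ...one written byte makes the file logically 5 GiB
    path = f.name

with open(path, "r+b") as f, mmap.mmap(f.fileno(), 0) as m:
    print(len(m) == SIZE)  # True: all 5 GiB sit in the address space
    print(m[SIZE - 1])     # 0: touching a page faults in only that page
os.unlink(path)
```

Only the pages actually touched are ever backed by physical memory; the address space itself is free.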
Originally posted by AirSluf
Still an implausible size for a desktop computer.
Not implausible for Future Hardware.
Originally posted by Nevyn
Not implausible for Future Hardware.
True, but this forum isn't about the latest miracle of science which will reach the market in 10+ years... it's about Apple hardware that we may see in the next couple of years. 128-bit addressing is a long way off, probably at least as far off as today is from the dawn of the personal computer.
Originally posted by snoopy
If you mean someday OS X will not run 32-bit applications, that would be a silly thing for Apple to do. The ability for 64-bit PPC processors to run 32-bit applications with zero penalty is a great selling point. If you read Programmer's earlier posting, you see that it is also very easy to keep this capability in the OS. Quote: "The required effort to support 32-bit mode is quite minor, and the reasons to support it are quite strong." So I say it is a permanent feature of the Mac.
No, I meant that five years down the road, the OS will be a full 64 bit OS, and not be backwards compatible for 32 bit machines. Of course the OS would still run 32 bit apps.
The point is that the jump to 64-bit computing will give us enough addressing to handle nearly all quantifiable and perceivable events in a human lifetime. In fact, for some information sets, it will be enough to handle those events for everyone on the entire planet. Someone should really take the time to figure out an adequate example to illustrate this point. (damn I'm lazy) For instance, 64-bit computing gives us enough address space to store biometric data on every human that has ever walked the surface of the planet. (Roughly 10 billion people)
Greater than 64-bit computing? Sure... but only if you're doing something like planetary-scale modeling of weather down to the sub-quantum level... and even then, it may be overkill.
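To make the biometric example concrete, here is roughly how much of a 64-bit address space each person would get, taking the post's rough 10-billion figure at face value:

```python
# Per-person share of a 64-bit address space across ~10 billion humans.
ADDRESS_SPACE = 2 ** 64  # bytes addressable with 64 bits
PEOPLE = 10 * 10**9      # the post's rough figure for humans ever born

bytes_per_person = ADDRESS_SPACE // PEOPLE
print(f"{bytes_per_person / 2**30:.1f} GiB per person")  # 1.7 GiB per person
```

Around 1.7 GiB of biometric data per person -- far more than fingerprints and iris scans would ever need.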
Originally posted by Yevgeny
Depends on what you mean by 128-bit computers. G4s are 128-bit computers in that their Altivec registers are 128 bits wide.
I cannot see the practical reasons for computers with general-purpose 128-bit integer registers. There is no reason why you would need such a memory address. I'm speaking as a programmer who DESPERATELY wishes that 64-bit x86 was as common as 64-bit PPC will soon be. I am not aware of any integer-based problems that would need more than 64 bits.
Either way, the answer is pretty straightforward: if we ever see 128-bit computers, it won't be anytime soon.
That's what we all said years ago about RAM.
Originally posted by CubeDude
That's what we all said years ago about RAM.
I assume you are referencing Mr. Gates' comment about not needing more than 640 KB of RAM. Well, one difference between 655,360 bytes (640K) and 18,446,744,073,709,551,616 bytes (2^64) is that it is very difficult to find problems that need more than 2^64 bytes of memory to be addressed. I program against data that can routinely come in at about 20 GB in size. Some of my company's "high end" users have hundreds of terabytes of data that we will manipulate for them. Even if our users wanted to fit all their data into memory, they would still only use up something like 0.001% of a 64-bit address space. Nobody is even close to needing all 64 bits of address space, let alone a 64-bit integer that acts as an ID value. What problems do you foresee needing such memory? Gates' comment was nonsense the day that he said it -- a really long book series could get near 640K.
The difference between 8 and 16 bits is far less than the difference between 64 and 128 bits.
In fact, the jump to 128-bit from 64-bit would be:
4,294,967,296 times greater than the jump from 32 to 64 bit
281,474,976,710,656 times greater than the jump from 16 to 32 bit
72,057,594,037,927,936 times greater than the jump from 8 to 16 bit
64 bits will be 'good enough' for a very, very long time.
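The three ratios above can be verified directly: each width jump multiplies the address space by 2^(old width), so the jumps themselves can be compared as ratios of powers of two.

```python
# Compare the growth factor of each address-width jump.
def jump(old_bits, new_bits):
    """Factor by which the address space grows."""
    return 2 ** new_bits // 2 ** old_bits

print(jump(64, 128) // jump(32, 64))  # 4294967296         (2^32)
print(jump(64, 128) // jump(16, 32))  # 281474976710656    (2^48)
print(jump(64, 128) // jump(8, 16))   # 72057594037927936  (2^56)
```

All three match the figures quoted above.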
I wonder when the total amount of RAM produced worldwide, throughout history, will be equivalent to what is addressable by 64 bits. Anyone want to whip up these calculations? Have we already passed that milestone?
Originally posted by dfiler
[Edit: this is sort of redundant after the previous post beat me to it...]
I wonder when the total amount of RAM produced worldwide, throughout history, will be equivalent to what is addressable by 64 bits. Anyone want to whip up these calculations? Have we already passed that milestone?
My guess is that no, we haven't. Here's my reasoning:
Assume that all computers made were 32 bit machines.
Assume that all computers made had the max RAM possible (4GB)
How many computers would have to be made to hold 2^64 bytes of RAM?
2^64 = 2^32 * 2^32, so if each machine has 2^32 bytes of RAM, we would have to have made 2^32 of them -- about 4 billion (because 2^32 * 2^32 = 2^64). We haven't made 4 billion computers, and we certainly haven't been putting 4 GB of RAM in each of the ones we have made, so no, we have not made enough RAM in human history to fill up a 64-bit machine.
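The same arithmetic in sketch form:

```python
# If every computer ever built were maxed out at 4 GiB, how many would it
# take to reach 2^64 bytes of total RAM?
MAX_RAM = 2 ** 32                      # 4 GiB, the 32-bit limit per machine
machines_needed = 2 ** 64 // MAX_RAM   # = 2^32
print(f"{machines_needed:,} fully-loaded machines")  # 4,294,967,296 fully-loaded machines
```

Over four billion maxed-out machines, which comfortably supports the "not yet" conclusion.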
Originally posted by Yevgeny
I assume you are referencing Mr. Gates' comment about not needing more than 640 KB of RAM. Well, one difference between 655,360 bytes (640K) and 18,446,744,073,709,551,616 bytes (2^64) is that it is very difficult to find problems that need more than 2^64 bytes of memory to be addressed. I program against data that can routinely come in at about 20 GB in size. Some of my company's "high end" users have hundreds of terabytes of data that we will manipulate for them. Even if our users wanted to fit all their data into memory, they would still only use up something like 0.001% of a 64-bit address space. Nobody is even close to needing all 64 bits of address space, let alone a 64-bit integer that acts as an ID value. What problems do you foresee needing such memory? Gates' comment was nonsense the day that he said it -- a really long book series could get near 640K.
Um, I was being sarcastic.