Comments
Quote:
HDD manufacturers adopted the base 10 measure because that's what most people use ...
Absolutely not. The first company to use this measure was Maxtor, as far as I remember. All the others were using THE REAL AND THE ONLY MEANINGFUL base-2 measure; this is what you see in the Get Info panel. They even tried to sue Maxtor but finally gave up. This was yet another fool's game played by the dirty marketing people.
It was adopted for marketing reasons. It is confusing, especially as capacity grows. OSes multiply by 1024 between units, whereas manufacturers use a metric 1000. You can easily see this in Leopard's Disk Utility, as it states the capacity in GB and in bytes next to each other, showing that base-2 is being used. My 1TB drive is not using 69GB for formatting.
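The arithmetic behind that last point is easy to check; a minimal Python sketch, assuming the advertised 1TB means 10^12 bytes:

    # A "1 TB" drive: the label counts 10^12 bytes, a base-2 OS divides by 2^30.
    advertised_bytes = 10**12                 # decimal terabyte, per the label
    binary_gb = advertised_bytes / 2**30      # what the OS reports as "GB"
    print(f"{binary_gb:.1f} GB")              # 931.3 GB -- the "missing" ~69 GB
    # Same number of bytes either way; only the unit definition differs.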
I know exactly what it means; I've been dealing with it since I took Fortran in high school in 1966.
It's still a worthless argument. It solves nothing, and is meaningless as well.
Personally, I'd rather just use the base 10 meaning of one million: 1,000,000.
Base 2 has no more meaning than does base 10, except that far fewer people understand, or appreciate, it. Techie geek arguments notwithstanding, it's the average consumer's perception that counts. That wasn't as true 20 years ago, but it sure is now.
Whatever, it still is an argument that solves nothing, and thus serves no purpose. Your programs use whatever they need, and your memory is still the same.
I know that that statement gets the geeks up in arms, but it's true nevertheless.
No one uses 100% of their HDD space, and shouldn't attempt to. So a few percent one way or the other there means nothing either.
Quote:
Absolutely not. The first company to use this measure was Maxtor, as far as I remember. All the others were using THE REAL AND THE ONLY MEANINGFUL base-2 measure; this is what you see in the Get Info panel. They even tried to sue Maxtor but finally gave up. This was yet another fool's game played by the dirty marketing people.
That's nonsense. Base 2 is no more valid than Base 6, or 8, or 10. These are just different means of expressing THE EXACT SAME THING!
Get over it.
Quote:
Absolutely not. The first company to use this measure was Maxtor, as far as I remember. All the others were using THE REAL AND THE ONLY MEANINGFUL base-2 measure; this is what you see in the Get Info panel. They even tried to sue Maxtor but finally gave up. This was yet another fool's game played by the dirty marketing people.
I really wish the IEC's binary prefixes were more often used to differentiate binary bytes from the SI standard of decimal bytes.
Kibibyte = KiB
Mebibyte = MiB
Gibibyte = GiB
Tebibyte = TiB
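A minimal Python sketch of how the two labeling schemes could sit side by side (the helper names are made up for illustration):

    # Hypothetical helpers contrasting SI (1000) and IEC (1024) prefixes.
    def si_label(n):
        for unit in ("bytes", "kB", "MB", "GB", "TB"):
            if n < 1000 or unit == "TB":
                return f"{n:.2f} {unit}"
            n /= 1000
    def iec_label(n):
        for unit in ("bytes", "KiB", "MiB", "GiB", "TiB"):
            if n < 1024 or unit == "TiB":
                return f"{n:.2f} {unit}"
            n /= 1024
    print(si_label(10**12), "=", iec_label(10**12))   # 1.00 TB = 931.32 GiB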
Quote:
I really wish the IEC's binary prefixes were more often used to differentiate binary bytes from the SI standard of decimal bytes.
Kibibyte = KiB
Mebibyte = MiB
Gibibyte = GiB
Tebibyte = TiB
When you can get the average computer user to understand all of this, it will matter somewhat. But until then, it serves no purpose outside of tech forums.
The problem is that there are too many ways of counting the same thing.
Quote:
Base 2 has no more meaning than does base 10, except that far fewer people understand, or appreciate, it. Techie geek arguments notwithstanding, it's the average consumer's perception that counts. That wasn't as true 20 years ago, but it sure is now.
Quote:
Whatever, it still is an argument that solves nothing, and thus serves no purpose. Your programs use whatever they need, and your memory is still the same.
It's still the same number of bytes, but the consumer confusion is real. We apparently don't expect consumers to understand the interest rate on their credit cards, their sub-prime ARMs, or the TCO of their subsidized cellphones, so how can we expect the average consumer to understand the difference between the base-10 marketing of their storage in GB and the base-2 format of their OS? People don't understand the differences, and I don't think they should have to. We have ways of making it less complex by using a different term. I think we should use them.
Quote:
When you can get the average computer user to understand all of this, it will matter somewhat. But until then, it serves no purpose outside of tech forums.
The problem is that there are too many ways of counting the same thing.
As I stated above, Apple is moving to the base-10 representation of GB in Disk Utility in Snow Leopard. So my 1TB drive won't show up as 931.3GB anymore. It will in Finder, as I would expect, but in DU it's 1TB (1,002,731,856 Bytes). I don't know what to make of the change.
Quote:
It's still the same number of bytes, but the consumer confusion is real. We apparently don't expect consumers to understand the interest rate on their credit cards, their sub-prime ARMs, or the TCO of their subsidized cellphones, so how can we expect the average consumer to understand the difference between the base-10 marketing of their storage in GB and the base-2 format of their OS? People don't understand the differences, and I don't think they should have to. We have ways of making it less complex by using a different term. I think we should use them.
Do you know just how difficult it is to get people to forget what they've been taught and learned over their lives? It's almost impossible. People know base 10. We should stick to base 10.
It took over 50 years to get the French comfortable with metric back in the 19th century, and that was after they banned the older systems. Despite the fact that metric was approved by Congress as an official measurement system here in the US in 1866, and it was written into law that industry should begin to adopt it, they still haven't, except where they have no choice, because people here would still rather use English measuring systems.
Sometimes we just have to bow to the inevitable. Since metric gives major advantages in many areas, it should be adopted, but this measurement system for memory offers no practical advantages to people. Therefore it won't be adopted.
And that's why arguing about it serves no purpose. I doubt that most people buying computers or HDDs even care.
Quote:
As I stated above, Apple is moving to the base-10 representation of GB in Disk Utility in Snow Leopard. So my 1TB drive won't show up as 931.3GB anymore. It will in Finder, as I would expect, but in DU it's 1TB (1,002,731,856 Bytes). I don't know what to make of the change.
I know that. I've seen it myself. Apple is just doing what I said will be done, moving to base 10. It isn't consistent yet, but eventually it will be.
A very few people are getting all bent out of shape over this, because they have some quirk in their heads that convinces them that they're somehow being cheated.
There will always be a few that submerge themselves in conspiracy theories. We should just ignore them when they do.
The rest have to understand that computing has long left the technically fascinated geeks behind.
Quote:
Do you know just how difficult it is to get people to forget what they've been taught and learned over their lives? It's almost impossible. People know base 10. We should stick to base 10.
How about listing the other standard for measurement, or stating, like Apple does on their site, that 1GB = 1 billion bytes? I'd also like them to not list how many photos, songs, and videos you can store on a drive without disclosing how large each photo, song, and video used for the comparison is.
Quote:
It took over 50 years to get the French comfortable with metric back in the 19th century, and that was after they banned the older systems.
After the issues with French Revolutionary Time at the end of the 18th century, I can understand why the decimal system wasn't accepted right away.
Quote:
Originally Posted by melgross
I know that. I've seen it myself. Apple is just doing what I said will be done, moving to base 10. It isn't consistent yet, but eventually it will be.
But can they do that with Finder and other apps? I don't think Base-2 listings will be going away anytime soon.
Quote:
There will always be a few that submerge themselves in conspiracy theories. We should just ignore them when they do.
There is no conspiracy; it's just good marketing, and it's more relevant now for marketing than it used to be. If I'm techtarded and looking for a large drive, I'm sure I'll grab the 1000GB drive over the 931GB drive. I hope I didn't come across as thinking it was a conspiracy, just that it is confusing for users. I can't tell you how many times I've had to explain it over the years.
Quote:
How about listing the other standard for measurement, or stating, like Apple does on their site, that 1GB = 1 billion bytes? I'd also like them to not list how many photos, songs, and videos you can store on a drive without disclosing how large each photo, song, and video used for the comparison is.
Manufacturers generally do this. There's an * after the number, which leads to the explanation as to how the calculation was done.
But think about this: we've had both ounces and liters on liquid measurements, and the same for solid measurements, for quite some time now. But recently (I forget where I read it, though I think it was in Science), a survey showed that very few consumers even READ the metric equivalents, much less understood them.
Simplicity is a virtue. That's an old statement, and it should tell us something.
For iPods, they always said that the number of songs was computed on the assumption that they were compressed at 128 Kbps, which was the standard at the time. Now they've begun to also give figures for 256 Kbps.
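Those song counts fall straight out of the bitrate; a rough Python sketch, where the 4-minute average song length is an assumption for illustration:

    # Back-of-the-envelope "songs per GB" from a bitrate.
    bitrate_kbps = 128                                   # the old iPod assumption
    song_seconds = 4 * 60                                # assumed average length
    song_bytes = bitrate_kbps * 1000 / 8 * song_seconds  # ~3.84 MB per song
    print(round(10**9 / song_bytes), "songs per (decimal) GB")   # ~260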
Quote:
After the issues with French Revolutionary Time at the end of the 18th century, I can understand why the decimal system wasn't accepted right away.
That's exactly what I was saying.
Quote:
But can they do that with Finder and other apps? I don't think Base-2 listings will be going away anytime soon.
They could do whatever they want to. It's just a couple of lines of code to do the math, and it might take a few microseconds.
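He's right that the math is trivial; a couple of lines of Python show the whole conversion, using a made-up byte count for the example:

    # Reporting decimal GB instead of binary "GB" is a single division.
    total_bytes = 1_000_202_273_280           # hypothetical drive size in bytes
    print(f"{total_bytes / 10**9:.1f} GB")    # 1000.2 GB, the decimal style
    print(f"{total_bytes / 2**30:.1f} GB")    # 931.5 GB, the old binary style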
Quote:
There is no conspiracy; it's just good marketing, and it's more relevant now for marketing than it used to be. If I'm techtarded and looking for a large drive, I'm sure I'll grab the 1000GB drive over the 931GB drive. I hope I didn't come across as thinking it was a conspiracy, just that it is confusing for users. I can't tell you how many times I've had to explain it over the years.
Of course not. It's simpler.
I wasn't referring to you. I was referring to the post made by shadow, which you must have read.
There are 10 types of people: those who understand binary and those who don't.
Very good!
The first company to introduce the decimal multiplier was Western Digital. The purpose was NOT to make it more convenient for the users. The purpose was to fool the customer. It was an intentional lie. I don't remember the disk sizes back then, but when you were comparing the disk drives it could look something like:
Maxtor: 80 MB, $200
Seagate: 80 MB, $205
Western Digital: 82 MB, $200
Which is the best value?
Note that it makes sense to measure RAM and SSDs in base-2 units because that's the way the memory is addressed and the memory modules are produced. It is very likely that RAM will still be measured in base-2 units in the future.
There are tons of non-standard measures everywhere. To begin with, the inch and the mile are non-standard measures. Fahrenheit is not standard. Minutes and hours are standard but not decimal. Horsepower is the most stupid unit ever, but it is more widely used in the car industry than the watt. The electron volt (eV) is widely used in nuclear physics because it is much more convenient for calculations and human perception than the joule. Base 2 was the norm for computer memory before some asshole from Western Digital decided to confuse the customers. Currently, the confusion is still here, and there are tons of people asking "Why is my HD supposed to be NN GB when the OS shows MM GB?"
The fact that Snow Leopard will display disk capacities in decimal units is a good decision (maybe), but the confusion with RAM will remain.
Take a look at the Wikipedia page.
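Taking the poster's premise that Maxtor and Seagate labeled binary megabytes while Western Digital used decimal ones, a short Python sketch makes the trick visible:

    # Normalize each advertised size to binary MB to compare like with like.
    drives = {
        "Maxtor":          (80 * 2**20, 200),   # 80 binary MB, $200
        "Seagate":         (80 * 2**20, 205),   # 80 binary MB, $205
        "Western Digital": (82 * 10**6, 200),   # "82 MB" counted in decimal
    }
    for name, (nbytes, price) in drives.items():
        print(f"{name}: {nbytes / 2**20:.1f} binary MB for ${price}")
    # Western Digital: 78.2 binary MB -- the smallest drive, despite "82 MB".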
Quote:
The first company to introduce the decimal multiplier was Western Digital. The purpose was NOT to make it more convenient for the users. The purpose was to fool the customer. It was an intentional lie. I don't remember the disk sizes back then, but when you were comparing the disk drives it could look something like:
Maxtor: 80 MB, $200
Seagate: 80 MB, $205
Western Digital: 82 MB, $200
Which is the best value?
Note that it makes sense to measure RAM and SSDs in base-2 units because that's the way the memory is addressed and the memory modules are produced. It is very likely that RAM will still be measured in base-2 units in the future.
There are tons of non-standard measures everywhere. To begin with, the inch and the mile are non-standard measures. Fahrenheit is not standard. Minutes and hours are standard but not decimal. Horsepower is the most stupid unit ever, but it is more widely used in the car industry than the watt. The electron volt (eV) is widely used in nuclear physics because it is much more convenient for calculations and human perception than the joule. Base 2 was the norm for computer memory before some asshole from Western Digital decided to confuse the customers. Currently, the confusion is still here, and there are tons of people asking "Why is my HD supposed to be NN GB when the OS shows MM GB?"
The fact that Snow Leopard will display disk capacities in decimal units is a good decision (maybe), but the confusion with RAM will remain.
Standard is what's used. There are lots of standards.
But not one in a million cares about base 2 as a measure of memory, either volatile or not. If it's measured that way in the labs, that's fine. But a million is a million, and to most people, that's 1,000,000. That's all they understand. There's no reason to confuse them just because it makes you feel better.
Quote:
But a million is a million, and to most people, that's 1,000,000. That's all they understand. There's no reason to confuse them just because it makes you feel better.
What is confusing is when one million bytes is listed as 0.95MB, so a million is no longer being represented as a million.
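That 0.95 figure is just the binary divisor at work, as one line of Python confirms:

    # One million bytes expressed in binary MB (i.e., MiB): 10^6 / 2^20.
    print(f"{10**6 / 2**20:.2f} MB")   # 0.95 -- a million shown as less than 1 MB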
Guys... there's a pretty good indicator of how they measure their storage... What's measured in base 10 usually divides evenly by 10. No big yen to fuss over it, but one can find out what Apple ought to be ashamed of.
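A Python sketch of that indicator, with made-up byte counts as examples:

    # Round decimal totals suggest base-10 labeling; round binary totals, base-2.
    def looks_decimal(nbytes):
        return nbytes % 10**9 == 0     # a whole number of decimal GB
    def looks_binary(nbytes):
        return nbytes % 2**30 == 0     # a whole number of binary GB (GiB)
    print(looks_decimal(500 * 10**9), looks_binary(500 * 10**9))   # True False
    print(looks_decimal(512 * 2**30), looks_binary(512 * 2**30))   # False True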
Quote:
Guys... there's a pretty good indicator of how they measure their storage... What's measured in base 10 usually divides evenly by 10. No big yen to fuss over it, but one can find out what Apple ought to be ashamed of.
Yes, we know, that doesn't mean that people aren't confused by the variances in kilo/mega/giga/etc. for the same device. At least with Apple that will no longer be much of an issue. We'll see if other OSes follow suit.
Quote:
Yes, we know, that doesn't mean that people aren't confused by the variances in kilo/mega/giga/etc. for the same device. At least with Apple that will no longer be much of an issue. We'll see if other OSes follow suit.
Good.
Quote:
Good.
It is good, but I bet people will think Apple is being sleazy with the use of the SI standard over the JEDEC standard.