The author decidedly has expert syndrome -- they deny both the history and rationale behind memory-unit nomenclature. Memory measurements evolved along the binary organizational patterns used in computing architectures. While a proud French pedant might applaud the decimal normalization of memory units discussed here (it aligns more closely with the metric system, and it may have benefits for laypeople), it fails to account for how memory is partitioned in historic and modern computing.
f33d5173
A mile is exactly 1000 paces, or 4000 feet. You may disagree, but consider: the word mile comes from the Latin for "one thousand". Therefore a mile must be 1000 of something, namely paces. I hope you find this argument convincing.
kmm
And a megabyte is, depending on the context, precisely 1000x1000=1,000,000 or 1024x1024=1,048,576 bytes*, except when you're talking about the classic 3.5 inch floppy disks, where "1.44 MB" stands for 1440x1024 bytes, or about 1.47 true MB or 1.41 MiB.
* Yeah, I read the article. Regardless of the IEC's noble attempt, in all my years of working with people and computers I've never heard anyone actually pronounce MiB (or write it out in full) as "mebibyte".
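The floppy arithmetic above is easy to check; a quick sketch using the numbers as given:

```python
KB = 1000    # SI kilobyte
KiB = 1024   # binary kilobyte (kibibyte)

floppy = 1440 * KiB                 # the "1.44 MB" 3.5-inch floppy
print(floppy)                       # 1474560 bytes
print(round(floppy / KB**2, 2))     # 1.47 decimal MB
print(round(floppy / KiB**2, 2))    # 1.41 MiB
```

So "1.44 MB" uses neither definition consistently: it is 1440 binary kilobytes relabeled with a decimal-looking "MB".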
pjdesno
I had a computer architecture prof (a reasonably accomplished one, too) who thought that all CS units should be binary, e.g. Gigabit Ethernet should be 931 Mbit/s, not 1000 Mbit/s.
I disagreed strongly - I think X-per-second should be decimal, to correspond to Hertz. But for quantity, binary seems better. (modern CS papers tend to use MiB, GiB etc. as abbreviations for the binary units)
Fun fact - for a long time consumer SSDs had roughly 7.37% over-provisioning, because that's what you get when you put X GB (binary) of raw flash into a box, and advertise it as X GB (decimal) of usable storage. (probably a bit less, as a few blocks of the X binary GB of flash would probably be DOA) With TLC, QLC, and SLC-mode caching in modern drives the numbers aren't as simple anymore, though.
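The 7.37% figure falls straight out of the two definitions of a gigabyte; a quick check:

```python
raw = 2**30            # 1 binary GB of raw flash, in bytes
advertised = 10**9     # 1 decimal GB of advertised capacity, in bytes
spare = (raw - advertised) / advertised
print(f"{spare:.2%}")  # 7.37%
```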
mrb
Whenever this discussion comes up I like to point out that even in the computer industry, prefixes like kilo/mega/etc. more often mean a power of 10 than a power of 2:
I gave some examples in my post https://blog.zorinaq.com/decimal-prefixes-are-more-common-th...
The mistake was using the "Kibi" prefix. "Kibibyte" just sounds a bit silly when said out loud.
ineedasername
> Why does 1000 still make more sense?
The author doesn’t actually answer their question, unless I missed something?
They go on to make a few more observations, and finally say only that the current differing definitions are sometimes confusing to non-experts.
I don’t see much of an argument here for changing anything. Some non-experts experience minor confusion about two things that are different; did I miss something bigger in this?
zetanor
To clear up any confusion, let's compromise at a 1012-byte kilobyte.
>Why do we often say 1 kilobyte = 1024 bytes?
Because Windows, and only Windows, shows it this way. It is official and documented: https://devblogs.microsoft.com/oldnewthing/20090611-00/?p=17...
> Explorer is just following existing practice. Everybody (to within experimental error) refers to 1024 bytes as a kilobyte, not a kibibyte. If Explorer were to switch to the term kibibyte, it would merely be showing users information in a form they cannot understand, and for what purpose? So you can feel superior because you know what that term means and other people don’t.
kstrauser
I'm sticking with power-of-2 sizes. Invent a new word for decimal, metric units where appropriate. I proposed[0] "kitribytes", "metribytes", "gitribytes", etc. Just because "kilo" has a meaning in one context doesn't mean we're stuck with it in others. It's not as though the ancient Greeks originally meant "kilo" to mean "exactly 1,000". "Giga" just meant "giant". "Tera" is just "monster". SI doesn't have sole ownership for words meaning "much bigger than we can possibly count at a glance".
Donald Knuth himself said[1]:
> The members of those committees deserve credit for raising an important issue, but when I heard their proposal it seemed dead on arrival --- who would voluntarily want to use MiB for a maybe-byte?! So I came up with the suggestion above, and mentioned it on page 94 of my Introduction to MMIX. Now to my astonishment, I learn that the committee proposals have actually become an international standard. Still, I am extremely reluctant to adopt such funny-sounding terms; Jeffrey Harrow says "we're going to have to learn to love (and pronounce)" the new coinages, but he seems to assume that standards are automatically adopted just because they are there.
If Gordon Bell and Gene Amdahl used binary sizes -- and they did -- and Knuth thinks the new terms from the pre-existing units sound funny -- and they do -- then I feel like I'm in good company on this one.
0: https://honeypot.net/2017/06/11/introducing-metric-quantity....
1: https://www-cs-faculty.stanford.edu/~knuth/news99.html
Ah, if only I had a dollar for every time I've had to point someone to a tool like the following when trying to explain the difference between how much "bandwidth" their server has per month (an IEC unit) vs how fast the server connection is (an SI unit): https://null.53bits.co.uk/uploads/programming/javascript/dat...
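The kind of conversion such a tool performs can be sketched as follows (the 100 Mbit/s link and 30-day month here are my own example numbers, not from the linked page):

```python
link_bit_s = 100 * 10**6             # a "100 Mbit/s" link: SI prefix, bits per second
seconds = 30 * 24 * 3600             # a 30-day month
total_bytes = link_bit_s // 8 * seconds

print(total_bytes / 10**12)          # 32.4  -> decimal TB transferred at full rate
print(round(total_bytes / 2**40, 1)) # 29.5  -> TiB, what a binary-units quota would show
```

The ~9% gap between those two numbers is exactly the SI-vs-IEC mismatch the conversation keeps running into.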
nerdsniper
Final edit:
This ambiguity is documented at least back to 1984, by IBM, the pre-eminent computer company of the time.
In 1972 IBM started selling the IBM 3333 magnetic disk drive. This product catalog [0] from 1979 shows them marketing the corresponding disks as "100 million bytes" or "200 million bytes" (3336 mdl 1 and 3336 mdl 11, respectively). By 1984, those same disks were marketed in the "IBM Input/Output Device Summary"[1] (which was intended for a customer audience) as "100MB" and "200MB".
0: (PDF page 281) "IBM 3330 DISK STORAGE" http://electronicsandbooks.com/edt/manual/Hardware/I/IBM%20w...
1: (PDF page 38, labeled page 2-7, Fig 2-4) http://electronicsandbooks.com/edt/manual/Hardware/I/IBM%20w...
Also, hats off to http://electronicsandbooks.com/ for keeping such incredible records available for the internet to browse.
-------
Edit: The below is wrong. Older experience has corrected me - there has always been ambiguity (perhaps bifurcated between CPU/OS and storage domains). "And that with such great confidence!", indeed.
-------
The article presents wishful thinking. The wish is for "kilobyte" to have one meaning. For the majority of its existence, it had only one meaning - 1024 bytes. Now it has an ambiguous meaning. People wish for an unambiguous term for 1000 bytes, but that word does not exist. People also might wish that others used kibibyte any time they reference 1024 bytes, but that is also wishful thinking.
The author's wishful thinking is falsely presented as fact.
I think kilobyte was the wrong word to ever use for 1024 bytes, and I'd love to go back in time to tell computer scientists that they needed to invent a new prefix to mean "1,024" / "2^10" of something, which kilo- never meant before kilobit / kilobyte were coined. Kibi- is fine; the phonetics sound slightly silly to native English speakers, but the 'bi' indicates binary and I think that's reasonable.
I'm just not going to fool myself with wishful thinking. If, in arrogance or self-righteousness, one simply assumes that every "kilobyte" they see means 1,000 bytes, they will fail many, many times. We will always have to take care to verify whether "kilobyte" means 1,000 or 1,024 bytes before implementing something that relies on it for correctness.
recursive
I'm surprised they didn't mention kibibyte. (Edit: they did) There are plenty of applications where power-of-2 alignment is useful or necessary. Not addressing that and just chastising everyone for using units wrong isn't particularly helpful. I guess we can all just switch to kibibytes, except the HDD manufacturers.
encomiast
I've tried this approach with Lowe's when I buy 2x4s. About as effective.
Out_of_Characte
The author doesn't go far enough into the problems with trying to convert information theory to SI Units.
SI units attempt to fix standard measurements to perceived constants in nature. A meter (distance) is the distance light travels in a vacuum, back and forth, within a certain number of oscillations of a cesium atom (time). This doesn't mean we tweak the meter to conform to observational results, even though we'd all be happier if light really traveled 300,000 km/s instead of ~299,792 km/s.
Then there's the problem of not mixing different measurement units. SI was designed to conform all measurements to the same base-10 exponents (cm, m, km versus feet, inches and yards). But the author's attempt to resolve this matter doesn't even conform to standardised SI units as we would expect it to.
What is a byte? Well, 8 bits, sometimes.
What is a kilobit? 1000 bits
What is a kilobyte? 1000 bytes, or 1024 bytes.
Now we've already mixed units based on what a bit or a byte even is, with the x8 multiplier stacked on top of the exponent of 1000 or 1024.
And if you think, hey, at least the bit is the least divisible unit of information, that's not even correct. If there should* be a reformalisation of information units, you would agree that the number of "0"s is the least divisible unit of information. A kilo of zeros would be 1000. A 'byte' would be defined as containing up to 256 zeros. A megazero would contain up to a million zeros.
It wouldn't make any intuitive sense for anyone to count 0s, which would automatically convert your information back to base 10. But it does show that the most sensible unit of information is already what we had before; that is, you don't mix bytes (powers of 2) with SI-defined units of 1000.
lr1970
<joke> How to tell a software engineer from a real one? A real engineer thinks that 1 kilobyte is 1000 bytes while software engineer believes that there are 1024 meters in a kilometer :-) </joke>
pif
For all the people commenting as if the meaning of "kilo" was open to discussion... you are all from the United States of America, and you call your country "America", right?
jasperry
I agree in principle, but does anyone else feel super awkward saying "mebibyte" and "gibibyte"?
Taniwha
Oh sure, and next you'll say a byte is 10 bits ....
sebtron
A metric kilobyte is 1000 bytes. An imperial kilobyte, on the other hand, is 5280 bytes.
arjie
It is good that the old ways have not been forgotten. We used to argue about tabs vs. spaces, GPL vs. BSD, Linux vs. BSD, FreeBSD vs. NetBSD, BSD 2 clause vs BSD 3 clause. It's important to complain about things pointlessly. Builds character.
Anyway, here's my contribution to help make everything worse. I think we should use Kylobyte, etc. when we don't care whether it's 1000 or 1024. KyB. See! Works great.
none_to_remain
I like what the GNU coreutils seem to have done. They use real, 1024-byte kilobytes by default, but print only the abbreviation of the prefix, so it's just 10K or 200M and people can pretend it stands for some other silly word if they want.
You can use `--si` for fake, 1000-byte kilobytes. Trying it, it seems weird that these are reported with a lowercase 'k' while 'M' and so on remain uppercase.
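A rough Python sketch of the two conventions (an illustration only, not coreutils' exact rounding logic):

```python
def human(n, si=False):
    """Format a byte count the way coreutils roughly does.

    Default mimics the binary behavior (powers of 1024, uppercase 'K');
    si=True mimics `--si` (powers of 1000, lowercase 'k' but uppercase 'M'+).
    """
    base = 1000 if si else 1024
    prefixes = ["T", "G", "M", "k" if si else "K"]
    for power, p in zip((4, 3, 2, 1), prefixes):
        if n >= base**power:
            return f"{n / base**power:.1f}{p}"
    return str(n)

print(human(10 * 1024))           # 10.0K
print(human(10 * 1024, si=True))  # 10.2k
print(human(200 * 2**20))         # 200.0M
```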
NooneAtAll3
I automatically assume that people who use KB=1000B want to sell me something (and provide less than promised), so they should be aggressively ignored or removed from the vicinity.
KB is 1024 bytes, and don't you dare try stealing those 24 bytes from me
xboxnolifes
ITT: People who will vehemently talk about how everyone should convert to SI metric, except when it pertains to their personal favorite unit.
TZubiri
It's not clear whether you are asking a question, proposing a new standard, or affirming an existing convention.
jachee
The entire reason "storage vendors prefer" 1000-based kilobytes is so that they can misrepresent and over-market their storage capacities, pocketing those 24 bytes per KB of expectation-vs-reality profit.
It's the same reason—for pure marketing purposes—that screens are measured diagonally.
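Worth noting that the per-kilobyte gap compounds at every prefix level, which is why it matters much more for today's large drives; a quick illustration:

```python
# The 2.4% gap per "kilo" compounds with each prefix level.
for name, power in [("KB", 1), ("MB", 2), ("GB", 3), ("TB", 4)]:
    gap = 1024**power / 1000**power - 1
    print(f"{name}: {gap:.1%}")
# KB: 2.4%, MB: 4.9%, GB: 7.4%, TB: 10.0%
```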
layer8
Since 1000 is 3e8, I’ll argue that it should be 300000000 bytes.
quotemstr
It's too late. Powers-of-two won. I'm the sort of person who uses "whom" in English, but even I acknowledge that using "KB" to mean 1,000, not 1,024, can only breed confusion. The purpose of language is to communicate. I'm all for pedantry when it's compatible with clarity, but we can't reconcile the two goals here.
nayuki
> 1 kilobyte is precisely 1000 bytes
Agreed. For the naysayers out there, consider these problems:
* You have 1 "MB" of RAM on a 1 MHz system bus which can transfer 1 byte per clock cycle. How many seconds does it take to read the entire memory?
* You have 128 "GB" of RAM and you have an empty 128 GB SSD. Can you successfully hibernate the computer system by storing all of RAM on the SSD?
* My camera shoots 6000×4000 pixels = exactly 24 megapixels. If you assume RGB24 color (3 bytes per pixel), how many MB of RAM or disk space does it take to store one raw bitmap image matrix without headers?
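A quick sketch verifying all three examples; each works out cleanly only under the SI (power-of-10) reading:

```python
# 1: reading 1 "MB" of RAM over a 1 MHz, 1-byte-per-cycle bus
assert 10**6 / 10**6 == 1.0           # decimal MB: exactly 1 second
assert 2**20 / 10**6 == 1.048576      # binary "MB": no longer a round number

# 2: hibernating 128 binary-GiB of RAM to a 128 decimal-GB SSD
assert 128 * 2**30 > 128 * 10**9      # the RAM image does not fit

# 3: one raw RGB24 frame from a 24-megapixel (6000x4000) sensor
assert 6000 * 4000 * 3 == 72 * 10**6  # exactly 72 decimal MB
```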
The SI definitions are correct: kilo- always means a thousand, mega- always means a million, et cetera. The computer industry abused these definitions because 1000 is close to 1024, creating endless confusion. It is an idiotic act of self-harm when one "megahertz" of clock speed is not the same mega- as one "megabyte" of RAM. The IEC 60027 prefixes are correct: there is no ambiguity when kibi- (Ki) is defined as 1024, and it can coexist beside kilo- meaning 1000.
The whole point of the metric system is to create universal units whose meanings don't change depending on context. Having kilo- be overloaded (like method overloading) to mean 1000 and 1024 violates this principle.
If you want to wade in the bad old world of context-dependent units, look no further than traditional measures. International mile or nautical mile? Pound avoirdupois or Troy pound? Pound-force or pound-mass? US gallon or UK gallon? US shoe size for children, women, or men? Short ton or long ton? Did you know that just a few centuries ago, every town had a different definition of a foot and pound, making trade needlessly complicated and inviting open scams and frauds?
talles
I refuse to say "kibibyte" out loud
stalfosknight
No, it's not. A kilobyte is 1,024 bytes.
jwlake
I remember when they invented kibibytes and mebibytes, shaking my head and thinking they had forever destroyed the meaning of words and that things would be off by 2% forever. And it has been.
self_awareness
I propose we use footbyte, milebyte, inchbyte.
astrobe_
... And a hacker is precisely a cyber-criminal.
zephen
Nope.
It would be nice to have a different standard for decimal vs. binary kilobytes.
But if Don Knuth thinks that the "international standard" naming for binary kilobytes is dead on arrival, who am I to argue?
https://www-cs-faculty.stanford.edu/~knuth/news99.html
Just to show that disinformation exists in every field.
jijijijij
Metric prefixing should only be used with the unit bit. There is no confusion there. I mean, if you would equate a bit with a certain voltage threshold, you could even argue about fractional bits.
Approximating metric prefixing with kibi, mebi, gibi... is confusing because it doesn't make sense semantically. There is nothing base-10-ish about it.
I propose some naming based on shift distance, derived from the Latin iterativum. https://en.wikipedia.org/wiki/Latin_numerals#Adverbial_numer...
* 2^10, the kibibyte, is a deci (shifted) byte, or just a 'deci'
* 2^20, the mebibyte, is a vici (shifted) byte, or a 'vici'
* 2^30, the gibibyte, is a trici (shifted) byte, or a 'trici'
I mean, we really only need to think in bytes for memory addressing, right? And the base wouldn't matter much if we were talking exabytes, would it?
mc32
One thing that annoys me is:
Why doesn't kilobyte continue to mean 1024, with "kilodebyte" introduced to mean 1000? Byte, to me, implies a binary number system, and if you want to introduce new nomenclature to reduce confusion, give the new one a new name and let the older or more prevalent one in its domain keep the old one…