views:

895

answers:

9

I've seen three ways of doing conversion from bytes to megabytes:

  1. megabytes=bytes/1000000
  2. megabytes=bytes/1024/1024
  3. megabytes=bytes/1024/1000

Ok, I think #3 is totally wrong but I have seen it. I think #2 is right, but I am looking for some respected authority (like W3C, ISO, NIST, etc) to clarify which megabyte is a true megabyte. Can anyone cite a source that explicitly explains how this calculation is done?

Bonus question: if #2 is a megabyte what are #1 and #3 called?

BTW: Hard drive manufacturers don't count as authorities on this one!
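For concreteness, the three conversions from the question can be compared directly; this is just a sketch with an arbitrary example value:

```python
bytes_count = 1_048_576  # exactly 1024 * 1024 bytes, for illustration

mb_decimal = bytes_count / 1_000_000    # option 1: SI megabyte (10^6)
mb_binary = bytes_count / 1024 / 1024   # option 2: binary megabyte (2^20)
mb_mixed = bytes_count / 1024 / 1000    # option 3: mixed base

print(mb_decimal)  # 1.048576
print(mb_binary)   # 1.0
print(mb_mixed)    # 1.024
```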

+31  A: 

Traditionally, by megabyte we mean your second option: 1 megabyte = 2^20 bytes. But that is not strictly correct, because "mega" means 1,000,000. There is a newer standard name for 2^20 bytes, the mebibyte (http://en.wikipedia.org/wiki/Mebibyte), and it is gaining popularity.

Andrey
What? "We mean 2nd"? That does not make any sort of sense.
Matt Ball
+1, just want to add that "mebi" means "mega binary" :)
AraK
@Bears: Andrey is referring to #2 in the OP's terms - the 1024^2 bytes version.
Jefromi
Edited to clarify.
Cory Petosky
thank you for editing!
Andrey
I agree with all this, except the implication that it is gathering a lot of popularity. The mindshare of 1024=K and friends is effectively invulnerable. When you get people to start saying "cracker" instead of "hacker", get back to me.
T.E.D.
Not going to downvote for the heretical "mebibyte" mention, but it's taking a lot of effort.
aehiilrs
@T.E.D.: actually, the use of "hacker" to mean "cracker" is a perfect example of popular culture overriding the usage that geeks would prefer.
Michael Petrotta
It was my personal feeling that mebibyte is gaining popularity; maybe I am wrong. I saw sizes shown in MiB and KiB on some rather popular download sites.
Andrey
Well, I agree with you, and I also use base-10 mega and base-2 kibi and mebi. That is the IEC standard for metric use: it is final and there is no use kicking against it.
Zan Lynx
+1  A: 

How about IEC?

John at CashCommons
+9  A: 

There's an IEC standard that distinguishes the terms: Mebibyte = 1024^2 bytes, but Megabyte = 1000^2 bytes (to stay compatible with SI units like kilograms, where k/M/... mean 1000/1,000,000). In practice, most people in IT prefer Megabyte = 1024^2, while hard disk manufacturers prefer Megabyte = 1000^2 (because it makes their disks sound bigger than they are).

As a matter of fact, most people are confused by the IEC standard (multiplier 1000) versus the traditional meaning (multiplier 1024). In general you shouldn't make assumptions about what people mean. For example, 128 kbit/s for MP3s usually means 128,000 bits, because the multiplier 1000 is mostly used with the unit bits. But then people often call 2048 kbit/s equal to 2 Mbit/s - confusing, eh?

So as a general rule, don't trust bit/byte units at all ;)

AndiDog
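The ambiguity this answer describes can be made explicit by formatting the same byte count under both conventions. The function names below are made up for illustration:

```python
def format_si(n_bytes):
    """Decimal (SI) convention: 1 MB = 1000^2 bytes."""
    return f"{n_bytes / 1000**2:.2f} MB"

def format_iec(n_bytes):
    """Binary (IEC) convention: 1 MiB = 1024^2 bytes."""
    return f"{n_bytes / 1024**2:.2f} MiB"

size = 2_000_000
print(format_si(size))   # 2.00 MB
print(format_iec(size))  # 1.91 MiB
```

The same file thus reports two different "megabyte" sizes depending on which multiplier the tool uses.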
+3  A: 

Here is what the standard (SI) says:

http://physics.nist.gov/Pubs/SP330/sp330.pdf#page=34

AVB
+1  A: 

Megabyte means 2^20 bytes. I know that technically that doesn't mesh with the SI units, and that some folks have come up with a new terminology to mean 2^20. None of that matters. Efforts to change the language to "clarify" things are doomed to failure.

Hard-drive manufacturers use it to mean 1,000,000 bytes, because that's what it means in SI so they figure technically they aren't lying (while actually they are). That falls under lies, damn lies, and marketing.

T.E.D.
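The gap this answer complains about is easy to quantify; as a sketch, here is what an advertised "500 GB" drive (decimal gigabytes) looks like when an OS reports it in binary units:

```python
# Marketed capacity: 500 decimal gigabytes.
advertised_bytes = 500 * 10**9

# An OS reporting in binary units divides by 2^30 per "GB" shown.
reported = advertised_bytes / 2**30
print(f"{reported:.2f}")  # about 465.66, i.e. ~7% "missing"
```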
+2  A: 

Use the computation your users will most likely expect. Do your users care to know how many actual bytes are on a disk or in memory or whatever, or do they only care about usable space? The answer to that question will tell you which calculation makes the most sense.

This isn't a precision question as much as it is a usability question. Provide the calculation that is most useful to your users.

Bryan Oakley
+1  A: 

The answer is that #1 is technically correct based on the real meaning of the mega prefix; however (and in life there is always a however), the math doesn't come out so neatly in base 2, which is how computers count, so #2 is what people actually use.

Grant Johnson
+4  A: 

BTW: Hard drive manufacturers don't count as authorities on this one!

Oh, yes they do (and the definition they assume from the S.I. is the correct one). On a related issue, see this post on CodingHorror.

Federico Ramponi
Sorry downvoter, but their definition of Mega *IS* the correct one, no matter what programmers think. I can live with M = 2^20, fine, but that is just for historical reasons.
Federico Ramponi
+1 for the moxie of standing with the HD companies and SI standards enthusiasts against everyone else!
Jeffrey L Whitledge
Thanks Jeffrey :), but it's not just the HD companies: Your CPU does 2600 MHz; Your math processor does XY Mflops; Ethernet does 100 Mbit/s; Your camera does 5 Mpixel; Blu-ray's 1x speed is 36 Mbit/s... All these "M" stand for 10^6, not 2^20.
Federico Ramponi
I also gave you an upvote, because I agree with you. :-)
Jeffrey L Whitledge
-1 I'm not sure I follow this answer. Which hard drive manufacturer *made* a definition? I don't think CodingHorror produces hard drives? I'm looking for an authority and a hard drive manufacturer would have too much bias to be objective.
User1
The "authority" supporting M=10^6 is the International Bureau of Weights and Measures. As far as I know, there is no "authority", besides custom, supporting M=2^20.
Federico Ramponi
http://stackoverflow.com/questions/234075/what-is-your-best-programmer-joke/237814#237814
Federico Ramponi
A: 

If you have something that technically needs to be a power of two, for example the size of RAM modules, then it is 2^20.

For hard disks this is not the case, so marketing took over and used 10^6 because it makes their disks look bigger.

starblue