128 MByte vs. 128 GByte
I recently saw a comparison of microSD cards in 2005 vs. 2014: in 2005 the largest microSD cards were 128 MByte, and in 2014, 128 GByte.
My question: I'm not 100% sure. Is 128 GByte 1000× bigger than 128 MByte or 1024× bigger?
Kilobyte, megabyte and gigabyte mean different things depending on whether the international standard that one uses is based on powers of 2 (binary) or of 10 (decimal).
There are three standards involved:
International System of Units (SI)
The modern form of the metric system and the world's most widely used system of measurement, used in both everyday commerce and science.
JEDEC
The specifications for semiconductor memory circuits and similar storage devices promulgated by the Joint Electron Device Engineering Council (JEDEC) Solid State Technology Association, a semiconductor trade and engineering standardization organization.
International Electrotechnical Commission (IEC)
International standards organization that prepares and publishes International Standards for all electrical, electronic and related technologies.
Depending on which industry you are in, and whether you are using Microsoft products, the definitions may vary. For example, gigabyte "mostly" stands for 10^9 bytes (GB). Many computer people use this term for 1024^3 bytes, while others would call that quantity a gibibyte (GiB), and still others would write GiB and call it a gigabyte.
The confusion is even greater for kilobyte, which may stand for both 1000 and 1024!
Some would say that a megabyte is 1000^2 bytes, and that 1024^2 bytes should be called a mebibyte; others would disagree.
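To see how much the two conventions actually differ, here is a quick Python sketch (any calculator would do) comparing the decimal and binary value of each prefix:

```python
# Decimal (SI) vs. binary (IEC) value of each prefix, and how far apart they are
for name, power in [("kilo / kibi", 1), ("mega / mebi", 2), ("giga / gibi", 3)]:
    decimal = 1000 ** power
    binary = 1024 ** power
    drift = (binary - decimal) / decimal * 100
    print(f"{name}: {decimal:,} vs {binary:,} bytes ({drift:.1f}% apart)")
```

The gap grows with each prefix: about 2.4% for kilo, 4.9% for mega, and 7.4% for giga, which is why the ambiguity matters more for large capacities.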
The Wikipedia article Gigabyte describes how these terms were introduced into the international standards:
In 1998 the International Electrotechnical Commission (IEC) published standards for binary prefixes, requiring the use of gigabyte to strictly denote 1000^3 bytes and gibibyte to denote 1024^3 bytes. By the end of 2007, the IEC standard had been adopted by the IEEE, EU, and NIST, and in 2009 they were incorporated in the International System of Quantities.
In everyday life, programmers usually treat megabyte and gigabyte as binary (base 2), which is also what Microsoft Windows does. Disk makers and most companies other than Microsoft usually use decimal (base 10). This is why Windows reports the capacity of a new disk as smaller than what is written on the box.
Conclusion: A gigabyte is both 1000 times and 1024 times larger than a megabyte. It depends on which international standard you choose to use at the moment. Strictly speaking, the notation that makes the units clearer is:
GB = 1000 x MB
GiB = 1024 x MiB
(but not everyone would agree.)
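Expressed as a tiny Python sketch, using nothing beyond the definitions above, the two relationships check out:

```python
MB, MiB = 1000 ** 2, 1024 ** 2   # decimal megabyte vs. binary mebibyte
GB, GiB = 1000 ** 3, 1024 ** 3   # decimal gigabyte vs. binary gibibyte

assert GB == 1000 * MB    # GB  = 1000 x MB
assert GiB == 1024 * MiB  # GiB = 1024 x MiB
```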
References:
Wikipedia: Binary prefix
International System of Units (SI) - Prefixes for binary multiples
units(7) - Linux manual page
Western Digital Settles Capacity Suit (this confusion even caused a lawsuit!)
SD card specs have been governed by the SD Association (SDA) since August 1999, which means SDA standards were in place for both time periods in your question.
SDA capacity standards dictate what file system to use when determining capacity, speed, class, etc. (among other things, such as physical size specs).
Assuming we're talking about microSD cards under the SDHC standard, these are formatted with the FAT32 file system (side note: the SDHC spec caps cards in this class at 32 GB). File-system sizes are computed in base 2 and would properly be expressed in MiB and GiB, not MB and GB.
However, the SDA's own documentation uses GB (decimal) rather than GiB (binary), which indicates that the marketed capacities are determined in decimal rather than binary. The difference between decimal and binary works out as follows:
1 MB = 1000^2 bytes
1 GB = 1000^3 bytes
128 MB = 128 x 1000^2 bytes and
128 GB = 128 x 1000^3 bytes
You can see that 128 x 1000^2 x 1000 = 128 x 1000^3.
128 GB is 1000 times larger than 128 MB.
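A minimal Python sketch of that arithmetic, assuming the decimal definitions above:

```python
card_2005 = 128 * 1000 ** 2   # 128 MB in bytes, decimal as in the SDA documentation
card_2014 = 128 * 1000 ** 3   # 128 GB in bytes, decimal

print(card_2014 // card_2005)  # -> 1000
```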
It's likely the SDA adopted decimal capacity standards based on average consumer understanding.
It is 1000 times bigger. For verification you can use a unit-based calculator like Frink to perform the calculation.
Google, however, disagrees and returns 1024.
Since there is a disagreement between those two sources, we can drop down to the math.
According to Google, 1 GByte is 1073741824 bytes and 1 megabyte is 1048576 bytes, which is why it replies with 1024.
Frink takes a different approach, using 1000000000 and 1000000 bytes respectively.
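Dividing each calculator's byte counts reproduces both answers; a short Python sketch using the figures quoted above:

```python
# Google's binary figures: 1 GB = 1024^3 bytes, 1 MB = 1024^2 bytes
print(1073741824 // 1048576)   # -> 1024

# Frink's decimal figures: 1 GB = 1000^3 bytes, 1 MB = 1000^2 bytes
print(1000000000 // 1000000)   # -> 1000
```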
For a discussion of the history of the split between 1000 (10^3) and 1024 (2^10), see Wikipedia, which states:
In 1998 the International Electrotechnical Commission (IEC) enacted standards for binary prefixes, specifying the use of kilobyte to strictly denote 1000 bytes and kibibyte to denote 1024 bytes. By 2007, the IEC standard had been adopted by the IEEE, EU, and NIST and is now part of the International System of Quantities. Nevertheless, the term kilobyte continues to be widely used with both meanings: 1000 bytes and 1024 bytes.
At the start of this answer I said it is 1000 times bigger. I said that because, from a practical standpoint, it is the more conservative of the two answers and less likely to turn out wrong. For instance, if you can store X files on the smaller microSD card, then under every possible interpretation you should safely be able to store 1000 times X files on the larger microSD card.