Why will Ubuntu no longer measure file sizes in units of bytes, megabytes, gigabytes, etc.?

Solution 1:

The short answer is yes, the prefixes change. But it doesn't really make much of a difference.

Reasoning

There has always been confusion because the decimal prefixes KB, MB and GB were used for binary quantities: KB meant 1024 bytes, not the 1000 bytes you might expect. And of course many people throughout the world use the actual decimal prefixes in their daily lives under the metric system.

Network engineers and long-time computer users of course are trained to understand the difference, but the ongoing confusion meant applications were inconsistent in their usage; one application might use MB to mean 1,000,000 bytes (using the decimal prefix), while another might mean 1,048,576 bytes (using the binary interpretation).
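
To make the ambiguity concrete, here is a small sketch (my own illustration, not taken from any particular application) of how the same file can be reported with two different "MB" figures depending on which interpretation a program uses:

    # Illustration: the same byte count labelled "MB" under the two
    # conflicting interpretations that applications mixed freely.
    size_bytes = 5_242_880  # exactly 5 * 1024 * 1024 bytes

    decimal_mb = size_bytes / 1000**2  # decimal prefix: 1 MB = 1,000,000 bytes
    binary_mb = size_bytes / 1024**2   # binary usage:   1 "MB" = 1,048,576 bytes

    print(f"{decimal_mb:.2f} MB")  # 5.24 MB
    print(f"{binary_mb:.2f} MB")   # 5.00 MB

Two programs showing "5.24 MB" and "5.00 MB" for the same file are both "right" under their own convention, which is precisely the problem.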

This led to Ubuntu eventually adopting a new units policy.

Impact

The impact is really just a display issue. File sizes and network bandwidth will be displayed using the decimal prefixes (kB, MB, GB, TB), so a 5 kB file will actually be 5000 bytes. This is actually in line with what many (most?) people expect.

Memory usage and some low-level utilities will display sizes using the binary prefixes (KiB, MiB, GiB, TiB). This may cause some initial confusion but is actually better than the status quo where we have one prefix meaning two different things.
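
As a rough sketch of what the policy amounts to in practice (my own illustration, not Ubuntu's actual code), a size formatter following it would divide by 1000 and use kB/MB/GB for file sizes, and divide by 1024 and use KiB/MiB/GiB for memory:

    def format_size(num_bytes, binary=False):
        # Decimal prefixes (kB, MB, ...) for file sizes and bandwidth;
        # binary prefixes (KiB, MiB, ...) for memory when binary=True.
        base = 1024 if binary else 1000
        units = ["KiB", "MiB", "GiB", "TiB"] if binary else ["kB", "MB", "GB", "TB"]
        size = float(num_bytes)
        unit = "bytes"
        for candidate in units:
            if size < base:
                break
            size /= base
            unit = candidate
        return f"{size:.1f} {unit}"

    print(format_size(5000))               # 5.0 kB  (file size: decimal)
    print(format_size(5000, binary=True))  # 4.9 KiB (memory: binary)

The numbers differ slightly, but each prefix now means exactly one thing.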

Since Windows still uses the old, ad hoc system, a Wine application might display a slightly different size for the same file. However, at least in my experience, different sizes are often displayed anyway due to rounding methods, so I'm not convinced it's a major issue.

See also:

  • What file size units do applications on Ubuntu use?

Solution 2:

IT IS SOOOO... SIMPLE!!!

A few years ago there was very little confusion about this, because the notation

  • 1 KB = 1024 bytes
  • 1 MB = 1024 KB

was taught, learned and used in universities and throughout almost all of the industry (software and hardware) around the world for many years.
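
For reference, a quick calculation (mine, not part of the original answer) shows how far those binary values sit from the round decimal numbers, which is exactly the gap all the arguing is about:

    # How far the traditional binary values drift from the round decimal numbers.
    for power, prefix in [(1, "kilo"), (2, "mega"), (3, "giga"), (4, "tera")]:
        binary_value = 1024 ** power
        decimal_value = 1000 ** power
        drift = (binary_value / decimal_value - 1) * 100
        print(f"{prefix}: {binary_value:,} vs {decimal_value:,} ({drift:.1f}% larger)")

    # kilo: 1,024 vs 1,000 (2.4% larger)
    # mega: 1,048,576 vs 1,000,000 (4.9% larger)
    # giga: 1,073,741,824 vs 1,000,000,000 (7.4% larger)
    # tera: 1,099,511,627,776 vs 1,000,000,000,000 (10.0% larger)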

The stupid idea of counting in base 1000 (not even base 10) is only another symptom of the stupidity of our times and modern life.

What makes things much, much worse is the even more stupid idea of trying to establish (and continuing to do so) the old notation for the impractical base-1000 units. THAT HAS CREATED ALL THE CONFUSION. If they had only adopted the convention that

  • 1 KiB = 1000 bytes
  • 1 MiB = 1,000,000 bytes

then there would be much less confusion and the problem would be much smaller.

They should have tried to establish that

  • 1 KB = 1024 bytes
  • 1 MB = 1024 KB

and

  • 1 Ikb or ikb or Kib = 1000 bytes
  • 1 IMb or imb or Mib = 10^6 bytes

There is absolutely no need to use base-1000 units. Probably the idea started with a stubborn mind that said, "oh no, if kilo is 1000 and mega is 1,000,000, then we are going to use kilo and mega in base 1000 for information units (which are base 2!)".

All that just because, much longer ago, someone had the unfortunate idea (not so bad, though) of calling a bunch of 1024 bytes a kilobyte (kB). If he had instead chosen k2b and m2b, and called them kitwo bytes and mitwo bytes (or kookie bytes, mookie bytes and gookie bytes), for example, this whole idea of using base 1000 across all the applications of an entire operating system, and imposing it on people as the normal way of talking about measures in hardware and software, would never have happened.