What is the origin of K = 1024?

It goes back quite some time, and is detailed here. It looks like you can blame IBM, if anybody.

Having thought about it some more, I would blame the Americans as a whole, for their blatant disregard for the Système international d'unités :P


All computing was low-level at the beginning, and in low-level programming the number 1000 is practically useless: programmers needed prefixes for larger quantities, so they reused the SI ones. Everyone in the field knew the convention and there was no confusion. It served well for thirty years, or who knows how long.

It's not that they were Americans and therefore needed to break SI at all costs. :-)
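To make the point about 1000 being useless concrete, here is a toy Python sketch (mine, not part of the original answer; the names and numbers are made up for illustration) of why 1024 falls out naturally once you are working with shifts and masks:

    # Sizes that are powers of two can be handled with shifts and masks
    # instead of division and modulo.
    BLOCK_SIZE = 1 << 10          # 1024 bytes, the traditional "1K"
    OFFSET_MASK = BLOCK_SIZE - 1  # 0x3FF; only works because 1024 is a power of two

    def split_address(addr: int) -> tuple[int, int]:
        """Split a byte address into (block number, offset within block)."""
        return addr >> 10, addr & OFFSET_MASK

    print(split_address(3000))  # (2, 952): block 2, offset 952
    # With a 1000-byte "block" you would need addr // 1000 and addr % 1000;
    # no amount of bit twiddling gives you that.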

No programmer I know says "kibibyte". They say kilobyte, and they mean 1024 bytes. Algorithms are full of powers of 2. Even today, 1000 is a pretty useless number among programmers.

Saying "kibi" and "mebi" just sounds too funny and distracts from the subject. We happily leave those to the telecommunications and disk-storage sectors :-). But I will write "kibibytes" on user interfaces where non-programmers may read it.
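For the user-interface case, here is a minimal sketch (my own code, not the answer's; the helper name is arbitrary) that computes with 1024 but labels the result with the IEC prefixes, so non-programmers are not misled:

    IEC_UNITS = ["B", "KiB", "MiB", "GiB", "TiB"]

    def format_size(num_bytes: int) -> str:
        """Render a byte count using 1024-based steps and IEC unit names."""
        value = float(num_bytes)
        for unit in IEC_UNITS:
            if value < 1024 or unit == IEC_UNITS[-1]:
                return f"{value:.1f} {unit}"
            value /= 1024

    print(format_size(1024))             # 1.0 KiB
    print(format_size(5 * 1024 * 1024))  # 5.0 MiB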


It is correct and makes sense for technical people to use 1024 = 1K in many cases.

For end users it is usually better to say 1000 = 1k, because everybody is used to the base-10 number system.

The problem is where to draw the line. Sometimes marketing or advertising people do not quite succeed in the "translation", that is, in adapting technical data and language for end users.
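As a quick illustration of where that "translation" goes wrong (my numbers, not the answer's): the same drive advertised with decimal prefixes looks smaller when a tool reports it in 1024-based units.

    advertised_bytes = 500 * 10**9             # "500 GB" on the box (decimal prefixes)

    as_decimal_gb = advertised_bytes / 10**9   # what the marketing says
    as_binary_gib = advertised_bytes / 2**30   # what many tools display

    print(f"{as_decimal_gb:.1f} GB")   # 500.0 GB
    print(f"{as_binary_gib:.1f} GiB")  # 465.7 GiB -- same bytes, different prefix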


Blame semiconductor manufacturers (they provide us with binary hardware only)[1]

Better yet: blame logic itself (binary logic is just the most elementary logic).

Better yet: who shall we blame for the wretched decimal system?

It has far more flaws than the binary system. It was based *cough* on the average number of fingers in the human species *cough*.

Oooo...

[1] I want my quantum three-qubit computer!!! Now!