Is it practically possible to shorten computer bits?

But of course, all a computer can read is 1s (on) and 0s (off).

Is there a way to use x's and y's for storage or something like that?

All of this is my original idea.

I just thought it would be a good idea for other things, such as storage, etc.

Well, it is, but it is not an original idea at all.

This is basically what SSD manufacturers did when they moved from SLC (single-level cell) flash, where each cell stores a 0 or a 1, to MLC (multi-level cell), where each NAND cell stores one of 4 values (your 0, 1, x and y), then to TLC (triple-level cell) with 8 different values, and now even QLC (quad-level cell) with 16 different values in each location.
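
To put numbers on it: a cell that can reliably distinguish 2^k levels carries k bits, so each step from SLC to QLC doubles the number of levels but only adds one bit per cell. A minimal sketch of that relationship (the cell-type table and the 1 KiB example are just illustrative):

```python
import math

# Distinguishable levels per flash cell for each cell type
# (the labels are the common marketing names mentioned above).
CELL_TYPES = {"SLC": 2, "MLC": 4, "TLC": 8, "QLC": 16}

for name, levels in CELL_TYPES.items():
    bits_per_cell = math.log2(levels)               # k bits <=> 2**k levels
    cells_for_1_kib = math.ceil(8 * 1024 / bits_per_cell)
    print(f"{name}: {levels:2d} levels -> {bits_per_cell:.0f} bits/cell, "
          f"{cells_for_1_kib} cells to store 1 KiB")
```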

Before SSDs, this concept of a non-binary "signal constellation" was widely used in modems and radio.
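
For instance, a 16-QAM modem sends 4 bits per transmitted symbol by picking one of 16 amplitude/phase combinations. The sketch below uses a made-up bit-to-point mapping, not the table from any real standard, just to show the idea of a constellation:

```python
# Toy 16-QAM mapper: every 4 bits become one complex "constellation point"
# whose real and imaginary parts each take one of 4 amplitude levels.
# Illustrative only; real modems use carefully chosen (Gray-coded) mappings.
LEVELS = [-3, -1, 1, 3]

def map_16qam(bits):
    """Group a bit string into 4-bit chunks and map each to a complex symbol."""
    assert len(bits) % 4 == 0
    symbols = []
    for i in range(0, len(bits), 4):
        i_idx = int(bits[i:i + 2], 2)    # first 2 bits pick the in-phase level
        q_idx = int(bits[i + 2:i + 4], 2)  # next 2 bits pick the quadrature level
        symbols.append(complex(LEVELS[i_idx], LEVELS[q_idx]))
    return symbols

print(map_16qam("0001101111000110"))  # 16 bits -> only 4 transmitted symbols
```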

In information theory this is studied as channel capacity: it was already undergoing theoretical analysis back in 1928 by Hartley, and Claude Shannon later quantified exactly how much information you can transmit in a given frequency range, at a given noise level, using multi-level signals.
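
The result usually quoted here is the Shannon-Hartley theorem, C = B * log2(1 + S/N). As a back-of-the-envelope illustration (the 3 kHz bandwidth and 30 dB SNR figures are my own assumed values for an analog phone line, not anything from the question):

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed example figures for a plain analog telephone channel.
print(f"{shannon_capacity(3000, 30):.0f} bit/s")   # roughly 30 kbit/s
```

That lands close to the 33.6 kbit/s ceiling that late dial-up modems actually reached.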


If instead you reduce the number of bits needed to store a message while keeping the same two binary symbols (typically written 0 and 1, although the names don't matter), then you are dealing with compression, which is also a very mature field in information theory. But your question is clearly about introducing new symbols.
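
A minimal sketch of that distinction, using Python's standard zlib module on a made-up, highly redundant message: the bit count drops, but the alphabet stays binary.

```python
import zlib

# Compression keeps the alphabet binary (the output is still just 0/1 bits);
# it only reduces *how many* bits a redundant message needs.
message = b"the quick brown fox " * 50          # highly redundant example text
compressed = zlib.compress(message)

print(len(message) * 8, "bits before")          # 8000 bits
print(len(compressed) * 8, "bits after")        # far fewer bits, same two symbols
assert zlib.decompress(compressed) == message   # lossless: original recovered
```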