Alternatives to the 0 and 1 bit style/structure

I've looked everywhere for an answer to this, or even a question like it (even Tom's Hardware didn't have anything explicitly related to it).

My question is simple:

Are there any alternatives to the current way data is processed (using 0s and 1s) in computer architecture?

I came across this question when looking for a new PC to buy and got into looking at how Intel and the other processor guys spend billions squeezing more transistors onto chips, etc. (but that is only partly related to my question).

Some people may say that "0s and 1s are the lowest form of representing data", which was true back when computers first started using that system. Is it still the case today? Have we really not gone back to the drawing board to look for alternative forms of processing that could shrink the processing needs we currently face?

I know this question may have a simple answer that some of you think is correct, but just thinking about it, and going all the way back to 0s and 1s and even the transistor itself, makes you wonder whether alternatives exist for every single method or step of the architecture (not just the 0-and-1 representation).

My personal opinion, not strictly part of the question: given how complex current PCs have become, processing something more complex than 0|1 at the lowest level may well be possible today, simply because that kind of processing seems to defeat the purpose of the complex problem solving the PC was designed for.


Solution 1:

The 0/1 structure is indeed the simplest way to represent and store data. But remember that before digital storage was introduced, devices used analog storage solutions. Also remember that quantum computing is currently being researched and implemented (though at a very early stage), and it is a different kind of data representation and processing.


Referring to everyday computing in the present, note that the 0/1 architecture (or true/false, on/off, etc.) is mandatory because current technology relies on digital (two-state) signals. If you make things more complex at the most basic level, you eventually render the system harder to maintain and harder to understand. I'm not saying it is impossible; as I said, the "next big thing" here is approaching, but it has to be done very carefully so as not to mess it up. Making things more complex for no reason is not a good idea. My earlier example, quantum computing, is the exception, because it is a new area of science to explore and, on top of that, more efficient than digital technology for certain kinds of problems.


In addition, the idea of a ternary computer (three-state instead of two-state technology) has been suggested, but it was never widely implemented, for a couple of reasons:

It is much harder to build components that use more than two states/levels. For example, the transistors used in logic are either closed (not conducting at all) or wide open; holding them half open would require much more precision and extra power. Nevertheless, more than two levels are sometimes used to pack data more densely, but rarely (e.g. multi-level cells in modern NAND flash memory, or modulation schemes in modems).

If you use more than two states you need to be compatible with binary, because the rest of the world uses it. Three is out because conversion to and from binary would require expensive multiplication, or division with remainder. Instead you go directly to four or a higher power of two, where conversion is just a regrouping of bits (see the sketch after this list).

These are practical reasons why it is not done, but mathematically it is perfectly possible to build a computer on ternary logic.
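
To make that cost difference concrete, here is a minimal Python sketch (the function names are mine, purely illustrative): a base-4 digit is exactly two bits, so conversion only shifts and masks, while base 3 forces a real division with remainder for every digit.

```python
def to_base4(n):
    # Base 4 is a power of two: each base-4 digit is exactly
    # two bits, so conversion is just shifting and masking.
    digits = []
    while n:
        digits.append(n & 0b11)  # grab the low two bits
        n >>= 2                  # cheap: a shift, no division
    return digits[::-1] or [0]

def to_base3(n):
    # Base 3 shares no factor with 2, so every digit costs a
    # full division with remainder -- the "expensive" step.
    digits = []
    while n:
        digits.append(n % 3)
        n //= 3
    return digits[::-1] or [0]

print(to_base4(100))  # [1, 2, 1, 0]    since 100 = 1*64 + 2*16 + 1*4
print(to_base3(100))  # [1, 0, 2, 0, 1] since 100 = 81 + 2*9 + 1
```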

References / Further Reading:

Wikipedia

  • Qubit
  • Quantum computing
  • Analog signal
  • Digital signal

Nature

  • Efficient quantum computing using coherent photon conversion
  • Box: Quantum computing

Other

  • Qubit, from Quantiki - the quantum wiki.
  • Why binary and not ternary computing? - from Stack Overflow
  • How Does a Quantum Computer Work? - YouTube
  • Also consider the references within the articles themselves.

Solution 2:

A designer knows he has achieved perfection not when there is nothing left to add, but when there is nothing left to take away. -- Antoine de Saint-Exupéry

0s and 1s are just the simplest way of expressing numbers, and the computers we know are all about numbers. Any number that can be written using the digits 0-9 has an equivalent in 0s and 1s (see binary number on Wikipedia). As long as you're using a computer for calculations (and that's what we're doing right now), you don't need more than two digits. Actually, introducing further digits would make calculations more complex, as you'd need another layer of abstraction over the physical 0-1 architecture.
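
To illustrate that equivalence, a tiny Python check (`bin` and `int` are standard built-ins, nothing exotic): the same quantity survives a round trip through its 0/1 form, and arithmetic gives the same answer in either base.

```python
n = 42
bits = bin(n)             # '0b101010' -- the same quantity in base 2
assert int(bits, 2) == n  # nothing is lost going to 0s and 1s and back

# Arithmetic doesn't care which base you wrote the numbers in:
assert 0b1101 + 0b1001 == 13 + 9   # both sides are 22
```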

You should also be aware that 0 and 1 are logical states: false and true. Another digit wouldn't be of much use as long as we're sticking to logic (although some people claim we need a third state, file not found ;) ). Computers like the ones we're using right now don't need more than 0/1.
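
As a small aside on why two logical states are enough: a classic result is that one two-state operation, NAND, can compose every other Boolean operation. A minimal Python sketch of that fact:

```python
def nand(a, b):
    # The only primitive: 0/1 in, 0/1 out.
    return 0 if (a and b) else 1

# Everything else can be built from it:
def NOT(a):    return nand(a, a)
def AND(a, b): return NOT(nand(a, b))
def OR(a, b):  return nand(NOT(a), NOT(b))

for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
```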

But. When you stop thinking in categories of logic, that's a whole different story. Quantum computers are being researched. In quantum mechanics there's just a probability that something is true or false; the real state is somewhere in between. Very few people in the world could claim to have even a general idea of how quantum computers work, and the science behind them isn't completely understood yet. But a few quantum-computer-related ideas have already been implemented, like this one.
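
To give a rough feel for that "somewhere in between", here is a toy classical simulation of a single qubit (a sketch only, not how real quantum hardware is programmed): the qubit carries two amplitudes, and a Hadamard gate puts it into an equal superposition where neither outcome is certain.

```python
import math

# A qubit is a pair of (here, real) amplitudes (alpha, beta) with
# alpha^2 + beta^2 = 1. Start in the definite state |0>.
alpha, beta = 1.0, 0.0

# Hadamard gate: (alpha, beta) -> ((alpha+beta)/sqrt(2), (alpha-beta)/sqrt(2)).
h = 1 / math.sqrt(2)
alpha, beta = h * (alpha + beta), h * (alpha - beta)

# Measurement probabilities are squared amplitudes --
# the qubit is no longer definitely 0 or definitely 1.
print(alpha ** 2)  # P(measure 0) = 0.5
print(beta ** 2)   # P(measure 1) = 0.5
```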