How to link the processing power of old computers together?
I'm sitting on 8 old computers of varied sorts that are more or less useless for any other purpose at this point. Is there a way I could link their hardware or processing power together over WiFi and use one of them as a central computer? It would be cool to distribute the processing of something like a video game or an encryption program across the collective machines. Is there any way to do all this?
Solution 1:
Condor is excellent for cycle scavenging.
Using TORQUE/PBS is also very popular for clustered computing; it's a regular package in Debian, Ubuntu, and probably many other Linux distributions.
TORQUE, PBS, and the excellent Maui scheduler are well documented at Cluster Resources.
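For a flavour of what that looks like in practice, here is a rough sketch (Python, just for illustration) of farming independent work units out to a TORQUE queue, assuming the qsub client is installed and the old boxes are already set up as compute nodes; the process_chunk.sh worker script, the job names and the resource limits are invented for the example.

#!/usr/bin/env python
# Sketch: submit one PBS/TORQUE job per work unit via qsub.
# Assumes a working TORQUE server and the qsub client on this machine;
# process_chunk.sh is a hypothetical worker script that takes a chunk id.
import subprocess

NUM_CHUNKS = 8  # say, one chunk per old box

for chunk in range(NUM_CHUNKS):
    job_script = """#!/bin/sh
#PBS -N chunk-{n}
#PBS -l nodes=1,walltime=00:30:00
cd $PBS_O_WORKDIR
./process_chunk.sh {n}
""".format(n=chunk)
    # qsub reads the job script from stdin and prints the new job id
    result = subprocess.run(["qsub"], input=job_script,
                            capture_output=True, text=True, check=True)
    print("submitted chunk", chunk, "as job", result.stdout.strip())

The scheduler then hands each job to whichever node is free, which is exactly the cycle-scavenging style of use that Condor, TORQUE and Maui are built for.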
Solution 2:
There are a number of schemes for distributed computing. One is called Digipede. At a previous employer (this was in the 90s), we had a warehouse of older desktop computers that hadn't been fully depreciated (even though 75-100 MHz Pentiums were worthless by then), and I wasn't allowed to order new servers for the processing my department had to do, so I got a bunch of them and refactored much of the code into DCOM objects. I called it RAIC - redundant array of inexpensive computers. With numbers painted on each computer in the stack, it was easy to say "meh, go reboot #5." Looking back, it was cool, frugal, and a waste of time.
My advice would be to not bother. If you have a task that is well suited to distributed processing, you'd already have ideas. If you don't have such a task, you're going around with a hammer looking for nails. In such a case, set them up to run seti@home processing.
Solution 3:
If they have enough processing power for browsing and word processing, why not install OpenOffice and put them on Freecycle for someone who can't afford to buy a computer?
Solution 4:
The short answer is, "no, not really". Not all that many common applications are designed to take full advantage of the parallelism of separate processors within the same computer, let alone across multiple computers.
The exception to this is the sort of parallel computing first popularised by seti@home, which takes a large problem of a type amenable to being split into a large number of small work units and distributes those units across a large number of separate computers. But even then, the separate computers are not really collaborating on the same task, just each working on one of many identical tasks. Since seti@home, the World Community Grid (WCG) and other compendia of large distributed computing projects have made available many arguably more useful ways of using spare CPU cycles.
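To make the "many identical work units" idea concrete, here is a toy sketch of the same pattern on a home LAN, in Python, assuming passwordless SSH to each old box and a Python interpreter on each of them; the host names and the throwaway workload (summing slices of a big range) are invented purely for illustration.

# Sketch of the "many small, independent work units" pattern on a home LAN.
# Assumes passwordless SSH to each old box and a Python interpreter there;
# the host names and the toy workload are hypothetical.
import subprocess
from concurrent.futures import ThreadPoolExecutor

HOSTS = ["oldbox1", "oldbox2", "oldbox3"]  # hypothetical host names
CHUNK = 1000000  # numbers summed per work unit

def run_unit(unit):
    host, start = unit
    # Each unit is independent: sum one slice of the range on a remote box.
    cmd = "python -c 'print(sum(range({0}, {1})))'".format(start, start + CHUNK)
    out = subprocess.check_output(["ssh", host, cmd], text=True)
    return int(out)

# Four work units per host, handed out round-robin.
units = [(HOSTS[i % len(HOSTS)], i * CHUNK) for i in range(len(HOSTS) * 4)]
with ThreadPoolExecutor(max_workers=len(HOSTS)) as pool:
    total = sum(pool.map(run_unit, units))
print("grand total:", total)

Note that each unit is completely independent, which is the only reason this works; nothing here makes the old boxes behave like one fast machine.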
None of these projects makes a pile of little computers look like one big, powerful Windows machine; the separate operating systems never merge into a single parallel system.
For all practical purposes, unless you want the large carbon footprint that arises from leaving many old computers on all the time, you're on a hiding to nothing with this idea.
Edit:
If you're really keen, there are some home-grown applications you could build as a not-insignificant sort of project. Here's a related question...
I stand by my more general "don't bother" position unless you want a noisy, power-hungry, specialist project to take on just for the fun and learning of it, though.