Why does hardware get slower with time?
Solution 1:
Sometimes it IS the hardware, especially with laptops. Modern processors have circuitry to protect them from overheating, and will deliberately reduce the CPU speed if the core temperature gets too hot (they also slow down to save power when demand is low or you're running on battery; Intel calls this feature "SpeedStep" on their processors). If you notice your fan running all the time, or the machine getting excessively hot around the cooling fan outlet, your computer's "airways" may have become clogged with dust.
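Before you open anything up, you can check for throttling from software. Here's a minimal sketch using Python's psutil package (my choice for illustration; any CPU monitoring tool reports the same thing) that compares the current clock against the rated maximum:

```python
# Quick throttle check: compare the current CPU clock to the rated maximum.
# Needs the psutil package (pip install psutil); frequency reporting depends
# on OS/driver support, so treat a low number as a hint, not a diagnosis.
import psutil

freq = psutil.cpu_freq()
if freq is None or not freq.max:
    print("CPU frequency info not available on this platform")
else:
    pct = 100.0 * freq.current / freq.max
    print(f"{freq.current:.0f} MHz of {freq.max:.0f} MHz rated ({pct:.0f}%)")
    if pct < 60:
        print("running well below rated speed - possible thermal throttling")
```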
I had a Dell Latitude that ran like new after I opened it up and removed about a quarter inch thick "sponge" of dust from between the fan and the heat sink. Dell actually has downloadable service instructions on their website that explain all the steps to open up the machine and get inside for this kind of service. If you're not comfortable with this, you probably have a techie friend who'll help you out. It's definitely worth the risk if you're planning to get rid of the machine otherwise!
If you think this might be what's happening on your machine, try downloading a utility like "SpeedFan" that allows you to check the temperature of your CPU as well as other components. With this app, you can graph the temperatures when you first start the machine. If they start climbing quickly and never seem to decrease, you can bet cooling is an issue. In my case, I also used a free app called "CS Fire Monitor" to show me the actual speed of my processor and I found that once it got hot, it was dropping to less than half speed. There's lots of good freeware out there that will show you this kind of information; just Google "CPU Temp Freeware" or "CPU Speed Freeware" or something along those lines and you'll find all sorts of options.
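If you'd rather script this than install a GUI tool, here's a minimal sketch along the same lines, assuming a Linux machine with the psutil Python package (sensor names and availability vary by hardware); run it right after boot and watch whether the readings climb and stay high:

```python
# Sample CPU temperatures every few seconds - roughly what SpeedFan graphs
# for you. psutil.sensors_temperatures() needs OS sensor support
# (Linux/FreeBSD); it isn't available in Windows builds of psutil.
import time
import psutil

for _ in range(24):                       # about two minutes of samples
    temps = psutil.sensors_temperatures()
    if not temps:
        print("no temperature sensors exposed on this machine")
        break
    for chip, readings in temps.items():
        print(", ".join(f"{r.label or chip}={r.current:.0f}C" for r in readings))
    time.sleep(5)
```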
Hopefully, this will save a few people from replacing or throwing away decent hardware that just needs some respiratory therapy!
Solution 2:
There are a few effects here:
- Your perception of how fast the computer should be is changing. When you first get new hardware, you have something concrete to compare it against - the old hardware. This gives you an empirical measure of the speed improvement. As time goes by, your memory of how slow the old hardware was fades, and all you have to compare against is how fast the current hardware felt recently.
- New versions of software come out that add features, either to extend functionality or to make use of the new hardware. That almost always means a larger program than before, which takes up more resources and makes your hardware run a little slower.
- Accumulation of drivers, programs/tasks running in the background, etc. Each additional driver or background task takes up a little more of every resource - hard disk space, memory, CPU cycles. While each one isn't large, the effect is cumulative. People expect modern programs to update themselves, so there are extra tasks running that you aren't even aware of. The longer you have the computer, the more of these programs you are likely to have installed (the sketch at the end of this answer shows one way to measure them).
When taken together they give the impression that the hardware is slowing down.
There may be other effects due to wear and tear on the hardware (disk fragmentation, memory latency) too.
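To put rough numbers on the background-task point, here's a small sketch (again using the psutil package as an assumption; Task Manager or top shows the same data) that totals up the memory held by everything currently running:

```python
# Inventory of running processes by resident memory, to see how much the
# accumulated background tasks add up to. Requires psutil.
import psutil

procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is None:
        continue  # couldn't read this process (permissions, or it exited)
    procs.append((mem.rss, p.info["name"] or "?"))

procs.sort(key=lambda item: item[0], reverse=True)
total = sum(rss for rss, _ in procs)
print(f"{len(procs)} processes, {total / 2**20:.0f} MiB resident in total")
for rss, name in procs[:15]:              # the 15 biggest memory users
    print(f"{rss / 2**20:8.1f} MiB  {name}")
```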
Solution 3:
When I have run benchmarks (both trivial ones like bogomips, and more serious ones like Dhrystone and Whetstone) on five- to eight-year-old hardware, I have always found that it turned in the same results as when it was new. (Always on Linux and Mac OS boxen, BTW.)
I have less experience with hard drives, but I did test one fast, wide SCSI-2 drive about five years on (with hdparm) and got answers comparable to the original spec.
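You don't need the classic benchmark suites to repeat this experiment; any fixed workload you can re-run years later does the job. Here's a trivial sketch in Python (my own stand-in, not Dhrystone or Whetstone; keep the same interpreter version, since that affects the numbers too):

```python
# Trivial repeatable benchmark: time a fixed floating-point workload.
# The absolute figure is meaningless; what matters is whether the same
# script on the same machine reports a similar number years later.
import math
import time

def workload(n=2_000_000):
    acc = 0.0
    for i in range(1, n):
        acc += math.sqrt(i) * math.sin(i)
    return acc

# Best of three runs, to reduce noise from background tasks.
times = []
for _ in range(3):
    start = time.perf_counter()
    workload()
    times.append(time.perf_counter() - start)
print(f"best of 3: {min(times):.3f} s")
```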
So, I think it is mostly, as others have said, a combination of new expectations and heavier software.
That said, I do currently have a PowerBook G4 that could use testing, as it sure feels slower now than it used to. The suggestion above that clock throttling may come into play if the cooling system gets fouled is a good one.