Do computers slow down as they age?
Most of the time when a computer seems to have gotten slower, software is to blame.
Frequently, lots of applications have been installed over the years, and the computer has become cluttered with login items and background processes. Even when the offending apps are no longer in use, their daemons and updaters can stay alive.
When this happens, the computer really does slow down. Disused apps living in the background and competing for resources take away from the OS and the apps you actually want to use, and can slow things down. The more of this 'crud' that builds up, the slower the system can feel.
Contributing to the problem, hard drives do slow down a little bit as they fill up. If so many apps have been installed and files have been saved that the computer is seriously low on free space, performance can be impacted.
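If you want to check whether low free space might be a factor, Python's standard library can report it directly. This is just a quick sketch; the `/` path and the 10-15% rule of thumb in the comment are illustrative, not part of the original answer:

```python
import shutil

def free_space_report(path="/"):
    """Report total bytes, free bytes, and percent free for the volume holding `path`."""
    usage = shutil.disk_usage(path)
    percent_free = usage.free / usage.total * 100
    return usage.total, usage.free, percent_free

total, free, pct = free_space_report("/")
print(f"{free / 1e9:.1f} GB free of {total / 1e9:.1f} GB ({pct:.0f}% free)")
# A commonly cited rule of thumb is to keep roughly 10-15% of the drive free.
```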
The other major component of this is that, as it is updated, software gets more and more complex and requires better and better hardware. When an app is updated, frequently it has some new features that take more power to run. Think about how OS X has gradually had more animation, graphics, and background functions added. The extra tax of having something like Autosave running constantly in the background could seriously impact a very old system, and something like Mission Control could be completely impossible.
However, since you've completely purged your system, none of these things are affecting you.
Hard drives do not really slow down over time; either they keep working well or they fail completely. See this Super User post for more details.
The same goes for other components: they keep working much the same until they die.
It is entirely possible that your expectations are playing a role here.
Tasks that would be strenuous on your eMac are trivial to your current computer, so you expect them to be instant. When they're not (because the computer has to work harder to get them done), it feels slow.
The other potential factor is the fact that (even though you're using old software) you're interacting with things that have been updated and are expecting users to be on newer systems.
Web pages have more images, JavaScript, and AJAX calls than they did in 2004, all of which take power to run. Images, videos, and music are recorded at higher resolutions and stored in less-compressed, higher-fidelity, more CPU-intensive formats. Text documents are trending toward richer formatting and more metadata, to say nothing of the media that is increasingly embedded in them.
So, here's the test: If your recently-wiped computer feels slower when using the same apps, files, and websites you used in 2004, your expectations have changed.
Otherwise I would be inclined to blame crud, software surpassing hardware, the hard drive filling up, and interacting with newer resources (in that order).
No. For most workloads, the processor governs performance, and the CPU itself remains measurably stable over time in terms of clock rate.
The OS, memory, and storage are also largely stable. Storage would be the first component to slow with age, so benchmark it if you can.
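If you want a rough storage benchmark without installing anything, a few lines of Python can time a sequential write. This is a crude sketch, not a proper disk benchmark: OS caches, other activity, and the small file size will all skew the number, so treat it only as a way to compare the same machine against itself over time:

```python
import os
import tempfile
import time

def sequential_write_speed(size_mb=64, block_kb=1024):
    """Write `size_mb` MiB of random data to a temp file and return MB/s.

    fsync is called so the timing includes flushing to the device,
    not just to the OS write cache.
    """
    block = os.urandom(block_kb * 1024)
    blocks = size_mb * 1024 // block_kb
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())
        elapsed = time.perf_counter() - start
    finally:
        os.remove(path)
    return size_mb / elapsed

print(f"~{sequential_write_speed():.0f} MB/s sequential write")
```

Run it on the freshly wiped machine and keep the number; if a later run is dramatically slower, the drive is worth investigating.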
Our expectations of what makes for a fast computer, however, do not stand still. Camera resolutions grow, image fidelity grows, more logic is stuffed into a web page, and so on.
Software can become corrupt over time, but the computer itself can be returned to factory condition with a simple erase and install, confirming that the hardware is still speedy even if it no longer feels fast compared to all the young-gun computers now shipping.
Now from a purely analytical standpoint, here are the items that could conceivably be measured in benchmark or manufacturer tests:
Heat dissipation failures (thermal grease drying out, or dust - or worse - accumulating) can cause CPU temperatures to rise when airflow can't remove enough heat from the chipset. Older CPUs didn't throttle as aggressively and had larger metal traces to carry the heat out. Newer CPUs, like those in the MacBook Air, actively shut down cores and reduce clock speed to keep the chip within its thermal envelope, which makes for a very noticeable performance dip when the CPU cannot be kept cool. I have seen tower and iMac cases packed with dust (or worse, nicotine-tar-laden dust), so much so that the computer had no business even running by the look of it - yet other than hot sensors, most still ran at full speed, to my great astonishment.
Imminent hard-drive failure - the expected bad-block relocation behavior, where the drive has to take extra time to land data, can be noticeable, but I categorize that as "your computer has a bad drive", not as "it's old and slow". This can happen to a brand-new computer with a bad drive, right up until it trips the SMART status or just fails outright.
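A crude way to see whether sustained load is dipping performance (thermal throttling or otherwise) is to time the same fixed CPU-bound workload repeatedly and watch for an upward drift. This is a sketch, not a proper diagnostic - a real check would read the temperature sensors via platform tools - and the workload and round counts here are arbitrary:

```python
import time

def cpu_work():
    """Fixed CPU-bound workload: a sum of squares."""
    return sum(i * i for i in range(200_000))

def timing_samples(rounds=10):
    """Time the identical workload `rounds` times in a row.

    On a healthy machine the samples stay roughly flat; a steady
    upward drift under sustained load can hint at throttling.
    """
    samples = []
    for _ in range(rounds):
        start = time.perf_counter()
        cpu_work()
        samples.append(time.perf_counter() - start)
    return samples

samples = timing_samples()
print(f"first {samples[0] * 1e3:.1f} ms, last {samples[-1] * 1e3:.1f} ms, "
      f"ratio {samples[-1] / samples[0]:.2f}")
```

Expect some noise from background processes; only a large, consistent ratio over many rounds is meaningful.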
Things that might physically slow an "old" computer:
- Drive fragmentation. Should not be an issue if you reinstall from scratch.
- Flaky hard drive sectors. Difficulty reading and/or writing sectors causing retries would definitely slow things down, although perhaps not in a uniform way. You might try zeroing the drive to force spare sectors to come online, but all drives will fail eventually.
- Flaky optical drives. Optical drives, it seems, have reliability only slightly better than tissue paper.
- Thermal control. Something with the fans, heat sinks, etc is not working properly or optimally, causing the CPU or some other component to automatically ratchet down. It might be something as simple as dust in the intakes or on the heat sink fins.
If this were a machine with a BIOS, a battery failure might cause the default settings to take effect, which might not be optimal. Dunno whether a Mac's NVRAM is fraught with the same peril.
Stob's Index Of Cruftitude may help. Humour obviously, but with a grain of truth.
Although processors do degrade slightly over time (so-called transistor aging), the slowdown a user feels is in fact caused by software updates and by getting used to faster computers.