Is defragging relevant to improving disk performance anymore?

Does it still make much difference, now that we have much faster and larger HDDs? Generally I find that when disk performance gets too slow, the cause is usually too little free space, and the solution is to buy a new HDD.


Solution 1:

It is still relevant, but since Vista, Windows has automatically defragmented the hard drive while the computer sits idle. As long as you leave your computer idle for long enough stretches, this is no longer something you need to do manually.
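If you want to confirm whether a volume would actually benefit from a manual pass, the built-in defrag tool can analyze without changing anything. A minimal Python sketch, assuming Windows and an elevated prompt (the drive letter is just an example):

```python
import subprocess

# Analyze drive C: without actually defragmenting it.
# defrag.exe's /A switch only reports fragmentation statistics and says
# whether a defrag pass is recommended. Requires an Administrator prompt.
result = subprocess.run(["defrag", "C:", "/A"], capture_output=True, text=True)

print(result.stdout)
if result.returncode != 0:
    print("Analysis failed - try running from an elevated prompt.")
```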

Solution 2:

It's not relevant on SSDs, which are hardly (if at all) affected by fragmentation, since there are no read heads to move around. What's more, each part of an SSD can only be written a limited number of times ("write endurance"; see What is the lifespan of an SSD drive?), so especially for cheap SSDs it's better to spread writes across the whole disk than to wear out the same areas over and over by running defragmentation tools.
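If you're unsure whether Windows is treating a drive as an SSD, a quick sanity check is whether TRIM is enabled; that, not defragmentation, is the maintenance an SSD benefits from. A rough Python sketch, assuming Windows; the string match on fsutil's output is simplistic and purely illustrative:

```python
import subprocess

# fsutil reports whether delete notifications (TRIM) are enabled:
# "DisableDeleteNotify = 0" in the output means TRIM is on.
out = subprocess.run(
    ["fsutil", "behavior", "query", "DisableDeleteNotify"],
    capture_output=True,
    text=True,
).stdout

print(out)
# Crude check; newer Windows versions print separate NTFS and ReFS lines.
if "= 0" in out:
    print("TRIM is enabled; the OS will maintain the SSD without defragmentation.")
else:
    print("TRIM appears disabled; look at drivers/firmware rather than defragmenting.")
```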

For those who've partitioned their hard drive: didn't that introduce a lot of fragmentation anyway?

And though the question is tagged "windows", for those who arrive here via the generic title: defragmentation is not equally relevant for every file system. For example, on a Mac (emphasis mine):

You probably won't need to optimize at all if you use Mac OS X. Here's why:

  • Hard disk capacity is generally much greater now than a few years ago. With more free space available, the file system doesn't need to fill up every "nook and cranny." Mac OS Extended formatting (HFS Plus) avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently-freed space.

  • Mac OS X 10.2 and later includes delayed allocation for Mac OS X Extended-formatted volumes. This allows a number of small allocations to be combined into a single large allocation in one area of the disk.

  • Fragmentation was often caused by continually appending data to existing files, especially with resource forks. With faster hard drives and better caching, as well as the new application packaging format, many applications simply rewrite the entire file each time. Mac OS X 10.3 Panther can also automatically defragment such slow-growing files. This process is sometimes known as "Hot-File-Adaptive-Clustering."

  • Aggressive read-ahead and write-behind caching means that minor fragmentation has less effect on perceived system performance.

[..]

There is also a chance that one of the files placed in the "hot band" for rapid reads during system startup might be moved during defragmentation, which would decrease performance.

Of course, that last issue should not occur when using a defrag utility that is included with the operating system.

Solution 3:

Yes.

On my friend's PC (XP, NTFS, 80 GB HD, ~750 MB free), the 700 MB pagefile was in 11,000 fragments, and the MFT was in 40 fragments. I cleaned up the hard drive, defragmented, and rebooted, and the difference was very noticeable.
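The answer doesn't say how those fragment counts were obtained; one way to get per-file numbers like that is Sysinternals Contig, whose -a switch analyzes a file without moving anything. A hypothetical Python sketch, assuming contig.exe is on the PATH and the script runs from an elevated prompt (the pagefile path is just an example):

```python
import subprocess

# Report how many fragments specific files occupy, without defragmenting them.
# Contig's -a switch is analyze-only; system files like pagefile.sys usually
# need an elevated prompt to inspect.
for path in [r"C:\pagefile.sys"]:
    result = subprocess.run(["contig", "-a", path], capture_output=True, text=True)
    print(result.stdout)
```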