How often should you defrag a Server?

I actually never defragment the data on my servers. I haven't seen enough of a performance gain in file serving to make it worth the performance hit while the defrag is running. In fact, most servers won't ever really finish defragmenting unless you take them offline for a few days. If you're using a relatively modern file system (which, unless you chose to change the defaults on Windows 2003, you are), it shouldn't matter much anyhow. Also, if you're running any sort of striped RAID, file fragmentation is a non-issue since the data is already spread across many disks.
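If you want to see how fragmented a volume actually is before worrying about it, the built-in defrag.exe can run an analysis-only pass. Here's a minimal sketch driving it from Python; it assumes a Windows box with defrag.exe on the PATH and an elevated (Administrator) prompt:

```python
# Minimal sketch: ask the built-in Windows defrag tool for an analysis-only
# report, so you can see whether a volume is fragmented enough to care about.
# Assumes Windows, defrag.exe on the PATH, and an elevated shell.
import subprocess
import sys

def analyze_volume(drive="C:"):
    # /A = analyze only, don't actually defragment anything
    result = subprocess.run(
        ["defrag", drive, "/A"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)

if __name__ == "__main__":
    analyze_volume("C:")
```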

If I have a server where I really want the data clean and defragmented for some reason, I'm far more likely to back it all up to tape, wipe the drive, and restore it. The restore writes all the files back out in contiguous blocks.


Just about the only use-case I know of for defragmenting a Windows server is to improve backup performance. Backups are just about the only large-scale sequential I/O a file-server does, and that's the kind of I/O that notices fragmentation. The kind of I/O file-servers do when users are hitting them is very random, and in that case fragmentation can sometimes even improve performance.

At my old job we had a file-server that we'd just migrated to new hardware. Immediately after the migration, the backups were running on the order of 450MB/Minute (this was many years ago, mind). Two years later, that server was backing up around 300MB/Minute. We then defragged it for the first time, and speeds rose back to 450MB/Minute again.

If you're having trouble getting all of your backups done on time, and it looks like the server being backed up is the bottleneck, a defrag may help with that.
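One way to check whether the server itself is the slow side is to time a large sequential read on the volume you back up, with the network and the backup software out of the picture. A rough sketch, where the path is just a placeholder you'd swap for a real large file on that volume:

```python
# Rough sketch of a sequential-read throughput check, to see whether the
# server being backed up (rather than the network or the backup target)
# is the bottleneck. Reads a large file front to back in big chunks,
# which roughly approximates backup-style I/O.
import time

def sequential_read_mb_per_min(path, block_size=4 * 1024 * 1024):
    total = 0
    start = time.monotonic()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(block_size)  # large reads, like a backup stream
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.monotonic() - start
    return (total / (1024 * 1024)) / (elapsed / 60)

if __name__ == "__main__":
    path = r"D:\shares\large-example-file.bin"  # placeholder: point at a real large file
    print(f"{sequential_read_mb_per_min(path):.0f} MB/minute")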
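```

Pick a file that isn't already sitting in the OS file cache (a cold, rarely-touched file is best), otherwise the number will come out optimistic.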

The other use-case for defrag is a backup-to-disk system with the archive stored on NTFS. Backup and restore on that kind of volume is entirely sequential, and that does notice fragmentation. However, if the underlying storage is abstracted enough (such as an HP EVA disk array), even this kind of I/O won't notice fragmentation.

What this all boils down to is that massively sequential I/O is the type of I/O that notices fragmentation the most. If that's not the I/O you're concerned about, then defragging isn't a concern.


I would concur that you often don't need to, and shouldn't if performance is your goal (constant defragging can do more harm than good).

Like any rule, however, there are some exceptions:

If you are, or were at some point, very low on disk space (<15% free), then you should probably do a defrag when there is time. Even modern file systems have trouble avoiding fragmentation when there are so few free sectors to choose from. (A quick way to spot volumes in that state is sketched after these exceptions.)

If you are running specific types of applications that cause unavoidable fragmentation, you may want to invest in a server-specific defragmentation program (these are designed to run continuously in the background and defrag when and if needed). The kind of application that causes unavoidable fragmentation in a Windows environment is one that does a lot of lazy writing across multiple files at once (most robust server-grade software avoids this, but something like a desktop download manager, and especially some BitTorrent clients, exhibits this kind of aggressive fragmentation behavior).
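As a quick illustration of the first exception, here's a small sketch that flags volumes under the ~15% free threshold; the drive letters are placeholders, so adjust them to whatever the box actually has:

```python
# Quick sketch to flag volumes below the ~15% free-space threshold mentioned
# above, where even modern file systems start to fragment badly.
# The drive letters listed here are examples only.
import shutil

def low_space_volumes(drives=("C:\\", "D:\\", "E:\\"), threshold=0.15):
    flagged = []
    for drive in drives:
        try:
            usage = shutil.disk_usage(drive)
        except OSError:
            continue  # skip drives that don't exist on this box
        free_ratio = usage.free / usage.total
        if free_ratio < threshold:
            flagged.append((drive, free_ratio))
    return flagged

if __name__ == "__main__":
    for drive, ratio in low_space_volumes():
        print(f"{drive} is only {ratio:.0%} free - a defrag pass may be worthwhile")
```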


I ran Diskeeper on the servers at an earlier job and saw a measurable performance improvement on both file servers and application servers. I don't think we got anywhere near their published figures, but we definitely saw some benefit.

It was set to defrag when idle and on set schedules to limit the impact, with some additional bits that kicked in at boot time.