How to measure the performance gain from defragmentation?

I believe I understand the basic concept of data fragmentation on hard drives, and how defragmentation counters its effects. What I don't really understand is how one actually measures the performance gained by defragmenting files.

Some say the system feels "snappier" or that things load faster after running a defragmentation. I don't feel this is always the case; I've run defrag many times on different PCs without noticing any performance gain at all.

So I'm wondering: is there any way to actually measure the performance difference before and after a defragmentation, and what the ACTUAL impact on the system's performance is?

**Update:** What I'm looking for is a tool that can give me some concrete indication of overall system performance improvements. Is this best achieved through benchmarking tools specific to HDD access speeds? Or will I get the best result through an application like File Access Timer from Raxco? Also, I'm running Windows XP.


Solution 1:

Measuring the performance gain from defragmentation is rather difficult; however, there are some utilities meant to "aid" you with it.

There's a utility called File Access Timer, from Raxco, available here. This tool reads a given file or folder a set number of times and displays how long the reads took, along with the number of fragments.

Excerpt from the readme:

The File Access Timer allows you to select a file or folder and read the contents several times in order to measure the performance gain achieved through defragmentation. The general process is to select a fragmented file, read the file using Raxco's File Access Timer, defragment the file, and re-measure the time needed to read the file. By doing this you can see the benefits of defragmentation for yourself.

Solution 2:

Fragmentation only affects you if you have to read large portions of a file, and the fragmentation makes the disk seek all over to find them. If you never read more than a single disk cluster or so in one go, then fragmentation is irrelevant, because you'll have to seek anyway due to other activity since your last read.

Program files are a likely case, because they tend to be large and tend to be read all in one shot. If your program files are fragmented, that could slow program loading, possibly into the humanly noticeable range.

If you want to measure the effects of fragmentation, write a program that reads a large file from beginning to end, repeatedly. Do 1000 runs or so to smooth out the noise. Then defragment the file and run it again, and see whether the average read time goes down. A sketch of such a benchmark follows.
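Here is a minimal sketch of that benchmark in Python. The file name `largefile.bin`, the 1 MiB chunk size, and the helper name `average_read_time` are hypothetical choices for illustration, not anything prescribed by the answer:

```python
import time

def average_read_time(path, runs=1000, chunk_size=1024 * 1024):
    """Read the file sequentially `runs` times; return average seconds per pass."""
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        with open(path, "rb") as f:
            # Read to end-of-file in 1 MiB chunks.
            while f.read(chunk_size):
                pass
        total += time.perf_counter() - start
    return total / runs

# Hypothetical file name -- measure before and after defragmenting it.
print("average read time: %.4f s" % average_read_time("largefile.bin"))
```

One caveat worth knowing: after the first pass, the operating system's file cache will usually serve subsequent reads from RAM and mask the fragmentation entirely, so use a file larger than available memory or clear the cache between passes.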

Solution 3:

The only measure I've ever learned to use is the "Split IO/Sec" performance counter under the "PhysicalDisk" object in perfmon. It measures the number of I/O requests per second that had to be split into two or more separate requests because the disk blocks they were looking for were not contiguous. You can watch it in the perfmon GUI, or sample it from the command line as sketched below.
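If you want to record that counter without opening the perfmon GUI, here is a minimal Python sketch that shells out to the typeperf command-line tool (which, to my knowledge, ships with Windows XP); the one-second interval and ten-sample count are arbitrary illustration choices:

```python
import subprocess

# Sample the Split IO/Sec counter across all physical disks, once per
# second, ten times, via Windows' built-in typeperf tool.
subprocess.call([
    "typeperf",
    r"\PhysicalDisk(_Total)\Split IO/Sec",
    "-si", "1",   # sampling interval in seconds
    "-sc", "10",  # number of samples to collect
])
```

Run it while exercising the disk before and after defragmenting; a sustained drop in split I/Os suggests the defragmentation actually helped.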