How does Windows determine how long it takes to perform a given action on a file?

I wanted to know if there's an equation that Windows uses to determine how long it takes to perform an action on a file, say deleting, copying, erasing, or installing.

For example, when I'm deleting a file and Windows says "Time remaining: 18 seconds", how is it calculating this number, and using what?


Solution 1:

Have you noticed that usually it doesn't give you any estimates in the first seconds?

That's because in the first seconds, it just does the operation it has to do. Then, after a (small) while, it knows how much it has already copied/deleted/etc., and how long that took. That gives you the average speed of the operation.

Then, divide the remaining bytes by the speed, and you have the time it will take to complete the operation.
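As a rough sketch of that arithmetic (the byte counts below are made up for illustration; Windows' actual code isn't public):

```python
def eta_seconds(bytes_done, bytes_total, elapsed):
    """Estimate remaining seconds from the average speed so far."""
    if bytes_done == 0 or elapsed <= 0:
        return None  # too early to estimate, like Explorer's first seconds
    speed = bytes_done / elapsed                  # average bytes per second
    return (bytes_total - bytes_done) / speed     # remaining bytes / speed

# 25 MB copied in 5 seconds -> 5 MB/s -> 75 MB left -> 15 s remaining
print(eta_seconds(25_000_000, 100_000_000, 5.0))  # 15.0
```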

This is elementary school maths. If you want to travel 360 km, and at the end of the first minute you have travelled 1 km, how long will it take you to get to your destination?

Well, the speed is 1 km/minute. That's 60 km/h. 360 km divided by 60 km/h gives you 6 hours (or 360 km / 1 km/minute = 360 minutes = 6 hours). Since you've already travelled for one minute, the estimated time left is 5 hours and 59 minutes.

Substitute "travel" with "copy" and "km" with "bytes" and that's your question.

Different systems have different ways of estimating time. You can average over just the last minute, in which case the estimate reacts quickly but may vary wildly, or you can average over the entire operation so far, in which case a lasting change in speed leaves your estimate far from reality. What I described is the simplest method.
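A hypothetical sketch of the "last minute" variant, keeping a sliding window of recent samples (the window size and sample format are my own choices, not anything Windows documents):

```python
from collections import deque

class WindowedSpeed:
    """Speed over only the most recent samples: reacts quickly to
    changes, but the resulting estimates can vary wildly."""

    def __init__(self, max_samples=60):
        # each sample is a (timestamp_seconds, bytes_done_so_far) pair
        self.samples = deque(maxlen=max_samples)

    def add_sample(self, timestamp, bytes_done):
        self.samples.append((timestamp, bytes_done))

    def speed(self):
        if len(self.samples) < 2:
            return None
        (t0, b0), (t1, b1) = self.samples[0], self.samples[-1]
        return (b1 - b0) / (t1 - t0) if t1 > t0 else None

# sampling once per second: fast at first, then the drive slows down
w = WindowedSpeed(max_samples=5)
for t, b in [(0, 0), (1, 10e6), (2, 20e6), (3, 22e6), (4, 24e6), (5, 26e6)]:
    w.add_sample(t, b)
print(w.speed())  # ~4.0e6 B/s over the window, vs 5.2e6 B/s over the whole run
```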

Solution 2:

Answering with a simple cross-multiplication is awfully condescending, I think; I'm sure the asker already knew that, since it's how we constantly guesstimate things in our heads too.

The problem with file-operation progress bars is that they're only accurate for uniform data. If you copy 100 files that are all the same size and your drive is doing nothing else, the estimated progress will be spot on. But what if the first 99 files were small text files and the last one is a large video file? The progress will be WAY off.
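To put numbers on that (the file sizes here are invented for illustration):

```python
# 99 small text files plus one large video file (hypothetical sizes)
sizes = [4_000] * 99 + [4_000_000_000]
total_bytes = sum(sizes)

bytes_done = sum(sizes[:99])          # the 99 text files are finished
by_file_count = 99 / len(sizes)       # "done" fraction by file count
by_byte_count = bytes_done / total_bytes

print(f"{by_file_count:.0%} by file count")   # 99%
print(f"{by_byte_count:.4%} by byte count")   # ~0.0099%
```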

This problem gets worse when you're not handling files in one folder but in multiple subfolders. Say you have 5 subfolders and you want to delete them (size doesn't matter much in that case). The first 4 folders each contain fewer than 10 files, so by the time the operation reaches the 5th folder it thinks it's about 80% done, and boom: the 5th folder contains 5000 files and your progress jumps back to 1%.

WinXP tried to get around this by counting the number of files beforehand, which meant that when the folder wasn't indexed in Windows, depending on the number of files, XP didn't really start the operation for the first 20 seconds (the time it took to count), which made everybody furious.

So while I also don't have special knowledge of how Windows does it (but what else is there apart from counting files and bytes?), I hope I could illustrate why it's flawed and why it will never be perfect.

The best you could do would be to rely not solely on the file count OR the byte count, but to build an average out of the two.
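A minimal sketch of that blend (the 50/50 weighting is an arbitrary assumption on my part):

```python
def blended_progress(files_done, files_total, bytes_done, bytes_total,
                     weight=0.5):
    """Average of file-count progress and byte-count progress;
    weight is how much the file count contributes (0.0 to 1.0)."""
    by_files = files_done / files_total
    by_bytes = bytes_done / bytes_total
    return weight * by_files + (1 - weight) * by_bytes

# the 99-small-files scenario again: the blend lands at ~49.5%
# instead of a misleading 99%
print(f"{blended_progress(99, 100, 396_000, 4_000_396_000):.1%}")
```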

Or, if you wanted to go extra crazy, the OS could keep a database of how long these operations took in the past on your machine and factor that into the equation.
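As a toy sketch of that idea (the file name, keys, and smoothing factor are all made up; as far as I know no OS does it exactly this way):

```python
import json
import os

HISTORY_FILE = "throughput_history.json"  # hypothetical per-machine store

def load_history():
    if os.path.exists(HISTORY_FILE):
        with open(HISTORY_FILE) as f:
            return json.load(f)
    return {}

def record_speed(operation, observed_speed, alpha=0.3):
    """Fold a newly observed speed (bytes/s) into an exponential
    moving average kept per operation type ("copy", "delete", ...)."""
    history = load_history()
    previous = history.get(operation)
    history[operation] = (observed_speed if previous is None
                          else alpha * observed_speed + (1 - alpha) * previous)
    with open(HISTORY_FILE, "w") as f:
        json.dump(history, f)

def expected_speed(operation):
    """Seed the very first estimate from past runs instead of waiting."""
    return load_history().get(operation)
```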

Final thought: if someone designed a filesystem that let the OS know the size of every folder without having to calculate it first, you would at least get a correct progress estimate when deleting whole folders, not just parts of them.