Why do 7-zip / WinRAR unzip files to "temp" before moving them to their destination?

How exactly are you extracting the files? Are you using the command-line or the GUI? Are you dragging the files or selecting them and using the extract function? Are you using the shell-extension context-menu?

If you enter a destination folder and then select the extract function, or if you use the shell-extension context menu, then the program does not extract to a temporary folder first; it extracts directly to the destination.

If you select the files in the UI and drag them to the target folder, then it will extract to a temporary folder.

The reason lies in how the destination is selected. If you enter the target folder or use the context-menu item, then the program knows exactly where it needs to extract to. However, if you merely drag the files, then due to how OLE drag-and-drop works, the program does not know where the target folder is. In other words, it is Explorer, not the archiving program, that receives the target folder. As a result, the program cannot know where to extract the files, so it simply extracts them to the temp folder, and Explorer moves them once extraction is done.

You can see this clearly by extracting a large file using both methods. When you drag it out to a folder, it extracts, and then you see Explorer's standard file-operation dialog moving it to the folder. If you specify the folder and click Extract, it extracts and no further processing is done.
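7-Zip itself implements this in C++ via OLE data objects, but the two-step flow is easy to illustrate. Purely as a sketch (the function name and the use of Python's `zipfile` are my own, not 7-Zip's code), the drag-and-drop path looks like this: the archiver extracts into a temporary folder, and only afterwards does the drop target move the results into the real destination.

```python
import shutil
import tempfile
import zipfile
from pathlib import Path

def extract_via_temp(archive_path, destination):
    """Mimic the drag-and-drop flow: extract to a temp folder first,
    then move the results into the destination (Explorer's job)."""
    destination = Path(destination)
    destination.mkdir(parents=True, exist_ok=True)
    with tempfile.TemporaryDirectory() as tmp:
        # Step 1: the archiver extracts into the temp folder,
        # because it does not yet know the real target folder.
        with zipfile.ZipFile(archive_path) as zf:
            zf.extractall(tmp)
        # Step 2: the drop target (in Windows, Explorer) moves the
        # files from temp into the folder the user dropped them on.
        for item in Path(tmp).iterdir():
            shutil.move(str(item), str(destination / item.name))
```

With the Extract button, step 1 and step 2 collapse into a single direct extraction, which is why no second file-operation dialog appears.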

Feel free to peruse the source-code for 7-Zip to see how extraction location is handled.


I learned this the hard way several years ago when I wanted to implement drag-and-drop in a program I was writing.


It is done so that the memory requirements for decompression are kept to a minimum.

If they didn't use the filesystem, decompression would have to happen in memory. Under low-memory conditions, or for large compressed files, this would sooner or later exhaust available memory and trigger memory paging.

Paging under these circumstances would be a lot slower than just using the filesystem, not only because the file is still being decompressed (so pages keep being added), but also because the data is checked for errors as it is decompressed, which means a lot of read/write operations: the worst possible workload for a page file.
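The idea of keeping memory use low by writing to the filesystem can be illustrated with a streaming decompressor: output goes straight to disk in small chunks, so memory stays roughly constant no matter how large the file is. A minimal sketch (the function name is mine; Python's `zlib` stands in for the archiver's codec):

```python
import zlib

def decompress_to_file(src_path, dst_path, chunk_size=64 * 1024):
    """Stream-decompress zlib data to disk so memory use stays
    bounded by roughly chunk_size, regardless of the file's size."""
    d = zlib.decompressobj()
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while chunk := src.read(chunk_size):
            dst.write(d.decompress(chunk))
        dst.write(d.flush())  # emit any buffered tail of the stream
```

At no point does the whole decompressed file live in memory; only one chunk is in flight at a time.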

EDIT: Regarding the use of a temporary directory, this follows common operating-system guidelines. If decompression fails, there is no guarantee that the program performing the operation cleans up after itself; it may have crashed, for instance. This way, no residual files remain in your target directory, and the operating system will dispose of the temporary files when it sees fit.
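That guideline amounts to a simple pattern: write into the OS temp area first, and move the result to the target only on success, so a failure never leaves a half-written file in the destination. A hedged sketch of the pattern (the helper name and callback are hypothetical, not any archiver's API):

```python
import os
import shutil
import tempfile

def safe_produce(destination, produce):
    """Write output to a temp file first; move it to the destination
    only on success. On failure the partial file never reaches the
    target directory."""
    fd, tmp_path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, "wb") as tmp:
            produce(tmp)        # may raise, e.g. on a corrupt archive
    except Exception:
        os.unlink(tmp_path)     # best-effort cleanup; the OS temp
        raise                   # folder catches anything we miss
    shutil.move(tmp_path, destination)
```

If `produce` crashes the whole process, the orphaned file sits in the temp folder, which the operating system (or the next disk cleanup) is expected to clear out.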