How resilient is the .7z file format?

Any compression tool is going to be subject to corruption issues on very large files. Your best bet is probably to use smaller files as separate archives, NOT as volumes of one larger archive.

AFAIK 7-Zip will lose the whole archive if any portion of it is damaged.


If you want redundancy in your compression, I would suggest using rar/par/par2 files. This has long been the standard way to add redundancy to files sent over newsgroups and through many other channels. You split your data into many rar volumes, generate par2 recovery files for them, and then you can lose entire rar volumes and still recover your data. For data that doesn't compress well this can actually increase the total size, but that is the price you pay for redundancy.
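Here is a minimal sketch of that workflow, assuming the `par2` command-line tool (par2cmdline) is installed and on your PATH; the volume names and the 10% redundancy level are arbitrary examples, not anything prescribed by the tools:

```python
# Minimal sketch: wrap par2cmdline to add recovery data to a set of archive
# volumes and repair them later. The file names and redundancy level are
# example placeholders.
import subprocess

volumes = ["backup.part1.rar", "backup.part2.rar", "backup.part3.rar"]

# Create PAR2 recovery files covering all volumes with ~10% redundancy.
subprocess.run(["par2", "create", "-r10", "backup.par2", *volumes], check=True)

# Later: verify the volumes; par2 exits non-zero if repair is needed.
result = subprocess.run(["par2", "verify", "backup.par2"])
if result.returncode != 0:
    # Attempt repair using the recovery data (works even if a volume is missing).
    subprocess.run(["par2", "repair", "backup.par2"], check=True)
```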


7-Zip will lose the whole archive even if there is only minor corruption. This is because 7-Zip uses solid compression by default, which means all files are agglomerated together into one compressed stream. However, the 7-Zip authors offer a tutorial on how to manually try to fix a damaged 7z archive here.

If you want to be able to recover the non-corrupted files from a corrupted archive, you have to make a non-solid archive, such as a zip with DEFLATE, where each file is compressed independently. I tried several formats, including ARC, which allows non-solid archives, but it was less resilient than zip. There is also the PEA format (from PeaZip), which allows partial extraction, and RAR (from WinRAR), which specifically has a "Keep broken files" option to allow partial extraction.
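As a small illustration of the non-solid approach, the sketch below builds an ordinary zip with DEFLATE using Python's standard `zipfile` module; each member gets its own compressed stream, so corruption in one member doesn't prevent extracting the others. The file names are placeholders.

```python
# Minimal sketch: build a non-solid archive (zip with DEFLATE) in which every
# file is compressed as an independent member. Paths are example placeholders.
import zipfile

files_to_archive = ["report.docx", "photo.jpg", "data.csv"]

with zipfile.ZipFile("backup.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    for path in files_to_archive:
        zf.write(path)  # each member gets its own compressed stream

# Intact members can still be read individually even if another member's
# data is later damaged.
with zipfile.ZipFile("backup.zip") as zf:
    print(zf.namelist())
```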

You can test various compression formats yourself and see whether you can still decompress your data after deliberately corrupting it with a simple data-tampering Python script.
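For example, a minimal tampering script might look like the one below: it copies an archive and flips a few random bytes, after which you try to extract the copy with the archiver under test. The archive name and the number of corrupted bytes are arbitrary choices for illustration.

```python
# Minimal sketch of a data-tampering test: copy an archive, flip a few random
# bytes in the copy, then try to extract it with your archiver of choice.
import random
import shutil

src = "backup.zip"
dst = "backup_corrupted.zip"
shutil.copyfile(src, dst)

with open(dst, "r+b") as f:
    f.seek(0, 2)               # seek to the end to get the file size
    size = f.tell()
    for _ in range(5):         # corrupt 5 random bytes
        offset = random.randrange(size)
        f.seek(offset)
        original = f.read(1)
        f.seek(offset)
        f.write(bytes([original[0] ^ 0xFF]))  # flip every bit in that byte
```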


Depending on your needs, it might be better to introduce the redundancy at another level. What I mean is: rather than trying to alleviate partial damage, keep another complete copy of the files. Then regularly check the checksums of these files, and whenever a problem arises, replace the defective hardware and copy from an intact backup again.
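A minimal sketch of that checksum routine, assuming a backup directory named `backup` and a manifest file named `manifest.json` (both placeholders), could look like this:

```python
# Minimal sketch: build a SHA-256 manifest for a directory on the first run,
# then re-run later to detect files that were silently corrupted or lost.
import hashlib
import json
import pathlib

root = pathlib.Path("backup")
manifest_path = pathlib.Path("manifest.json")

def sha256(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

current = {str(p): sha256(p) for p in root.rglob("*") if p.is_file()}

if manifest_path.exists():
    stored = json.loads(manifest_path.read_text())
    changed = [p for p, digest in stored.items() if current.get(p) != digest]
    print("Files that changed or disappeared:", changed)
else:
    manifest_path.write_text(json.dumps(current, indent=2))
    print("Manifest written; re-run later to verify.")
```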