Most resilient container for archiving many small files
Option 1: ISO
ISOs work across different operating systems, new files are easy to append, and they have a clear advantage over archive files when it comes to accessing the contents.
Unlike archive files (e.g. TAR), which you may have to unpack in their entirety before you can access all of their contents (this can take a significant amount of time with that many files), an ISO doesn't need to be unpacked at all. Just mount it in the filesystem and read from it directly; mounting takes less than a second and all the data is immediately accessible.
You can create ISOs with a GUI tool like Folder2Iso, or with mkisofs directly on the command line.
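As a minimal sketch with mkisofs (the archive name, source path, and mount point here are placeholders):

# Build the ISO with Rock Ridge (-R) and Joliet (-J) extensions so filenames survive across OSes
mkisofs -o archive.iso -R -J /path/to/files
# Mount it read-only and browse the contents in place
sudo mkdir -p /mnt/iso
sudo mount -o loop,ro archive.iso /mnt/iso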
(Credits to u/ImJacksLackOfBeetus)
Option 2: TAR with the simplest settings
TAR was designed for exactly this use case. Just use the simplest settings (plain format with per-file metadata, no compression, etc.) so each file keeps its own header and any corruption stays localized to the files it touches. Appending files to the archive is easy as well.
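A sketch of creating and extending such an archive (archive name and paths are placeholders):

# Create a plain, uncompressed archive; the ustar format keeps a simple header per file
tar --format=ustar -cvf archive.tar /path/to/files
# Append more files later; -r (--append) only works on uncompressed archives
tar -rvf archive.tar /path/to/new-files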
If some bytes get corrupted, you can use pax to extract all the files whose metadata is still intact:
pax -r -v -E 3 -f broken.tar > broken.log 2>&1
with -E being the number of times you want to retry when there's a read error (probably fine to retry just once). You can then check the log for broken headers, which show up as:
pax: Invalid header, starting valid header search.
From there you can try to recover that specific file manually. Unfortunately pax doesn't tell you exactly where in the archive the error is, but you can narrow it down from the files that were extracted before and after it. You'll still need to check the extracted files for corruption yourself, though. (Credits to u/askingforeafriend)
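To find those reports quickly, a simple grep over the log works (broken.log matches the redirect used in the pax command above):

# Print each corruption report with its line number in the extraction log
grep -n 'Invalid header' broken.log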
Credits and more info can be found in this Reddit thread.