Compressing many similar large images?
Solution 1:
I don't know of specific software that does this, but there is some research on the subject. See, for example, Compressing Sets of Similar Images by Samy Ait-Aoudia, Abdelhalim Gabis, and Amina Naimi, and Compressing sets of similar images using hybrid compression model by Jiann-Der Lee, Shu-Yen Wan, Cherng-Min Ma, and Rui-Feng Wu.
On a more practical level, you could extend your subtraction technique, for example by writing a script that uses ImageMagick to compute the difference between consecutive images, saving each result as a JPEG (or as a compressed PNG if you want it lossless). You'll end up with one base image and a set of compressed "delta" images that should be much smaller. To compute the difference with ImageMagick:
convert image2.png image1.png -compose MinusSrc -composite -depth 24 -define png:compression-filter=2 -define png:compression-level=9 -define png:compression-strategy=1 difference-2-1.png
To reconstruct the second image by adding the difference back:
convert image1.png difference-2-1.png -compose Plus -composite image2-reconstructed.png
(You can do the same with JPEG instead and save a lot of space, though the reconstruction will no longer be exact.)
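If you have many images, a small shell script can automate this. The sketch below assumes the files are named image1.png, image2.png, and so on; the file names and the base.png copy are just illustrative:

#!/bin/sh
# Keep the first image as-is; every later image is stored only as its
# difference from the previous one.
cp image1.png base.png
i=2
while [ -f "image$i.png" ]; do
    prev=$((i - 1))
    convert "image$i.png" "image$prev.png" -compose MinusSrc -composite \
        -depth 24 -define png:compression-level=9 "difference-$i-$prev.png"
    i=$((i + 1))
done

Reconstruction works the same way in reverse: start from base.png and repeatedly apply the Plus step above to recover image 2, then image 3, and so on.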