Job: merge different versions of the same game
Goal: find the best method that requires the least user effort for compression and decompression (and, ideally, doesn't use too much system memory)
Test #1:
Game: [PSX] Final Fantasy IX (Disc 1) (8 versions)
Versions order used for merging: U v1.0, U v1.1, E, F, G, I, S, J
Results, brief summary:
- 5.5 GB - uncompressed
- 2.8 GB - PackIso
- 379 MB - ECM+Split:100mb+Rep:200mb+LZMA:32mb in ~10 minutes (~250 MB of RAM for decompression!)
- 375 MB - ECM+Rep:1gb+LZMA:128mb in ~25 minutes (~1 GB of RAM for decompression)
- 354 MB - ECM+Rep:1gb+NanoZip:1.5gb in ~40 minutes (~1.5 GB of RAM for decompression)
- 344 MB - ECM+Split:100mb+Rep:1gb+NanoZip:1gb in ~30 minutes (~1 GB of RAM for decompression) - a toy sketch of what the Rep filter does follows below
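A rough idea of what the Rep step above does (it's FreeArc's long-range repetition filter): it remembers the data inside its dictionary window and replaces later blocks that already occurred there with short back-references, which is what makes eight nearly identical images collapse. Here is a toy Python sketch of the principle only, not FreeArc's actual algorithm - rep_filter, CHUNK and the demo data are made up for illustration, and the real filter finds matches at arbitrary offsets rather than fixed chunks:

import os
import hashlib

CHUNK = 64 * 1024   # toy granularity; the real filter matches much finer

def rep_filter(data, dict_size):
    """Replace chunks that already occurred within the last dict_size
    bytes with (offset, length) back-references; emit a token stream."""
    seen = {}        # chunk hash -> offset of its last occurrence
    tokens = []
    pos = 0
    while pos < len(data):
        chunk = data[pos:pos + CHUNK]
        key = hashlib.sha1(chunk).digest()
        prev = seen.get(key)
        if prev is not None and pos - prev <= dict_size \
                and data[prev:prev + len(chunk)] == chunk:
            tokens.append(("match", prev, len(chunk)))   # cheap back-reference
        else:
            tokens.append(("literal", chunk))            # data LZMA still has to store
        seen[key] = pos
        pos += len(chunk)
    return tokens

# Two 1 MiB "versions" differing by 7 bytes, stored back to back:
# the second one only deduplicates if the dictionary window reaches
# back at least one whole version.
v1 = os.urandom(16 * CHUNK)
v2 = v1[:500_000] + b"PATCHED" + v1[500_007:]
merged = v1 + v2
for window in (256 * 1024, 2 * len(v1)):
    lit = sum(len(t[1]) for t in rep_filter(merged, window) if t[0] == "literal")
    print(f"window {window // 1024} KiB -> {lit} literal bytes left")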
One version of the game stored whole, ImageDiff patches for the others (the diff idea is illustrated right after this list):
- 358 MB - 7-Zip:192mb
- 344 MB - NanoZip:1.5gb
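For those ImageDiff rows, one image is kept as-is and every other version is stored only as the places where it differs from it. ImageDiff's real patch format is its own; this is just a toy Python illustration of the store-one-base-plus-differences idea (block_diff, apply_patch and the 2048-byte block size are arbitrary choices of mine):

def block_diff(base, other, block=2048):
    """Record only the blocks of `other` that differ from `base`."""
    patch = []
    for off in range(0, max(len(base), len(other)), block):
        if base[off:off + block] != other[off:off + block]:
            patch.append((off, other[off:off + block]))   # store the changed block
    return len(other), patch

def apply_patch(base, length, patch):
    """Rebuild `other` from the base image and the recorded blocks."""
    out = bytearray(base[:length].ljust(length, b"\x00"))
    for off, data in patch:
        out[off:off + len(data)] = data
    return bytes(out)

# Toy round-trip check:
base = b"A" * 10_000
other = base[:4_000] + b"translated text" + base[4_015:]
length, patch = block_diff(base, other)
assert apply_patch(base, length, patch) == other
print(f"{len(patch)} changed block(s) out of {(length + 2047) // 2048}")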
Some notes:
- times were measured on a dual-core 2.5 GHz CPU and are only about 90% accurate
- it will be possible to use ECM inside FreeArc, but we need to wait for the next, more stable version - that will reduce the whole process to a single command
Basic idea of splitting:
- split the data into parts, e.g. v1.bin.001, v1.bin.002, v2.bin.001, v2.bin.002
- add all parts to the archive sorted by extension and name: v1.bin.001, v2.bin.001, v1.bin.002, v2.bin.002
- apply the repetition filter with a dictionary at least twice as large as the part size (so with 100 MB parts you need at least a 200 MB dictionary for it to work well) - a sketch of the split-and-order step follows this list
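A minimal Python sketch of that split-and-order step, assuming plain .bin images on disk (split_image and the file names are illustrative; in the test above an external splitter was used and the parts were then fed to the archiver in this order):

PART_SIZE = 100 * 1024 * 1024          # 100 MB parts, as in the test above

def split_image(path, part_size=PART_SIZE):
    """Split one .bin image into numbered parts: name.bin.001, .002, ..."""
    parts = []
    with open(path, "rb") as src:
        index = 1
        while True:
            chunk = src.read(part_size)
            if not chunk:
                break
            part_name = f"{path}.{index:03d}"
            with open(part_name, "wb") as dst:
                dst.write(chunk)
            parts.append(part_name)
            index += 1
    return parts

# Split every version, then order the parts by extension (.001, .002, ...)
# and name, so matching 100 MB slices of different versions sit next to
# each other and fall inside the repetition filter's 200 MB dictionary.
images = ["v1.bin", "v2.bin"]          # illustrative file names
all_parts = [p for img in images for p in split_image(img)]
archive_order = sorted(all_parts, key=lambda p: (p.rsplit(".", 1)[-1], p))
print(archive_order)   # v1.bin.001, v2.bin.001, v1.bin.002, v2.bin.002, ...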
Conclusions:
- there is no need to store versions as ImageDiff patches when merging with the repetition filter is so much more convenient
- with splitting you can achieve amazing results, at the cost of some convenience...
- buy more RAM - there's never too much when it comes to compression