This is a quick follow-up to my Beyond ZIP Part II – Data DeDuplication Archives for all Windows versions via ZPAQ post, showing some new benchmarking info.
Background: ZPAQ is an open-source, 7-Zip-style archiving utility that supports file-level data deduplication archiving on all current Windows versions, which makes it extremely useful when you need to transfer lots of virtual machines over WAN / Internet links. ZPAQ supports six compression levels, 0 through 5, and I wanted to see how they compare in terms of time to compress/extract and the resulting file size.
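If you haven't used ZPAQ before, creating and extracting an archive is a single command each. Here is a minimal sketch (run from a PowerShell prompt), assuming the x64 binary is named zpaq64.exe and using placeholder archive names and paths:

# Create (or update) a deduplicated archive at compression level 1
zpaq64.exe add E:\Lab.zpaq C:\VMs -method 1

# Extract the archive contents to another folder
zpaq64.exe extract E:\Lab.zpaq -to D:\Restore

ZPAQ uses an append-only journaling format, so running the same add command again only stores new or changed blocks.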
Note: You can also use native Windows Server Data Deduplication to transfer data, but that is only supported on Windows Server, and it is server version specific. If you're interested in that approach, I have a blog post about it here: Beyond Zip – How to store 183 GB of VMs in a 16 GB file using PowerShell.
The Benchmarking
In this example I was archiving a ConfigMgr lab setup for transfer to a training center in New York: three Windows Server 2016 VMs, one Windows 7 VM (for upgrade tests), and an ISO file with some student lab files. The original files were 143 GB in size. The archiving was done with the x64 version of ZPAQ, on a machine with an SSD, 64 GB RAM, and an i7-8700K CPU @ 3.70 GHz. The x64 version of ZPAQ automatically detects the number of processors and uses all of them, which means 12 threads (6 cores with Hyper-Threading) on my test machine.
The ZPAQ version tested was v7.05.
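For reference, each test was simply the add and extract commands shown above, run with the -method value noted for that test. If you want to time your own runs, a minimal PowerShell sketch could look like this (placeholder paths, not my exact script):

# Time the archiving step (level 2 shown; adjust -method 0..5 per test)
Measure-Command { zpaq64.exe add E:\Lab.zpaq C:\Lab -method 2 }

# Time the extraction step
Measure-Command { zpaq64.exe extract E:\Lab.zpaq -to D:\Restore }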
Test #1 – ZPAQ, 143 GB files, compression level 0 (-method 0)
- Archiving time: 18 min, 11 sec
- Resulting file size: 70.8 GB
- Extracting time: 24 min, 30 sec
Test #2 – ZPAQ, 143 GB files, compression level 1 (-method 1)
- Archiving time: 17 min, 49 sec
- Resulting file size: 36.5 GB
- Extracting time: 22 min, 25 sec
Test #3 – ZPAQ, 143 GB files, compression level 2 (-method 2)
- Archiving time: 41 min, 43 sec
- Resulting file size: 35.1 GB
- Extracting time: 22 min, 58 sec
Test #4 – ZPAQ, 143 GB files, compression level 3 (-method 3)
- Archiving time: 51 min, 11 sec
- Resulting file size: 33.2 GB
- Extracting time: 30 min, 15 sec
Test #5 – ZPAQ, 143 GB files, compression level 4 (-method 4)
- Archiving time: 1 hour, 48 min, 9 sec
- Resulting file size: 31.7 GB
- Extracting time: 1 hour, 47 min, 45 sec
Test #6 – ZPAQ, 143 GB files, compression level 5 (-method 5)
- Archiving time: 8 hours, 7 min, 11 sec
- Resulting file size: 30.6 GB
- Extracting time: 8 hours, 12 min, 12 sec
ZPAQ Conclusion
From a performance / size point of view, compression level 1 is a good choice. But as you see above, you can indeed shave a few GB off the final archive size if you don't mind waiting an hour or two more. Using the highest compression level, -method 5, is just not worth the time IMHO.
Obviously the compression rate depends on the content; thanks to the deduplication, a VM archive of 25 Windows 10 clients is only going to be roughly the size of a single compressed Windows 10 client.
Quick 7-Zip comparison
Just for fun, I also did a quick 7-Zip comparison by compressing the same files with 7-Zip v18.05 using the Normal and Ultra compression settings.
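If you prefer the command line over the GUI, the Normal and Ultra presets correspond to the -mx5 and -mx9 switches of 7z.exe. A minimal sketch with placeholder archive names and paths (not my exact commands):

# Normal compression (-mx5)
7z.exe a -mx5 E:\Lab_normal.7z C:\Lab

# Ultra compression (-mx9)
7z.exe a -mx9 E:\Lab_ultra.7z C:\Lab

# Extract to another folder
7z.exe x E:\Lab_normal.7z -oD:\Restore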
Test #1 – 7-Zip, 143 GB files, Normal compression level
- Archiving time: 1 hour, 10 min, 2 sec
- Resulting file size: 69.3 GB
- Extracting time: 8 min, 6 sec
Test #2 – 7-Zip, 143 GB files, Ultra compression level
- Archiving time: 1 hour, 44 min, 19 sec
- Resulting file size: 67.6 GB
- Extracting time: 8 min, 21 sec

Happy Deployment, / Johan