There's been a little back and forth here, but basically my concern was whether using software compression would slow restores. I just did a little test and it seems the opposite is in fact the case, at least for my one test - not dreadfully scientific, I know.
I created a tar file of my home directory on my home machine, which came to 2.1G. I used gzip to compress it and got a 1.3G file. Then I put each file onto a DDS-2 tape (because that's the tape drive I have at home) using dd, just as amanda does, and extracted one file from each tape: with dd | tar x in the uncompressed case, and dd | gunzip | tar x in the compressed case.

The restore took 59m22s elapsed time for the uncompressed file and 44m48s elapsed time for the compressed file. It would appear that there's some overhead for gunzipping (because 59m22s x (1.3/2.1) is somewhat less than 44m48s), but in fact I don't believe there is. I also checked how long it took to dd the compressed file to the tape, and it was as near as dammit the same as the extract time. I put the difference down to the uncompressed tar file having been written to the tape with hardware compression on (I didn't force it off, and I didn't check - I told you this wasn't a scientific test :-) ).

Anyway, the bottom line is that it has convinced me, and I'm going to switch to using client compression and turn hardware compression off on the tape drive.

Regards,

Niall O Broin
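For anyone who wants to repeat the comparison, the steps above can be sketched roughly as below. This uses two plain files standing in for the two tapes so it runs anywhere; on a real drive you'd substitute the tape device (e.g. /dev/nst0) for the .img files. The sample directory, file names, and 32k block size are all my assumptions, not anything from the original test - adjust to taste.

```shell
#!/bin/sh
# Sketch of the uncompressed-vs-gzipped restore timing test.
# Plain files stand in for the two tapes; swap in your tape
# device for tape1.img / tape2.img to run it for real.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the home directory being backed up.
mkdir -p home
echo "sample data" > home/file.txt

# 1. Build the archive, then a gzipped copy of it.
tar cf home.tar home
gzip -c home.tar > home.tar.gz

# 2. Write each image to its "tape" with dd, as amanda does,
#    timing the compressed write for comparison with the read.
time dd if=home.tar    of=tape1.img bs=32k
time dd if=home.tar.gz of=tape2.img bs=32k

# 3. Restore a single file from each image, timing both pipelines.
rm -rf home
time dd if=tape1.img bs=32k | tar xf - home/file.txt
rm -rf home
time dd if=tape2.img bs=32k | gunzip | tar xf - home/file.txt
```

With a multi-gigabyte archive on real tape the `time` figures are what matter; on the tiny sample data here the run just demonstrates that both pipelines restore the same file.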
