On Wednesday 27 March 2002 04:07 pm, John R. Jackson wrote:
>>v) Gene said: "Any partition that's that much gzipped already,
>> should have the compression turned off, doing a straight tar
>> of it."
>>Again, I'm a newbie, and my decision not to use tar was not
>> based on my own experience but on what I've read elsewhere.
>
>Not trying to speak for Gene, but I don't think he was talking
> about tar vs. dump.  I think he was pointing out that if your
> data is already compressed, sending it to a tape drive that was
> set up to do hardware compression is not going to work well.
> Compressing compressed data a second time usually ends up
> expanding it.

That was what I was trying to say.  Also, generally speaking, the 
level of compression done by the hardware chips averages about 
2 to 1 under /ideal/ conditions, as the hardware is usually some 
variation of the (2,7) RLL method, which is very easy to do at 
high speed in hardware.  Some of my more sparsely filled 
partitions can be set for 'compress server best', with the output 
of a full level 0 being reduced to less than 15% of the 
pre-compression size.  My downloads directory, full of tar.gz and 
.bz2 files, will expand 180+% using the same compression setting.

Use amanda's emailed report to see if the compression should be 
turned off for that entry in the disklist.  Unfortunately, when 
the drive is doing the compression, you have no handy compression 
ratio feedback mechanism and must rely on your own experience.  
And with the drive doing the compressing, amanda cannot develop 
any info about how big the tape might be in terms of uncompressed 
input data; it simply measures what it puts into the pipe.  With 
amanda doing the compression, she measures the tape capacity in 
terms of the already compressed data, because it's that data that 
actually goes onto the tape, not the much larger input data.  So 
a 20 gig tape will usually hit 18 to 19 at least, but that's in 
terms of already compressed data.  OTOH, using a different 
program that did its own compression, I've watched in disbelief 
as 11 gigs worth of text-type data actually fit on a DDS-2 120 
meter tape, normally rated for 4 gigs.
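The arithmetic behind that, as a back-of-the-envelope sketch (the 
numbers are just the ones from this thread, not anything amanda 
computes for you):

```python
def input_capacity(tape_gb, ratio):
    """When the software does the compression, the tape fills with
    already-compressed data; the uncompressed input it represents
    is the tape size times the compression ratio achieved."""
    return tape_gb * ratio

# A "20 gig" tape filled at 2:1 represents ~40 gigs of input data.
print(input_capacity(20, 2.0))

# 11 gigs of text on a tape rated 4 gigs implies roughly 2.75:1.
print(input_capacity(4, 2.75))
```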

>I use hardware compression for the reasons you mentioned -- it
> would kill my systems to chew up that much CPU.  But if you
> have a wide variety of data (some file systems with lots of
> text that compresses really well, and others with already
> compressed data), then using software compression with Amanda
> has the major advantage that you can pick to compress or not on
> a file system by file system basis.
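And that per-file-system choice is just a dumptype in amanda.conf 
plus a disklist entry.  A sketch along those lines -- the dumptype 
and host names here are made up, so adapt them to your own config:

```
# amanda.conf: two dumptypes, identical except for compression
define dumptype comp-user-tar {
    program "GNUTAR"
    compress client fast
}
define dumptype nocomp-user-tar {
    comp-user-tar
    compress none
}

# disklist: pick the dumptype per file system
client.example.com  /home       comp-user-tar
client.example.com  /downloads  nocomp-user-tar
```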
>
>>I'd like to know your points here. I have avoided tar, because
>> earlier on when I did backups manually (i.e. without amanda) I
>> had a few problems with tar (FreeBSD tar, v1.11.2) ...
>
>Tar vs dump is a periodic war on this list.  Rather than start
> that up yet again :-), I suggest you do one of two things:
>
>  * Use what you're comfortable with regardless of what anybody
> else says :-)
>
>  * Read through the archives for some of the pros and cons of
> both.

But separate the old-version squawks from the current-version 
squawks.  Somebody's two-year-old lament about tar 1.11-xx has no 
bearing on what tar-1.13-25 does today.  Put another way, 
consider what you read only if it applies to the version you have 
on hand.  And 98% of the time, newer is generally better.  The 
other 2% will eat your lunch and leave you to pay the bill anyway.

-- 
Cheers, Gene
AMD K6-III@500mhz 320M
Athlon1600XP@1400mhz  512M
98.7+% setiathome rank, not too shabby for a hillbilly
