On Wednesday 11 March 2015 06:28:23 Nico Haslberger wrote:
> Hello guys,
> is there something like a best-practices list available for using
> compression in filesets?
> I want to optimize my filesets with compression excludes, because a full
> backup of my Owncloud server (33.16 GB written) took about 6 hours.
> My Owncloud holds many binary files that don't need to be compressed,
> such as *.mp3, *.jpeg, *.gz, etc.
> Do you know a list that shows the efficiency of compression methods?
> I don't want to search my whole Owncloud for binary filetypes manually...
> 
> So in summary: 
> Some best practices would be nice to have, because I think I'm not alone with
> this difficulty. :)
> What excludes and compression algorithms do you use?
> 
> Thanks and greetings
> Nico
> 
> 
In my case I don't filter by file type at all (granted, most of my -fd are
bare-metal backups), compressed files or not.
lzo and lz4hc give good speed for roughly the same overall compression ratio,
without eating all the CPU power on the -fd.

A question, though: if you don't have many compressible files, why bother with
compression at all?
If most of your files are already in a compressed format, you're just spending
time for almost no gain...
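If you do want per-file-type excludes, Bareos FileSets let you stack several
Options blocks: a file takes the first Options block whose Wild/Regex
directives match it. A rough sketch (the path, fileset name, and patterns are
examples — adjust them to your own layout):

```
FileSet {
  Name = "OwncloudFS"
  Include {
    # Already-compressed formats: matched here first, so no Compression applied
    Options {
      Signature = MD5
      WildFile = "*.mp3"
      WildFile = "*.jpg"
      WildFile = "*.jpeg"
      WildFile = "*.gz"
      WildFile = "*.zip"
    }
    # Everything else falls through to this block and gets compressed
    Options {
      Signature = MD5
      Compression = LZO
    }
    File = /var/www/owncloud/data
  }
}
```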

What overall compression ratio does the job report?
Compare job speed with compression on and off, and decide whether disk space
or speed is more important to you.
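One way to estimate the gain before running full test jobs is to compress a
sample of your data and look at the ratio. A minimal Python sketch — gzip here
just stands in for whichever algorithm the job actually uses, and the sample
data is synthetic:

```python
import gzip
import os

def compression_ratio(data: bytes, level: int = 2) -> float:
    """Compressed size divided by original size (lower means more gain)."""
    return len(gzip.compress(data, compresslevel=level)) / len(data)

# Repetitive text compresses very well...
text = b"the quick brown fox " * 500
print(f"text ratio:   {compression_ratio(text):.2f}")

# ...while already-compressed (random-looking) data can even grow slightly.
noise = os.urandom(100_000)
print(f"random ratio: {compression_ratio(noise):.2f}")
```

Run this over a few representative files per extension and you quickly see
which types are worth compressing and which are pure CPU waste.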

GZIP at a low level (gzip2 in the FileSet options) is also quick and efficient.

-- 

Bruno Friedmann 
Ioda-Net Sàrl www.ioda-net.ch
 
 openSUSE Member & Board, fsfe fellowship
 GPG KEY : D5C9B751C4653227
 irc: tigerfoot
