Re: [BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread John Goerzen
Nils Breunese (Lemonbit) wrote:
>> Right, I'm aware of that.  But that's a specialized tool.  It requires
>> CPAN libraries, libraries from BackupPC's perl library, etc.  20 bytes
>> or so would get you something that gzip could uncompress.
> 
> Are you sure? I thought it also works on its own. Haven't tried it  
> though.

Yes, quite.  I did test it.

> 
> Well, maybe because it was never implemented and you're the first  
> person to ask about it. :o)

Fair enough.  My Perl skills are extremely rusty though.

-- John

-------------------------------------------------------------------------
This SF.Net email is sponsored by the Moblin Your Move Developer's challenge
Build the coolest Linux based applications with Moblin SDK & win great prizes
Grand prize is a trip for two to an Open Source event anywhere in the world
http://moblin-contest.org/redirect.php?banner_id=100&url=/
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/


Re: [BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread John Goerzen
Chris Robertson wrote:
> John Goerzen wrote:
>> extract it without using specialized tools.  It also makes me nervous
>> because it isn't a completely off-the-shelf implementation, and
>> doesn't appear to store a CRC in the file; is there integrity checking
>> anywhere?
>>   
> 
> http://backuppc.sourceforge.net/faq/BackupPC.html#compressed_file_format
> 
> and
> 
> http://backuppc.sourceforge.net/faq/BackupPC.html#backuppc_operation

Hi Chris,

Thanks for the pointers.  I had actually already seen those, but I'm
still concerned about the lack of a CRC.  gzip doesn't have the problem
the document describes, and even if it did, there are Perl modules that
can put a proper gzip header on the file.

>> Third, I'm wondering how well BackupPC deals with sparse files.  I
>> notice that the examples for tar are not giving -S to detect sparse
>> files.  Does BackupPC store sparse files efficiently, even if not
>> using compression?  On restoration, what will it do with sparse files
>> -- will it re-create the holes?
>>   
> 
> See 
> http://backuppc.sourceforge.net/faq/BackupPC.html#how_to_backup_a_client.  
> Specifically "$Conf{TarClientCmd}" or "$Conf{RsyncClientCmd}".  Adjust 
> the tar (or rsync) command to your preference.

That doesn't seem to be a full answer.

tar with -S detects sparse files and records the holes in the archive.
So if I back up a client with tar -S, it will send a compact
representation over to the BackupPC server.  BackupPC, though, extracts
the files out of the tar stream, compresses them, and writes them to
disk.  As far as I can tell, that discards the information about where
the sparse holes were.

This isn't a show-stopper, but it *is* vital information to know.
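To make the concern concrete, here is the sort of experiment I mean (a shell sketch; the file names are made up, and GNU tar and coreutils are assumed):

```shell
# Hypothetical demonstration on a GNU/Linux box.
truncate -s 10M sparse.img          # 10 MB file, no data blocks allocated

tar -cf  plain.tar  sparse.img      # without -S: stores all 10 MB of zeros
tar -cSf sparse.tar sparse.img      # with -S: records only the hole map

ls -l plain.tar sparse.tar          # sparse.tar should be a few KB

# Once BackupPC extracts the member to a regular file and compresses it
# itself, that hole map is gone; compression hides the cost in the pool,
# but a later restore writes the zeros out literally unless the
# restore-side tar rediscovers the sparseness.
```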



Re: [BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread Nils Breunese (Lemonbit)
John Goerzen wrote:

> On Wed, Nov 05, 2008 at 10:40:35PM +0100, Nils Breunese (Lemonbit)  
> wrote:
>> John Goerzen wrote:
>>
>>> First, the on-disk compression format makes me nervous.  It  
>>> appears to
>>> use the deflate algorithm, but cannot be unpacked with either gzip  
>>> or
>>> unzip.  It would seem that the few bytes that adding a gzip header
>>> means would be well worth it, since it would buy the ability to
>>> extract it without using specialized tools.
>>
>> You can use the BackupPC_zcat binary to decompress individual files
>> manually.
>
> Right, I'm aware of that.  But that's a specialized tool.  It requires
> CPAN libraries, libraries from BackupPC's perl library, etc.  20 bytes
> or so would get you something that gzip could uncompress.

Are you sure? I thought it also works on its own. Haven't tried it  
though.

> There's still the question of a CRC.

I don't know about that.

>>> Secondly, I would love to be able to use bzip2 for the on-disk
>>> compression of each backup.  It appears that bzip2 can be used for
>>> archives, but not the regular backups.  Is this in the works  
>>> anywhere?
>>
>> In my experience bzip2 is a much more resource intensive algorithm
>> than gzip. I wouldn't want to use that on my backups, they'd take
>> forever to compress.
>
> Sure, it's resource-intensive, but why not offer it?  I've got a Core
> 2 Duo, and in tests using dar (similar to tar) with per-file bzip2
> compression, it performs well enough and has a space savings in the
> neighborhood of 20% over gzip on my test set.
>
> It will certainly not be right for everyone, but why not the choice?

Well, maybe because it was never implemented and you're the first  
person to ask about it. :o)

Nils Breunese.



Re: [BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread Chris Robertson
John Goerzen wrote:
> Hi everyone,
>
> I installed BackupPC to try it out for backing up Linux systems, and I
> have a few questions about it.
>
> First, the on-disk compression format makes me nervous.  It appears to
> use the deflate algorithm, but cannot be unpacked with either gzip or
> unzip.  It would seem that the few bytes that adding a gzip header
> means would be well worth it, since it would buy the ability to
> extract it without using specialized tools.  It also makes me nervous
> because it isn't a completely off-the-shelf implementation, and
> doesn't appear to store a CRC in the file; is there integrity checking
> anywhere?
>   

http://backuppc.sourceforge.net/faq/BackupPC.html#compressed_file_format

and

http://backuppc.sourceforge.net/faq/BackupPC.html#backuppc_operation

> Secondly, I would love to be able to use bzip2 for the on-disk
> compression of each backup.  It appears that bzip2 can be used for
> archives, but not the regular backups.  Is this in the works anywhere?
>
> Third, I'm wondering how well BackupPC deals with sparse files.  I
> notice that the examples for tar are not giving -S to detect sparse
> files.  Does BackupPC store sparse files efficiently, even if not
> using compression?  On restoration, what will it do with sparse files
> -- will it re-create the holes?
>   

See 
http://backuppc.sourceforge.net/faq/BackupPC.html#how_to_backup_a_client.  
Specifically "$Conf{TarClientCmd}" or "$Conf{RsyncClientCmd}".  Adjust 
the tar (or rsync) command to your preference.

> Thanks,
>
> -- John

Chris



Re: [BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread John Goerzen
On Wed, Nov 05, 2008 at 10:40:35PM +0100, Nils Breunese (Lemonbit) wrote:
> John Goerzen wrote:
> 
> > First, the on-disk compression format makes me nervous.  It appears to
> > use the deflate algorithm, but cannot be unpacked with either gzip or
> > unzip.  It would seem that the few bytes that adding a gzip header
> > means would be well worth it, since it would buy the ability to
> > extract it without using specialized tools.
> 
> You can use the BackupPC_zcat binary to decompress individual files  
> manually.

Right, I'm aware of that.  But that's a specialized tool.  It requires
CPAN libraries, libraries from BackupPC's perl library, etc.  20 bytes
or so would get you something that gzip could uncompress.

There's still the question of a CRC.

> 
> > Secondly, I would love to be able to use bzip2 for the on-disk
> > compression of each backup.  It appears that bzip2 can be used for
> > archives, but not the regular backups.  Is this in the works anywhere?
> 
> In my experience bzip2 is a much more resource intensive algorithm  
> than gzip. I wouldn't want to use that on my backups, they'd take  
> forever to compress.

Sure, it's resource-intensive, but why not offer it?  I've got a Core
2 Duo, and in tests using dar (similar to tar) with per-file bzip2
compression, it performs well enough and has a space savings in the
neighborhood of 20% over gzip on my test set.

It will certainly not be right for everyone, but why not the choice?
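A quick, admittedly synthetic way to compare the two (the corpus below is made up, so the numbers say nothing about real backup data; my ~20% figure came from dar on my own files):

```python
import bz2
import gzip

# Purely illustrative input; real savings depend entirely on the data.
data = b"BackupPC compresses every file it stores in its pool. " * 4000

gz = len(gzip.compress(data, 9))
bz = len(bz2.compress(data, 9))
print(f"input: {len(data)} bytes  gzip -9: {gz} bytes  bzip2 -9: {bz} bytes")

# Either way the data round-trips; the only question is CPU versus disk.
assert gzip.decompress(gzip.compress(data, 9)) == data
assert bz2.decompress(bz2.compress(data, 9)) == data
```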

-- John



Re: [BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread Nils Breunese (Lemonbit)
John Goerzen wrote:

> First, the on-disk compression format makes me nervous.  It appears to
> use the deflate algorithm, but cannot be unpacked with either gzip or
> unzip.  It would seem that the few bytes that adding a gzip header
> means would be well worth it, since it would buy the ability to
> extract it without using specialized tools.

You can use the BackupPC_zcat binary to decompress individual files  
manually.

> Secondly, I would love to be able to use bzip2 for the on-disk
> compression of each backup.  It appears that bzip2 can be used for
> archives, but not the regular backups.  Is this in the works anywhere?

In my experience bzip2 is a much more resource intensive algorithm  
than gzip. I wouldn't want to use that on my backups, they'd take  
forever to compress.

Nils Breunese.



[BackupPC-users] Questions about compression in BackupPC

2008-11-05 Thread John Goerzen
Hi everyone,

I installed BackupPC to try it out for backing up Linux systems, and I
have a few questions about it.

First, the on-disk compression format makes me nervous.  It appears to
use the deflate algorithm, but cannot be unpacked with either gzip or
unzip.  It would seem that the few bytes that adding a gzip header
means would be well worth it, since it would buy the ability to
extract it without using specialized tools.  It also makes me nervous
because it isn't a completely off-the-shelf implementation, and
doesn't appear to store a CRC in the file; is there integrity checking
anywhere?

Secondly, I would love to be able to use bzip2 for the on-disk
compression of each backup.  It appears that bzip2 can be used for
archives, but not the regular backups.  Is this in the works anywhere?

Third, I'm wondering how well BackupPC deals with sparse files.  I
notice that the examples for tar are not giving -S to detect sparse
files.  Does BackupPC store sparse files efficiently, even if not
using compression?  On restoration, what will it do with sparse files
-- will it re-create the holes?

Thanks,

-- John

