Re: [Bacula-users] encryption compression

2012-08-30 Thread lst_hoe02

Quoting Phil Stracchino ala...@metrocast.net:

 On 08/22/12 11:18, lst_ho...@kwsoft.de wrote:
 Quoting lst_ho...@kwsoft.de:
 according to the manual, client-based software compression is not
 useful when using tape drives with built-in compression like LTO. Is
 this still true when using data encryption? Encrypted data is normally
 not really compressible anymore, but compression on the client before
 encryption could be useful, no?

 Any opinion on this?

 Can anyone confirm I'm right that compression happens *before* encryption?

 It's certainly *supposed* to.  Encryption works better when redundancy
 is eliminated from the cleartext before encryption, plus there's a
 smaller volume of data to encrypt.

On inspecting the problem a bit closer, I found that OpenSSL encryption
itself can do compression, for the sake of better encryption, and it
even does so by default. Can anyone confirm that Bacula's encryption
lets OpenSSL do compression as well?

Regards

Andreas



--
Live Security Virtual Conference
Exclusive live event will cover all the ways today's security and 
threat landscape has changed and how IT managers can respond. Discussions 
will include endpoint security, mobile security and the latest in malware 
threats. http://www.accelacomm.com/jaw/sfrnl04242012/114/50122263/
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] encryption compression

2012-08-22 Thread lst_hoe02

Quoting lst_ho...@kwsoft.de:

 Hello

 according to the manual, client-based software compression is not
 useful when using tape drives with built-in compression like LTO. Is
 this still true when using data encryption? Encrypted data is normally
 not really compressible anymore, but compression on the client before
 encryption could be useful, no?

 Any opinion on this?

Can anyone confirm I'm right that compression happens *before* encryption?

Regards

Andreas





Re: [Bacula-users] encryption compression

2012-08-22 Thread Phil Stracchino
On 08/22/12 11:18, lst_ho...@kwsoft.de wrote:
 Quoting lst_ho...@kwsoft.de:
 according to the manual, client-based software compression is not
 useful when using tape drives with built-in compression like LTO. Is
 this still true when using data encryption? Encrypted data is normally
 not really compressible anymore, but compression on the client before
 encryption could be useful, no?

 Any opinion on this?
 
 Can anyone confirm I'm right that compression happens *before* encryption?

It's certainly *supposed* to.  Encryption works better when redundancy
is eliminated from the cleartext before encryption, plus there's a
smaller volume of data to encrypt.
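[Editorial note] Phil's point can be demonstrated in a few lines. This is an illustrative Python sketch, not Bacula code; os.urandom stands in for ciphertext, since the output of a good cipher is statistically indistinguishable from random bytes:

```python
import os
import zlib

payload = b"typical backup data with lots of redundancy " * 500

# Compressing the redundant cleartext first works well...
compressed = zlib.compress(payload)

# ...but "ciphertext" (high-entropy random bytes of the same size)
# does not compress at all, which is why encrypt-then-compress loses.
ciphertext_like = os.urandom(len(payload))
not_compressed = zlib.compress(ciphertext_like)

print(len(payload), len(compressed), len(not_compressed))
```

The same reasoning explains why tape-drive hardware compression gains nothing on encrypted streams: by the time the data reaches the drive, the redundancy is gone.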


-- 
  Phil Stracchino, CDK#2 DoD#299792458 ICBM: 43.5607, -71.355
  ala...@caerllewys.net   ala...@metrocast.net   p...@co.ordinate.org
  Renaissance Man, Unix ronin, Perl hacker, SQL wrangler, Free Stater
 It's not the years, it's the mileage.



[Bacula-users] encryption compression

2012-08-17 Thread lst_hoe02
Hello

according to the manual, client-based software compression is not
useful when using tape drives with built-in compression like LTO. Is
this still true when using data encryption? Encrypted data is normally
not really compressible anymore, but compression on the client before
encryption could be useful, no?

Any opinion on this?

Many Thanks

Andreas






Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-03 Thread Landon Fuller


On Nov 2, 2006, at 08:30, Robert Nelson wrote:


Landon,

I've changed the code so that the encryption code prefixes the data block
with a block length prior to encryption.

The decryption code accumulates data until a full data block is decrypted
before passing it along to the decompression code.

The code now works for all four scenarios with encryption and compression:
none, encryption, compression, and encryption + compression. Unfortunately
the code is no longer compatible for previously encrypted backups.

I could add some more code to make the encryption-only case work like
before. However, since this is a new feature in 1.39 and there shouldn't be
a lot of existing backups, I would prefer to invalidate the previous backups
and keep the code simpler.

Also I think we should have a design rule that says any data filters like
encryption, compression, etc. must maintain the original buffer boundaries.

This will allow us to define arbitrary, dynamically extensible filter stacks
in the future.

What do you think?


I was thinking about this on the way to work. My original assumption
was that Bacula used the zlib streaming API to maintain state during
file compression/decompression, but this is not the case. Reality is
something more like this:

Backup:
- Set up the zlib stream context
- For each file block (not each file), compress the block via
  deflate(stream, Z_FINISH); and reinitialize the stream.
- After all files (and blocks) are compressed, destroy the stream context

Restore:
- For each block, call uncompress(), which does not handle streaming.

This is unfortunate -- reinitializing the stream for each block
significantly degrades compression efficiency, as 1) block boundaries
are dynamic and may be set arbitrarily, 2) the LZ77 algorithm may cross
block boundaries, referring back up to 32k of previous input data
(http://www.gzip.org/zlib/rfc-deflate.html#overview), 3) the Huffman
coding context comprises the entire block, and 4) there's no need to
limit zlib block size to Bacula's block size.
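[Editorial note] The efficiency cost described above can be measured directly. In this sketch Python's zlib bindings stand in for the C API; the block size and sample data are made up for illustration. Per-block mode finishes a fresh deflate stream for every block (no cross-block back-references, as in the behavior Landon describes), while streaming mode keeps one context and uses Z_SYNC_FLUSH at block boundaries:

```python
import zlib

# Repetitive sample data split into fixed-size "Bacula blocks".
data = b"some fairly repetitive backup payload " * 500
blocks = [data[i:i + 1024] for i in range(0, len(data), 1024)]

# Per-block mode: a fresh deflate stream per block, ended with Z_FINISH,
# so no back-references can cross block boundaries.
per_block = sum(len(zlib.compress(b)) for b in blocks)

# Streaming mode: one deflate context across all blocks, flushed with
# Z_SYNC_FLUSH at each boundary so output is still emitted per block.
c = zlib.compressobj()
streamed = sum(len(c.compress(b) + c.flush(zlib.Z_SYNC_FLUSH)) for b in blocks)
streamed += len(c.flush())  # finish the single stream

print(per_block, streamed)  # streaming should be markedly smaller here
```

On redundant data the single-context run benefits from the full 32 KB LZ77 window across block boundaries, which is exactly the efficiency lost by reinitializing per block.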


The next question is this -- as we *should* stream the data, does it
make sense to enforce downstream block boundaries in the upstream
filter? I'm siding in favor of requiring streaming support, and thus
allowing the individual filter implementor to worry about their own
block buffering, since they can far better encapsulate the necessary
state and implementation -- and most already do.


The one other thing I am unsure of is whether the zlib streaming API  
correctly handles streams that have been written as per above -- each  
bacula data block as an independent 'stream'. If zlib DOES handle  
this, it should be possible to modify the backup and restore  
implementation to use the stream API correctly while maintaining  
backwards compatibility. This would fix the encryption problem AND  
increase compression efficiency.


With my extremely large database backups, I sure wouldn't mind  
increased compression efficiency =)


Some documentation on the zlib API is available here (I had a little  
difficulty googling this):
	http://www.freestandards.org/spec/booksets/LSB-Core-generic/LSB-Core-generic/libzman.html


Cheers,
Landon


PGP.sig
Description: This is a digitally signed message part
-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnk&kid=120709&bid=263057&dat=121642
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-03 Thread Landon Fuller


On Nov 2, 2006, at 13:22, Robert Nelson wrote:

The problem is that currently there are three filters defined:
compression, encryption, and sparse file handling. The current
implementation of compression and sparse file handling both require
block boundary preservation. Even if zlib streaming could handle the
existing block-based data, sparse file handling would be broken.


That's true. It's also not possible to make it handle streaming. Bummer.
Block-preserving it is, then.

The stream implementation could really use a refactor, coupled with a  
more modular filter API, but that's not something I'll have time for  
anytime soon.


-landonf






Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-03 Thread Landon Fuller


On Nov 1, 2006, at 23:25, Michael Brennen wrote:


On Wed, 1 Nov 2006, Robert Nelson wrote:

On top of the issue with the reversed processing during restore that I
previously mentioned, there is a fundamental flaw in the processing of
compressed+encrypted data. The problem is that boundaries aren't
preserved across encrypt/decrypt.

What happens is that after the block is compressed it is encrypted.
However, since the encryption engine processes data in blocks, there may
still be bytes from the compressed block in the pipeline when the block
is sent to the Storage Daemon. As a result, when the same block is
decrypted it may result in only part of the compressed block.

Unfortunately there is no way to tell how much decrypted data is
required by the decompression engine with the current design. I think
the algorithm would have to be changed to pass along the compressed data
size with each compressed block.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Landon Fuller

The encryption does not include compression -- it made more sense to
piggyback on the existing compression code. Also, thanks for catching
this! I'm embarrassed that I forgot to test backup+restore with both
compression and encryption enabled.


 Landon, does it make sense to use OpenSSL compression in lieu of
 Bacula's compression, such that one should use one or the other but
 not both? I have no idea how good OpenSSL's internal compression is,
 but that might be a straightforward way around what sounds like a
 block cipher issue?


Implementing an OpenSSL zlib BIO would solve the issue. However, I  
think it'd be a shame to have code in two places doing the same thing  
-- this should be possible to fix correctly in Bacula's zlib code alone.


-landonf




Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-03 Thread Kern Sibbald

 Landon,

 I've changed the code so that the encryption code prefixes the data block
 with a block length prior to encryption.

 The decryption code accumulates data until a full data block is decrypted
 before passing it along to the decompression code.

 The code now works for all four scenarios with encryption and compression:
 none, encryption, compression, and encryption + compression.
 Unfortunately
 the code is no longer compatible for previously encrypted backups.

 I could add some more code to make the encryption only case work like
 before.  However, since this is a new feature in 1.39 and there shouldn't
 be
 a lot of existing backups, I would prefer to invalidate the previous
 backups
 and keep the code simpler.

 Also I think we should have a design rule that says any data filters like
 encryption, compression, etc must maintain the original buffer boundaries.

 This will allow us to define arbitrary, dynamically extensible filter
 stacks
 in the future.

 What do you think?

I'm unfortunately not in a good position to examine this problem in detail,
but I suggest that we should do our best to keep the old data readable by
any kludge necessary.

One possible solution for the new code that you have implemented is to put
the new compressed data in a new stream -- i.e. a different one from the
old compressed data -- this could possibly allow old Volumes to be read
while any new data written to Volumes will be written correctly.

One thing to be very careful about is to make sure the length that you
store is big-endian/little-endian independent. Probably you have already
done this, but if not you need to use the serialization code that is also
used for the sparse file length.
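[Editorial note] A minimal sketch of the endianness-independent length serialization Kern is asking for. Fixing a single byte order on the wire (network/big-endian is shown here as an assumption; Bacula's own serialization helpers may use a different convention) makes the value round-trip between machines of either endianness:

```python
import struct

def serialize_length(n: int) -> bytes:
    """Serialize a block length in a fixed byte order (big-endian),
    independent of the writing machine's native endianness."""
    return struct.pack(">I", n)  # 4-byte unsigned, network byte order

def deserialize_length(buf: bytes) -> int:
    """Read the length back regardless of the reading machine's endianness."""
    (n,) = struct.unpack(">I", buf[:4])
    return n
```

Storing the raw in-memory representation instead (native byte order) is exactly the bug this guards against: a Volume written on a little-endian client would be misread on a big-endian one.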



 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Landon
 Fuller
 Sent: Wednesday, November 01, 2006 7:08 PM
 To: Michael Brennen
 Cc: bacula-users@lists.sourceforge.net
 Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


 On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:

 On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway, as encrypted data
 should no longer be compressible.

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to
 encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

 Landon, what is your take on this?  Since you wrote the code you
 seem to be
 the best source on whether the openssl functions you are using
 compress data.

 Howdy,

 The encryption does not include compression -- It made more sense to
 piggyback on the existing compression code.
 Also, thanks for catching this! I'm embarrassed that I forgot to test
 backup+restore with both compression and encryption enabled.

 -landonf






Best regards, Kern



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-03 Thread Robert Nelson
This code is backwards compatible for everything except encrypted data.
Previously compressed backups will still work fine.

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Kern
Sibbald
Sent: Friday, November 03, 2006 4:15 PM
To: Robert Nelson
Cc: [EMAIL PROTECTED]; 'Landon Fuller';
bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


 Landon,

 I've changed the code so that the encryption code prefixes the data block
 with a block length prior to encryption.

 The decryption code accumulates data until a full data block is decrypted
 before passing it along to the decompression code.

 The code now works for all four scenarios with encryption and compression:
 none, encryption, compression, and encryption + compression.
 Unfortunately
 the code is no longer compatible for previously encrypted backups.

 I could add some more code to make the encryption only case work like
 before.  However, since this is a new feature in 1.39 and there shouldn't
 be
 a lot of existing backups, I would prefer to invalidate the previous
 backups
 and keep the code simpler.

 Also I think we should have a design rule that says any data filters like
 encryption, compression, etc must maintain the original buffer boundaries.

 This will allow us to define arbitrary, dynamically extensible filter
 stacks
 in the future.

 What do you think?

I'm unfortunately not in a good position to examine this problem in detail,
but I suggest that we should do our best to keep the old data readable by
any kludge necessary.

One possible solution for the new code that you have implemented is to put
the new compressed data in a new stream -- i.e. a different one from the
old compressed data -- this could possibly allow old Volumes to be read
while any new data written to Volumes will be written correctly.

One thing to be very careful about is to make sure the length that you
store is big-endian/little-endian independent. Probably you have already
done this, but if not you need to use the serialization code that is also
used for the sparse file length.



 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Landon
 Fuller
 Sent: Wednesday, November 01, 2006 7:08 PM
 To: Michael Brennen
 Cc: bacula-users@lists.sourceforge.net
 Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


 On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:

 On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway, as encrypted data
 should no longer be compressible.

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to
 encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

 Landon, what is your take on this?  Since you wrote the code you
 seem to be
 the best source on whether the openssl functions you are using
 compress data.

 Howdy,

 The encryption does not include compression -- It made more sense to
 piggyback on the existing compression code.
 Also, thanks for catching this! I'm embarrassed that I forgot to test
 backup+restore with both compression and encryption enabled.

 -landonf






Best regards, Kern





Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Alan Brown
On Wed, 1 Nov 2006, Arno Lehmann wrote:

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data.

If the encryption routines also contain compression routines.

 This is completely unverified and refers to encryption
 programs that are rather outdated by now, though...

Assuming you mean PGP: IIRC this made use of gzip to compress the input 
before applying crypto, partly to somewhat increase entropy before hitting 
the encryption routines.

It was possible to give PGP flags to not compress input and this was 
essential when encrypting already-compressed data in order to prevent
output file bloat.

AB


-
Using Tomcat but need to do more? Need to support web services, security?
Get stuff done quickly with pre-integrated technology to make your job easier
Download IBM WebSphere Application Server v.1.0.1 based on Apache Geronimo
http://sel.as-us.falkag.net/sel?cmd=lnkkid=120709bid=263057dat=121642
___
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users


Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Robert Nelson
Landon,

I've changed the code so that the encryption code prefixes the data block
with a block length prior to encryption.

The decryption code accumulates data until a full data block is decrypted
before passing it along to the decompression code.

The code now works for all four scenarios with encryption and compression:
none, encryption, compression, and encryption + compression.  Unfortunately
the code is no longer compatible for previously encrypted backups.

I could add some more code to make the encryption only case work like
before.  However, since this is a new feature in 1.39 and there shouldn't be
a lot of existing backups, I would prefer to invalidate the previous backups
and keep the code simpler.  

Also I think we should have a design rule that says any data filters like
encryption, compression, etc must maintain the original buffer boundaries.

This will allow us to define arbitrary, dynamically extensible filter stacks
in the future.

What do you think?
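[Editorial note] A sketch of the framing Robert describes: the writer prefixes each data block with its length before it enters the cipher pipeline, and the reader accumulates decrypted bytes until a full block is available before handing it to decompression. This is pure illustration; the 4-byte big-endian prefix and the chunking simulation are assumptions, not Bacula's actual wire format:

```python
import struct

def frame_block(block: bytes) -> bytes:
    """Prefix the block with its length so the reader can recover the
    boundary after the cipher has merged or split buffers."""
    return struct.pack(">I", len(block)) + block

class BlockReassembler:
    """Accumulates decrypted bytes and yields only complete blocks."""
    def __init__(self):
        self.buf = b""

    def feed(self, decrypted: bytes):
        self.buf += decrypted
        while len(self.buf) >= 4:
            (n,) = struct.unpack(">I", self.buf[:4])
            if len(self.buf) < 4 + n:
                break  # partial block: wait for more decrypted data
            yield self.buf[4:4 + n]
            self.buf = self.buf[4 + n:]

# The cipher may deliver output in arbitrary chunk sizes; the
# reassembler restores the original block boundaries regardless.
stream = frame_block(b"compressed-block-1") + frame_block(b"compressed-block-2")
r = BlockReassembler()
blocks = []
for i in range(0, len(stream), 7):  # simulate odd-sized cipher output
    blocks.extend(r.feed(stream[i:i + 7]))
print(blocks)
```

This is why the decompressor no longer sees truncated input: it is only ever handed blocks whose boundaries match what the compressor originally emitted.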

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Landon
Fuller
Sent: Wednesday, November 01, 2006 7:08 PM
To: Michael Brennen
Cc: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:

 On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway, as encrypted data
 should no longer be compressible.

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to  
 encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

 Landon, what is your take on this?  Since you wrote the code you  
 seem to be
 the best source on whether the openssl functions you are using  
 compress data.

Howdy,

The encryption does not include compression -- It made more sense to  
piggyback on the existing compression code.
Also, thanks for catching this! I'm embarrassed that I forgot to test  
backup+restore with both compression and encryption enabled.

-landonf





Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread novosirj
I think that it's fine -- as was just discussed on the list, the expectation 
was that the encryption format was likely to change prior to 1.4.0.

Maybe there are conversion options for those users who have written the old
format (that don't involve shoehorning it into the app itself)?

-Original Message-

From:  Robert Nelson [EMAIL PROTECTED]
Subj:  Re: [Bacula-users] Encryption/Compression Conflict in CVS
Date:  Thu Nov 2, 2006 11:30 am
Size:  2K
To:  'Landon Fuller' [EMAIL PROTECTED], 'Michael Brennen' [EMAIL PROTECTED]
cc:  [EMAIL PROTECTED], bacula-users@lists.sourceforge.net

Landon,

I've changed the code so that the encryption code prefixes the data block
with a block length prior to encryption.

The decryption code accumulates data until a full data block is decrypted
before passing it along to the decompression code.

The code now works for all four scenarios with encryption and compression:
none, encryption, compression, and encryption + compression.  Unfortunately
the code is no longer compatible for previously encrypted backups.

I could add some more code to make the encryption only case work like
before.  However, since this is a new feature in 1.39 and there shouldn't be
a lot of existing backups, I would prefer to invalidate the previous backups
and keep the code simpler.  

Also I think we should have a design rule that says any data filters like
encryption, compression, etc must maintain the original buffer boundaries.

This will allow us to define arbitrary, dynamically extensible filter stacks
in the future.

What do you think?

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Landon
Fuller
Sent: Wednesday, November 01, 2006 7:08 PM
To: Michael Brennen
Cc: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:

 On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway, as encrypted data
 should no longer be compressible.

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to  
 encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

 Landon, what is your take on this?  Since you wrote the code you  
 seem to be
 the best source on whether the openssl functions you are using  
 compress data.

Howdy,

The encryption does not include compression -- It made more sense to  
piggyback on the existing compression code.
Also, thanks for catching this! I'm embarrassed that I forgot to test  
backup+restore with both compression and encryption enabled.

-landonf







Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Michael Brennen
On Thursday 02 November 2006 10:30, Robert Nelson wrote:

 The code now works for all four scenarios with encryption and compression:
 none, encryption, compression, and encryption + compression.  Unfortunately
 the code is no longer compatible for previously encrypted backups.

Excellent.  Is this committed to CVS yet?  I see that src/filed/restore.c 
changed.  When you have it committed let me know and I will run some tests 
here as well.

 I could add some more code to make the encryption only case work like
 before.  However, since this is a new feature in 1.39 and there shouldn't
 be a lot of existing backups, I would prefer to invalidate the previous
 backups and keep the code simpler.

FWIW I concur.  I would rather restart my archive pool (again, this is still 
testing after all... right?!?!? :) and set for the long term.

-- 

   -- Michael



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Robert Nelson
The problem is that currently there are three filters defined: compression,
encryption, and sparse file handling.  The current implementation of
compression and sparse file handling both require block boundary
preservation.  Even if zlib streaming could handle the existing block based
data, sparse file handling would be broken.

-Original Message-
From: Landon Fuller [mailto:[EMAIL PROTECTED] 
Sent: Thursday, November 02, 2006 11:06 AM
To: Robert Nelson
Cc: 'Michael Brennen'; [EMAIL PROTECTED];
bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


On Nov 2, 2006, at 08:30, Robert Nelson wrote:

 Landon,

 I've changed the code so that the encryption code prefixes the data 
 block with a block length prior to encryption.

 The decryption code accumulates data until a full data block is 
 decrypted before passing it along to the decompression code.

 The code now works for all four scenarios with encryption and
 compression:
 none, encryption, compression, and encryption + compression.   
 Unfortunately
 the code is no longer compatible for previously encrypted backups.

 I could add some more code to make the encryption only case work like 
 before.  However, since this is a new feature in 1.39 and there 
 shouldn't be a lot of existing backups, I would prefer to invalidate 
 the previous backups and keep the code simpler.

 Also I think we should have a design rule that says any data filters 
 like encryption, compression, etc must maintain the original buffer 
 boundaries.

 This will allow us to define arbitrary, dynamically extensible filter 
 stacks in the future.

 What do you think?

I was thinking about this on the way to work. My original assumption was
that Bacula used the zlib streaming API to maintain state during file
compression/decompression, but this is not the case. Reality is something
more like this:

Backup:
- Set up the zlib stream context
- For each file block (not each file), compress the block via
deflate (stream, Z_FINISH); and reinitialize the stream.
- After all files (and blocks) are compressed, destroy the stream
context

Restore:
- For each block, call uncompress(), which does not handle
streaming.

This is unfortunate -- reinitializing the stream for each block
significantly degrades compression efficiency, because: 1) block boundaries
are dynamic and may be set arbitrarily; 2) the LZ77 algorithm may cross block
boundaries, referring back up to 32k of previous input data
(http://www.gzip.org/zlib/rfc-deflate.html#overview); 3) the Huffman coding
context comprises the entire block; and 4) there's no need to limit zlib's
block size to Bacula's block size.
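The efficiency cost of restarting the stream per block can be sketched with
Python's zlib (block size and payload here are illustrative assumptions, not
Bacula's actual parameters):

```python
import zlib

# Redundant input split into 64 KiB "blocks" (sizes are illustrative, not
# Bacula's actual defaults).
data = b"some repetitive backup payload " * 4096
block_size = 64 * 1024
blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]

# Per-block mode, as described above: a fresh deflate context per block,
# so LZ77 history and Huffman statistics are discarded at every boundary.
per_block = sum(len(zlib.compress(b)) for b in blocks)

# Streaming mode: one context for the whole file, so matches may refer
# back across block boundaries.
c = zlib.compressobj()
streamed = sum(len(c.compress(b)) for b in blocks) + len(c.flush())

print(per_block, streamed)  # streamed is the smaller of the two
```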

The next question is this -- given that we *should* stream the data, does it
make sense to enforce downstream block boundaries in the upstream filter? I'm
siding in favor of requiring streaming support, and thus letting each filter
implementor worry about their own block buffering, since they can far better
encapsulate the necessary state and implementation -- and most already do.

The one other thing I am unsure of is whether the zlib streaming API
correctly handles streams that have been written as per above -- each bacula
data block as an independent 'stream'. If zlib DOES handle this, it should
be possible to modify the backup and restore implementation to use the
stream API correctly while maintaining backwards compatibility. This would
fix the encryption problem AND increase compression efficiency.
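On the open question: each per-block "stream" finished with Z_FINISH is a
complete, self-delimiting zlib stream, and the streaming decoder reports any
trailing bytes in `unused_data` -- so a reader can, at least in principle,
walk a concatenation of them. A rough sketch (illustrative only, not Bacula's
actual wire format):

```python
import zlib

# Two blocks written as independent, finished zlib streams, mimicking the
# per-block Z_FINISH behaviour described above.
payloads = [b"first block " * 100, b"second block " * 100]
wire = b"".join(zlib.compress(p) for p in payloads)

# A streaming reader can still recover everything: each decompressobj
# consumes exactly one stream and reports the remainder in .unused_data.
out = []
rest = wire
while rest:
    d = zlib.decompressobj()
    out.append(d.decompress(rest))
    rest = d.unused_data

assert b"".join(out) == b"".join(payloads)
```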

With my extremely large database backups, I sure wouldn't mind increased
compression efficiency =)

Some documentation on the zlib API is available here (I had a little
difficulty googling this):

http://www.freestandards.org/spec/booksets/LSB-Core-generic/LSB-Core-generic
/libzman.html

Cheers,
Landon





Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Arno Lehmann
Hi,

On 11/2/2006 12:20 PM, Alan Brown wrote:
 On Wed, 1 Nov 2006, Arno Lehmann wrote:
 
 
Not if compression happens prior to encryption. :)

Theoretically - yes, but I'm quite sure that encryption usually also
compresses data.
 
 
 If the encryption routines also contain compression routines.
 
 
This is completely unverified and refers to encryption
programs that are rather outdated by now, though...
 
 
 Assuming you mean PGP: IIRC this made use of gzip to compress the input 
 before applying crypto, partly to somewhat increase entropy before hitting 
 the encryption routines.

Might have been PGP, right... anyway, the reasoning behind it - to 
create higher entropy - seems such a good idea that I assumed this was a 
common approach to encryption. Seems like I was wrong, judging by some 
comments here :-)

All things considered, it seems best to keep (or rather, to fix) the 
current behaviour where the user configures compression independently of 
encryption.

Arno

-- 
IT-Service Lehmann[EMAIL PROTECTED]
Arno Lehmann  http://www.its-lehmann.de



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Robert Nelson
Landon,

Since you are the owner of the crypto code I'll leave it up to you to decide
how you want to handle this.  I've attached a patch for the code I wrote to
make the encryption code preserve the block boundaries.  If it is useful
great, if not that's okay too, since I got to explore a whole section of the
Bacula code I hadn't played with before. :-).

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Landon
Fuller
Sent: Wednesday, November 01, 2006 7:08 PM
To: Michael Brennen
Cc: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:

 On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway as encrypted data  
 should
 no longer be compressible.

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to  
 encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

 Landon, what is your take on this?  Since you wrote the code you  
 seem to be
 the best source on whether the openssl functions you are using  
 compress data.

Howdy,

The encryption does not include compression -- It made more sense to  
piggyback on the existing compression code.
Also, thanks for catching this! I'm embarrassed that I forgot to test  
backup+restore with both compression and encryption enabled.

-landonf


bacula-061102.patch
Description: Binary data


Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-02 Thread Robert Nelson
In that case, would you like me to commit the code I have?

I agree about reworking the stream implementation.  The existing code could
be written as a number of filters: gzip, openssl, sparse, block/deblock.
With a well defined API you would be able to define new stream types in the
configuration files using these filters in addition to external filters
located in shared libraries.

-Original Message-
From: Landon Fuller [mailto:[EMAIL PROTECTED] 
Sent: Thursday, November 02, 2006 3:38 PM
To: Robert Nelson
Cc: 'Michael Brennen'; [EMAIL PROTECTED];
bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


On Nov 2, 2006, at 13:22, Robert Nelson wrote:

 The problem is that currently there are three filters defined:  
 compression,
 encryption, and sparse file handling.  The current implementation of
 compression and sparse file handling both require block boundary
 preservation.  Even if zlib streaming could handle the existing  
 block based
 data, sparse file handling would be broken.

That's true. It's also not possible to make it handle streaming. Bummer.
Block-preserving it is, then.

The stream implementation could really use a refactor, coupled with a  
more modular filter API, but that's not something I'll have time for  
anytime soon.

-landonf







Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Arno Lehmann
Hi,

On 11/1/2006 5:43 AM, Michael Brennen wrote:
 I posted a couple of days ago that restoring files from 1.39.27 
 (current CVS) with both encryption and compression turned on 
 resulted in 0 length files being restored.
 
 I was able to test that further tonight by archiving and restoring a 
 file in the 4 combinations of encryption/compression off/on.
 
 Running neither, compression alone or encryption alone I was able to 
 archive and restore a file correctly.  Running the two together I 
 was able to reproduce the problem of 0 length restores, with no 
 apparent errors.
 
 So... in my testing the combination of encryption and compression is 
 either not writing files correctly to tape (in which case there is a 
 lot of tape space taken up needlessly :) or the files are being 
 corrupted in the restore process; it appears to me to be the latter.

This sounds like compression should be automatically disabled when 
encryption is enabled. Should be useless anyway as encrypted data should 
no longer be compressible.

Arno

-- 
IT-Service Lehmann[EMAIL PROTECTED]
Arno Lehmann  http://www.its-lehmann.de
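Arno's intuition that ciphertext won't compress is easy to check empirically.
Below is a small Python sketch (payload sizes are arbitrary assumptions),
using random bytes as a stand-in for ciphertext, since a good cipher's output
is statistically indistinguishable from random:

```python
import os
import zlib

# Pseudo-random bytes stand in for ciphertext: a good cipher's output is
# statistically indistinguishable from random, so compression gains nothing.
cipher_like = os.urandom(64 * 1024)
text_like = b"the quick brown fox jumps over the lazy dog\n" * 1500

def ratio(data: bytes) -> float:
    return len(zlib.compress(data)) / len(data)

print(f"random-looking: {ratio(cipher_like):.3f}")  # ~1.0 (no gain at all)
print(f"redundant text: {ratio(text_like):.3f}")    # well under 0.1
```

This is exactly why compressing *before* encrypting is the only ordering
that can save space.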



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Robert Nelson
Actually this bug is quite simple to fix.  The problem is that the backup
and restore both do them in the same order instead of inverting the order on
restore.

Current Code:
compress -> encrypt -> decompress -> decrypt

It should be:
compress -> encrypt -> decrypt -> decompress
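The required inversion on restore can be sketched in Python. The XOR "cipher"
below is a hypothetical stand-in so the pipeline is runnable; it is not
Bacula's actual OpenSSL-based encryption:

```python
import zlib
from itertools import cycle

KEY = b"\x42\x17\x88"

def crypt(data: bytes) -> bytes:
    # XOR stand-in for encryption: symmetric and runnable, not real crypto.
    return bytes(b ^ k for b, k in zip(data, cycle(KEY)))

original = b"backup payload " * 200

# Backup pipeline: compress, then encrypt.
stored = crypt(zlib.compress(original))

# Restore must invert the pipeline: decrypt first, then decompress.
assert zlib.decompress(crypt(stored)) == original

# The buggy order (decompress before decrypt) hands ciphertext to zlib,
# which rejects it because the zlib header is no longer valid.
try:
    zlib.decompress(stored)
except zlib.error:
    pass
else:
    raise AssertionError("ciphertext unexpectedly decompressed")
```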

I can change the restore order so that existing backups will become readable
and new backups will work whether created by the old software or the new.

I'll commit the fix once I've finished testing it. 

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Arno
Lehmann
Sent: Wednesday, November 01, 2006 3:43 AM
To: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS

Hi,

On 11/1/2006 5:43 AM, Michael Brennen wrote:
 I posted a couple of days ago that restoring files from 1.39.27 
 (current CVS) with both encryption and compression turned on 
 resulted in 0 length files being restored.
 
 I was able to test that further tonight by archiving and restoring a 
 file in the 4 combinations of encryption/compression off/on.
 
 Running neither, compression alone or encryption alone I was able to 
 archive and restore a file correctly.  Running the two together I 
 was able to reproduce the problem of 0 length restores, with no 
 apparent errors.
 
 So... in my testing the combination of encryption and compression is 
 either not writing files correctly to tape (in which case there is a 
 lot of tape space taken up needlessly :) or the files are being 
 corrupted in the restore process; it appears to me to be the latter.

This sounds like compression should be automatically disabled when 
encryption is enabled. Should be useless anyway as encrypted data should 
no longer be compressible.

Arno

-- 
IT-Service Lehmann[EMAIL PROTECTED]
Arno Lehmann  http://www.its-lehmann.de






Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Michael Brennen
On Wednesday 01 November 2006 05:43, Arno Lehmann wrote:

  So... in my testing the combination of encryption and compression is
  either not writing files correctly to tape (in which case there is a
  lot of tape space taken up needlessly :) or the files are being
  corrupted in the restore process; it appears to me to be the latter.

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway as encrypted data should
 no longer be compressible.

Not if compression happens prior to encryption. :)

-- 

   -- Michael



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Michael Brennen
On Wednesday 01 November 2006 06:04, Robert Nelson wrote:

 Actually this bug is quite simple to fix.  The problem is that the backup
 and restore both do them in the same order instead of inverting the order
 on restore.

 Current Code:
  compress -> encrypt -> decompress -> decrypt

 It should be:
  compress -> encrypt -> decrypt -> decompress

 I can change the restore order so that existing backups will become
 readable and new backups will work whether created by the old software or
 the new.

I wondered if it weren't something exactly that simple. :)

 I'll commit the fix once I've finished testing it.

Thanks much, I will test it when you have committed it.

 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Arno
 Lehmann
 Sent: Wednesday, November 01, 2006 3:43 AM
 To: bacula-users@lists.sourceforge.net
 Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS

 Hi,

 On 11/1/2006 5:43 AM, Michael Brennen wrote:
  I posted a couple of days ago that restoring files from 1.39.27
  (current CVS) with both encryption and compression turned on
  resulted in 0 length files being restored.
 
  I was able to test that further tonight by archiving and restoring a
  file in the 4 combinations of encryption/compression off/on.
 
  Running neither, compression alone or encryption alone I was able to
  archive and restore a file correctly.  Running the two together I
  was able to reproduce the problem of 0 length restores, with no
  apparent errors.
 
  So... in my testing the combination of encryption and compression is
  either not writing files correctly to tape (in which case there is a
  lot of tape space taken up needlessly :) or the files are being
  corrupted in the restore process; it appears to me to be the latter.

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway as encrypted data should
 no longer be compressible.

 Arno

-- 

   -- Michael



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Arno Lehmann
Hi,

On 11/1/2006 6:00 PM, Michael Brennen wrote:
 On Wednesday 01 November 2006 05:43, Arno Lehmann wrote:
 
 
So... in my testing the combination of encryption and compression is
either not writing files correctly to tape (in which case there is a
lot of tape space taken up needlessly :) or the files are being
corrupted in the restore process; it appears to me to be the latter.

This sounds like compression should be automatically disabled when
encryption is enabled. Should be useless anyway as encrypted data should
no longer be compressible.
 
 
 Not if compression happens prior to encryption. :)

Theoretically - yes, but I'm quite sure that encryption usually also 
compresses data. This is completely unverified and refers to encryption 
programs that are rather outdated by now, though...

But I suppose you could inform us if encryption in Bacula also 
compresses :-)

Arno


-- 
IT-Service Lehmann[EMAIL PROTECTED]
Arno Lehmann  http://www.its-lehmann.de



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Michael Brennen
On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway as encrypted data should
 no longer be compressible.
 
  Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

Landon, what is your take on this?  Since you wrote the code you seem to be 
the best source on whether the openssl functions you are using compress data.

-- 

   -- Michael



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Michael Brennen
On Wed, 1 Nov 2006, Landon Fuller wrote:

 Landon, what is your take on this?  Since you wrote the code you 
 seem to be the best source on whether the openssl functions you 
 are using compress data.

 The encryption does not include compression -- It made more sense 
 to piggyback on the existing compression code. Also, thanks for 
 catching this! I'm embarrassed that I forgot to test 
 backup+restore with both compression and encryption enabled.

No problem, thanks for contributing the code. :)

-- Michael



Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Landon Fuller


On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:


On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:


This sounds like compression should be automatically disabled when
encryption is enabled. Should be useless anyway as encrypted data  
should

no longer be compressible.


Not if compression happens prior to encryption. :)


Theoretically - yes, but I'm quite sure that encryption usually also
compresses data. This is completely unverified and refers to  
encryption

programs that are rather outdated by now, though...

But I suppose you could inform us if encryption in Bacula also
compresses :-)


Landon, what is your take on this?  Since you wrote the code you  
seem to be
the best source on whether the openssl functions you are using  
compress data.


Howdy,

The encryption does not include compression -- It made more sense to  
piggyback on the existing compression code.
Also, thanks for catching this! I'm embarrassed that I forgot to test  
backup+restore with both compression and encryption enabled.


-landonf




Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Robert Nelson
On top of the issue with the reversed processing during restore that I
previously mentioned, there is a fundamental flaw in the processing of
compressed+encrypted data.  The problem is that block boundaries aren't
preserved
across encrypt/decrypt.

What happens is that after the block is compressed it is encrypted.  However
since the encryption engine processes data in blocks there may still be
bytes from the compressed block in the pipeline when the block is sent to
the Storage Daemon.  As a result, when the same block is decrypted it may
result in only part of the compressed block.

Unfortunately there is no way to tell how much decrypted data is required by
the decompression engine with the current design.  I think the algorithm
would have to be changed to pass along the compressed data size with each
compressed block.

Comments?
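The length-prefix approach proposed above can be sketched as follows. The XOR
"cipher" and the 7-byte chunking are hypothetical stand-ins (not Bacula's
actual OpenSSL code path); they merely simulate a decryptor that hands back
data without respecting the writer's block boundaries:

```python
import struct
import zlib
from itertools import cycle

# Stand-in "cipher": XOR with a repeating key so the framing logic is
# runnable. It is NOT what Bacula/OpenSSL actually does.
KEY = b"\x5a\xc3\x99"

def crypt(data: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(KEY)))

def frame(block: bytes) -> bytes:
    comp = zlib.compress(block)
    # Length prefix written before encryption, as proposed above: the reader
    # then knows exactly how many decrypted bytes form one compressed block.
    return struct.pack(">I", len(comp)) + comp

# Backup side: compress each block, frame it, encrypt the whole stream.
blocks = [b"block one " * 50, b"block two " * 50]
ciphertext = crypt(b"".join(frame(b) for b in blocks))

# Restore side: decrypted bytes arrive in arbitrary-sized pieces (here,
# 7-byte chunks), so accumulate until a complete framed block is buffered,
# and only then hand it to the decompression engine.
plaintext = crypt(ciphertext)  # XOR is symmetric
restored = []
buf = b""
for i in range(0, len(plaintext), 7):
    buf += plaintext[i:i + 7]
    while len(buf) >= 4:
        need = struct.unpack(">I", buf[:4])[0]
        if len(buf) < 4 + need:
            break  # wait for the rest of this compressed block
        restored.append(zlib.decompress(buf[4:4 + need]))
        buf = buf[4 + need:]

assert restored == blocks
```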

-Original Message-
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED] On Behalf Of Landon
Fuller
Sent: Wednesday, November 01, 2006 7:08 PM
To: Michael Brennen
Cc: bacula-users@lists.sourceforge.net
Subject: Re: [Bacula-users] Encryption/Compression Conflict in CVS


On Nov 1, 2006, at 2:20 PM, Michael Brennen wrote:

 On Wednesday 01 November 2006 15:33, Arno Lehmann wrote:

 This sounds like compression should be automatically disabled when
 encryption is enabled. Should be useless anyway as encrypted data  
 should
 no longer be compressible.

 Not if compression happens prior to encryption. :)

 Theoretically - yes, but I'm quite sure that encryption usually also
 compresses data. This is completely unverified and refers to  
 encryption
 programs that are rather outdated by now, though...

 But I suppose you could inform us if encryption in Bacula also
 compresses :-)

 Landon, what is your take on this?  Since you wrote the code you  
 seem to be
 the best source on whether the openssl functions you are using  
 compress data.

Howdy,

The encryption does not include compression -- It made more sense to  
piggyback on the existing compression code.
Also, thanks for catching this! I'm embarrassed that I forgot to test  
backup+restore with both compression and encryption enabled.

-landonf





Re: [Bacula-users] Encryption/Compression Conflict in CVS

2006-11-01 Thread Michael Brennen
On Wed, 1 Nov 2006, Robert Nelson wrote:

 On top of the issue with the reversed processing during restore that I
 previously mentioned, there is a fundamental flaw in the processing of
 compressed+encrypted data.  The problem is that block boundaries aren't preserved
 across encrypt/decrypt.

 What happens is that after the block is compressed it is encrypted.  However
 since the encryption engine processes data in blocks there may still be
 bytes from the compressed block in the pipeline when the block is sent to
 the Storage Daemon.  As a result, when the same block is decrypted it may
 result in only part of the compressed block.

 Unfortunately there is no way to tell how much decrypted data is required by
 the decompression engine with the current design.  I think the algorithm
 would have to be changed to pass along the compressed data size with each
 compressed block.

 -Original Message-
 From: [EMAIL PROTECTED]
 [mailto:[EMAIL PROTECTED] On Behalf Of Landon
 Fuller

 The encryption does not include compression -- It made more sense 
 to piggyback on the existing compression code. Also, thanks for 
 catching this! I'm embarrassed that I forgot to test 
 backup+restore with both compression and encryption enabled.

Landon, does it make sense to use OpenSSL compression in lieu of 
Bacula's compression, such that one should use one or the other but 
not both?  I have no idea how good OpenSSL's internal compression 
is, but that might be a straightforward way around what sounds like 
a block cipher buffering issue?

-- Michael



[Bacula-users] Encryption/Compression Conflict in CVS

2006-10-31 Thread Michael Brennen

I posted a couple of days ago that restoring files from 1.39.27 
(current CVS) with both encryption and compression turned on 
resulted in 0 length files being restored.

I was able to test that further tonight by archiving and restoring a 
file in the 4 combinations of encryption/compression off/on.

Running neither, compression alone or encryption alone I was able to 
archive and restore a file correctly.  Running the two together I 
was able to reproduce the problem of 0 length restores, with no 
apparent errors.

So... in my testing the combination of encryption and compression is 
either not writing files correctly to tape (in which case there is a 
lot of tape space taken up needlessly :) or the files are being 
corrupted in the restore process; it appears to me to be the latter.

This is on a Linux Centos 4.4 system, dir/sd/fd running on the same 
system.  The relevant configurations are below.

-- Michael


FileSet {
  Name = Somehost-Archive
  Include {
    Options {
      signature = MD5
      compression = GZIP6
    }
    File = ...
  }
}

FileDaemon {
   Name = somehost-fd
   FDport = 9102
   WorkingDirectory = /path/to/bacula/working
   Pid Directory = /path/to/bacula/working
   Maximum Concurrent Jobs = 20

   PKI Signatures = Yes
   PKI Encryption = Yes
   PKI Keypair = /path/to/bacula/keys/key.pem
}



