Sven Köhler wrote:
before I compress my backups with gzip, they are about 20MB; afterwards they are only 2MB (OK, I only have a small database :-) ).

I know what you mean; my tiny database backup is 162 MB, but gzip compresses it down to 31-43 MB.



There are many gzip libraries, so gzip support could be linked directly into SAPDB if required, but a simple pipe would do for me personally.

I have a backup wrapper script that runs the backup and then compresses the data afterwards. This has the nice advantage of getting the backup done in a hurry (so the users don't start to complain).
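A minimal sketch of such a wrapper, assuming a placeholder `dump_database` command in place of the real backup tool (and a configurable backup directory):

```shell
#!/bin/sh
# Sketch of a "backup now, compress later" wrapper.
# dump_database is a stand-in for the real backup command; swap in your own.
set -e

# In real use this would be something like /backups; defaults to a temp dir here.
BACKUP_DIR=${BACKUP_DIR:-$(mktemp -d)}
NAME="mydb-$(date +%Y%m%d-%H%M%S)"

dump_database() {           # placeholder: pretend to dump the database
    echo "pretend database contents" > "$1"
}

# 1. Get the backup written as fast as possible.
dump_database "$BACKUP_DIR/$NAME.dat"

# 2. Compress afterwards; this can take its time.
gzip -9 "$BACKUP_DIR/$NAME.dat"     # leaves $BACKUP_DIR/$NAME.dat.gz
```

The point is that the database is only busy during step 1; gzip runs after the users already have their database back.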


It might be a good idea to support piping the backup data to another program as an option. You would save the extra disk space for the uncompressed backup, and on a multi-CPU machine I'd expect at least one idle CPU that can do the gzip compression concurrently with SAPDB extracting the data (which I'd guess is mostly I/O bound anyway).

I'd say the best possible solution is simply to support piping the data to a process. That way the administrator can decide how to actually store it: driving a bank of DVD burners directly, controlling a tape robot, or simply compressing the data and storing it in /backups under a sane name.
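Even without native pipe support in the backup tool, a named pipe (FIFO) gets most of the way there. A sketch, again with a placeholder `dump_database` standing in for whatever writes the backup:

```shell
#!/bin/sh
# Sketch: stream a backup through gzip via a named pipe, so the
# uncompressed copy never touches disk and gzip runs concurrently
# with the backup on another CPU. dump_database is a placeholder.
set -e

PIPE=$(mktemp -u)                            # path for the FIFO
OUT=${OUT:-$(mktemp -d)/mydb-$(date +%Y%m%d).dat.gz}

dump_database() {           # placeholder backup producer
    echo "pretend database contents" > "$1"
}

mkfifo "$PIPE"
gzip -9 < "$PIPE" > "$OUT" &    # consumer: compresses as data arrives
dump_database "$PIPE"           # producer: writes "into" the pipe
wait                            # wait for gzip to drain the pipe and exit
rm -f "$PIPE"
```

The same pattern works for any consumer that reads stdin, so the tape robot or DVD burner cases are just a different command on the right-hand side of the pipe.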

--
 Regards Flemming Frandsen - http://dion.swamp.dk
 PartyTicket.Net co founder & Yet Another Perl Hacker

_______________________________________________
sapdb.general mailing list
[EMAIL PROTECTED]
http://listserv.sap.com/mailman/listinfo/sapdb.general
