> "Marco Zancanaro" <marco.za...@gmail.com> kirjoitti viestissä > news:aanlktilg5xz50f0nobugolb_qngdmsa1xvzbpnqjr...@mail.gmail.com... > Hi, i've configured a bacula system in my local network. I need to do the > backup locally for a fast backup and a fast retore, but i also need a > remote backup for safety reason. > I'm think of doing a remote copy of the volume that bacula creates. The > problem is that the volume is very large, about 118 Gb at the moment. And > i need to copy the data in a secure way (btw the data is already encrypted > with bacula PKI function). I don't think that i can do any type of > incremental backup with this volume, because it's one big file that > changes every day. > My idea is to use bacula with a specific job that use a different storage. > This storage is a mount point for the remote computer filesystem. I can > introduce here some sort of protection with a vpn or other solution, i > also thik of using bacula tls capabilities but then i'm forced to use tls > in the local network, and this is not necessary. > I've two option: > 1. Do a backup of the volume (the volume contains the backup that bacula > do every day) > 2. Re-do every backup that bacula already did, but this time in the remote > location (i think this is not a good idea) > > > At this point I don't know if i can do this with bacula ... any other > ideas? > > I'm using bacula 2.4.4 with ssl support and i've an ADSL line with > 20Mbits/sec down and 512 Kbits/sec up. On the remote end there is a > computer with similar internet connection. >
So you effectively have 512 kbit/s, or 64 kB/s, of usable bandwidth. 100 GB over this connection would take over 400 hours, i.e. more than two weeks. That is provided the connection is reliable enough that no fatal timeout errors occur during that time frame and stop the job, and provided the connection is dedicated to this purpose, with the bandwidth not significantly needed for anything else.

I think your only option is to take care of full backups locally, and thereafter physically move the media to the remote location. Then, if your data does not change too much daily, you could perhaps run incremental backups to the remote storage.

Backing up the volume itself would mean transferring the whole volume every time (though 100 GB could be split into smaller chunks), so I think you'd be better off running separate jobs to the remote storage. In 24 hours, 64 kB/s can transfer about 5.5 GB. So if you get less than roughly 2 GB of new data daily, the incrementals might work this way (assuming you can spend an average of 50% of the bandwidth on backups). Note that running incrementals more or less often does not help; what matters is the relationship between your bandwidth and the rate at which you accumulate new data.

Whether to do this with Bacula or by some other means only becomes a question once you know your bandwidth is adequate.

Also remember that a possible restore takes just as long. Since this is about a secondary location for improved safety, needing the remote data is probably related to some sort of disaster, so in that hopefully rare event you may be able to physically bring the remote media on-site.

--
TiN
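P.S. A quick sketch of the arithmetic, in case anyone wants to plug in their own numbers. This assumes "GB" means 10^9 bytes and that the full 512 kbit/s uplink is usable; the function names are just illustrative.

```python
UPLINK_BPS = 512_000             # 512 kbit/s uplink, in bits per second
BYTES_PER_SEC = UPLINK_BPS / 8   # = 64 kB/s

def transfer_hours(gigabytes: float) -> float:
    """Hours needed to push `gigabytes` (10^9 bytes) over the uplink."""
    return gigabytes * 1e9 / BYTES_PER_SEC / 3600

def daily_capacity_gb(utilization: float = 1.0) -> float:
    """GB transferable in 24 h at the given fraction of the uplink."""
    return BYTES_PER_SEC * 86400 * utilization / 1e9

print(f"100 GB full volume: {transfer_hours(100):.0f} h "
      f"({transfer_hours(100) / 24:.1f} days)")   # ~434 h, ~18 days
print(f"24 h at full rate:  {daily_capacity_gb():.2f} GB")     # ~5.53 GB
print(f"24 h at 50% rate:   {daily_capacity_gb(0.5):.2f} GB")  # ~2.76 GB
```

The 50% figure is where the "less than 2 GB of new data per day" rule of thumb comes from.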