Hi,
        option 0 is the best one, but it has financial drawbacks :-) The
whole situation is like this: I have a Bacula server with two remote
SAN-connected drives. The SAN does mirroring etc., and the SAN drives
are considered stable and safe.
I have a backup rotation schema with 1 weekly full backup and 6
differential/incremental backups. I need to back up various routers,
servers, important workstations etc. There are about 20 clients now.
The total storage to be backed up is some 200-250 GB, and the number of
clients should increase over time.
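For context, a cycle like this (one weekly full plus daily
incrementals) is usually expressed in Bacula roughly as below. This is
only a sketch; the Schedule, Pool, Storage and FileSet names are made
up:

```
# Hypothetical sketch of a 1-full + 6-incremental weekly cycle.
# All resource names here are invented for illustration.
Schedule {
  Name = "WeeklyCycle"
  Run = Level=Full sun at 23:05          # weekly full backup
  Run = Level=Incremental mon-sat at 23:05  # dailies the rest of the week
}

Job {
  Name = "BackupClient1"
  Type = Backup
  Client = client1-fd
  FileSet = "Full Set"
  Schedule = "WeeklyCycle"
  Storage = SAN-File
  Pool = Default
  Messages = Standard
}
```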
Until now we have used a simple ssh+tar solution. I would like to use
Bacula to "tidy up" the whole process and make it more reliable and
robust. Disk space on the SAN is expensive and "precious", and I would
like to use it reasonably. So I have no problem with connecting one
client after another, performing a full backup to a new volume and
deleting the old full backup afterwards - this is how it works now with
ssh+tar. A client connects, backs up to a temporary file, and when the
backup is complete, the old backup is deleted and the temporary file is
renamed. Clients are backed up one after another, so I do not need much
overhead disk space, and I can easily justify the extra space for one
full backup.
I am still considering other options, like spooling to a local SATA
drive, or backing up to a local drive and synchronizing to the SAN
drives, but every solution has some disadvantages...
A full catalog backup will be performed every day, both to the SAN
drives and to remote servers. I consider it the most valuable data to
be backed up :-)
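The daily catalog dump itself can be as simple as the sketch below
(assuming a MySQL catalog; the database name, SAN path and remote host
are all made up, and MySQL users would swap in pg_dump for PostgreSQL
and vice versa):

```shell
#!/bin/sh
# Hypothetical daily catalog dump: compress it, keep one copy on the
# SAN and push another to a remote server. All names are illustrative.
DUMP=/san/catalog/bacula-$(date +%F).sql.gz
mysqldump bacula | gzip > "$DUMP"
scp "$DUMP" backup@remote.example.com:/srv/catalog/
```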

Radek


Marek Simon wrote:
> My opinion to your ideas:
> 0) Leave the schema as I submitted it and buy more disk space for the backups. :-)
> 
> 1) I think this is the best variant. Another advantage is that a full
> backup of all clients would take much longer than a full backup of 1/7
> of them plus differentials for the rest. Now, what to do with the Catalog:
> You can back up the catalog to some removable media (tape, CD/DVD-RW).
> You can pull the (zipped and maybe encrypted) catalog to some or all of
> your clients.
> You can send your (zipped and maybe encrypted) Catalog to some friend of
> yours (and you can back up his catalog in return), but it may be a
> violation of data privacy (even if the Catalog contains only names and
> sizes).
> You can forward the Bacula messages (completed backups) to some external
> mail address and then, if needed, reconstruct the job-volume bindings
> from them.
> The complete catalog is too big to send by e-mail, but you can still run
> an SQL selection on the catalog after the backup and send the job-volume
> bindings and other relevant information to the external email address in
> CSV format.
> You can also (and I strongly recommend it) back up the catalog every
> time after the daily bunch of Jobs, and extract it when needed with the
> other Bacula tools (bextract).
> 
> 2) I thought you were short of disk space, so you could not afford to
> keep the full backup twice plus many differential backups. I do not see
> much difference between having two full backups on a device for a day
> or for a few hours; that space is needed anyway. But I think this
> variant is better combined with your original idea: every full backup
> volume has its own pool, and the Job Schedule is set up to use volume 1
> in odd weeks, running an immediate differential (practically
> zero-sized) backup to volume 2 just after the full one, and vice versa
> in even weeks. Priorities could help you in this case as well. Maybe
> some check that the full backup succeeded would be advisable, but I am
> not sure whether Bacula can do this kind of conditional job run;
> perhaps with some Python hacking or some Run Before Job / Run After Job
> scripts.
> You can do the same for the differential backups - two volumes in two
> pools, one in use and the other cleared - in turns.
> And finally, you can combine it with the previous solution and divide
> it into sevenths or more parts, but then it would be a real Catalog hell.
> 
> 3) This is the worst solution. If you want to sleep badly every Monday
> (or whichever day you pick), try it. It is really risky to be without a
> backup even for a while; an accident can strike at any time.
> 
> Marek
> 
> P.S. I could have written this in Czech, but the other readers may be
> interested too :-)
> 
> Radek Hladik wrote:
>> Hi,
>>      thanks for your answer. Your idea sounds good. However, if I
>> understand it correctly, there will be two full backups for the whole
>> day after a full backup. This is what I am trying to avoid, as I will
>> be backing up a lot of clients. So as I see it, I have these possibilities:
>>
>> 1) use your scheme and divide the clients into seven groups. One group
>> will start its full backup on Monday, the second on Tuesday, etc. So
>> all week I will have two full backups for 1/7 of the clients. It really
>> seems I will need to back up the catalog at least a dozen times,
>> because nobody will be able to deduce which backup is on which volume :-)
>> 2) modify your scheme so that there is another differential backup
>> right after the full backup, before the next job starts. It will
>> effectively erase the last week's full backup.
>> 3) use only 7 volumes with a retention of 6 days, and live with the
>> fact that there is no backup during a backup.
>>
>> Now I only need to decide which option will be the best one :-)
>>
>> Radek
>>
> 
> -------------------------------------------------------------------------
> This SF.net email is sponsored by: Splunk Inc.
> Still grepping through log files to find problems?  Stop.
> Now Search log events and configuration files using AJAX and a browser.
> Download your FREE copy of Splunk now >> http://get.splunk.com/
> _______________________________________________
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users

