Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-13 Thread Martin Simmons
> On Wed, 12 Oct 2011 21:58:28 -0400, mark bergman said:
> > If you limited the maximum jobs on the FD it would only run one at once,
> That doesn't work, as we back up ~20 small machines in addition to the
> large (4 to 8 TB) filesystems.
Assuming you mean ~20 separate client machines (F…
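
For reference, the throttle being discussed is the Maximum Concurrent Jobs
directive. A minimal sketch, assuming a hypothetical client called
bigserver-fd -- set in that client's own bacula-fd.conf it serializes jobs
on that one machine only, leaving the other ~20 small clients unaffected:

    # bacula-fd.conf on the large file server (names are illustrative)
    FileDaemon {
      Name = bigserver-fd
      FDport = 9102
      Working Directory = /var/bacula/working
      Pid Directory = /var/run
      Maximum Concurrent Jobs = 1   # one job at a time on this client only
    }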

Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-13 Thread Thomas Lohman
> In an effort to work around the fact that bacula kills long-running
> jobs, I'm about to partition my backups into smaller sets. For example,
> instead of backing up:
Since we may end up having jobs that run for more than 6 days, I was
pretty curious to see where in the code (release 5.0.3) thi…
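
The run-time ceiling is configurable per Job; whether an explicit value
overrides the hard-coded default in 5.0.3 is exactly what Thomas is
checking in the source. A minimal sketch, with made-up job, client, and
fileset names and an arbitrary 14-day value chosen only to clear the
~6-day mark discussed in this thread:

    # bacula-dir.conf -- raise the run-time ceiling explicitly
    Job {
      Name = "BigHome"
      Type = Backup
      Client = bigserver-fd
      FileSet = "HomeFS"
      Schedule = "WeeklyFull"
      Storage = File
      Pool = Default
      Messages = Standard
      Max Run Time = 14 days   # rather than relying on the default
    }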

Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-13 Thread Robert.Mortimer
From: James Harper [mailto:james.har...@bendigoit.com.au]
> In an effort to work around the fact that bacula kills long-running
> jobs, I'm about to partition my backups into smaller sets. For example,
> instead of backing up:
>
> /home
>
> I would like to back up the content of /home as …
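
A static way to express that split is one FileSet per slice, using the
wilddir include/exclude pattern from the manual's FileSet examples (names
here are illustrative; the exclude clause is easy to get wrong, so verify
the result with bconsole's estimate command before trusting it):

    # bacula-dir.conf -- one FileSet (and one Job) per slice of /home
    FileSet {
      Name = "Home-0-9"
      Include {
        Options {
          signature = MD5
          wilddir = "/home/[0-9]*"   # keep these subtrees
        }
        Options {
          Exclude = yes
          wilddir = "/home/*"        # prune every other subtree
          wildfile = "/home/*"       # and any top-level plain files
        }
        File = /home
      }
    }

Each slice then fails, retries, and completes independently of the rest.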

Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-12 Thread Steve Costaras
This seems to be a common misconception, or I'm /much/ luckier than I
should be, as I routinely run jobs that last over 15-20 days with zero
problems (besides them taking 15-20 days ;) ). I've been doing this for a
couple of years now, end on end, with different deployments of bacula
(mainly linux/ubuntu …

Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-12 Thread John Drescher
> Does Bacula really kill long running jobs? Or are you seeing the effect
> of something at layer 3 or below (eg TCP connections timing out in
> firewalls)?
I believe it automatically kills jobs that are longer than 5 days or
something similar. At least that was discussed recently on the list.
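
If the cause is James's layer-3 theory rather than Bacula itself, the
usual countermeasure is the Heartbeat Interval directive, which sends
keepalives over otherwise-idle control connections so stateful firewalls
don't drop them. A sketch with an illustrative 60-second value:

    # bacula-fd.conf (the same directive also exists on the SD side)
    FileDaemon {
      Name = bigserver-fd
      Heartbeat Interval = 60   # seconds
    }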

Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-12 Thread mark . bergman
In the message dated: Thu, 13 Oct 2011 11:54:47 +1100,
The pithy ruminations from "James Harper" on were:
=> > In an effort to work around the fact that bacula kills long-running
=> > jobs, I'm about to partition my backups into smaller sets. For
=> > example, instead of backing up: …

Re: [Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-12 Thread James Harper
> In an effort to work around the fact that bacula kills long-running
> jobs, I'm about to partition my backups into smaller sets. For example,
> instead of backing up:
>
> /home
>
> I would like to back up the content of /home as separate jobs. For
> example:
>
> /home/[0-9]*
> …

[Bacula-users] seeking advice re. splitting up large backups -- dynamic filesets to prevent duplicate jobs and reduce backup time

2011-10-12 Thread mark . bergman
In an effort to work around the fact that bacula kills long-running
jobs, I'm about to partition my backups into smaller sets. For example,
instead of backing up:

    /home

I would like to back up the content of /home as separate jobs. For
example:

    /home/[0-9]*
    /home/[A-G]…
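
For the dynamic-fileset half of the subject line, Bacula's program
filesets fit: a File= entry beginning with a vertical bar is executed when
the job starts and its stdout becomes the file list. A sketch using the
double-backslash quoting shown in the manual (note the command runs on the
Director's host, so for a remote client it would need to be wrapped in
something like ssh):

    # bacula-dir.conf -- file list regenerated at the start of each run
    FileSet {
      Name = "Home-0-9-dynamic"
      Include {
        Options { signature = MD5 }
        # One backslash survives conf parsing; the shell expands the glob
        File = "\\|sh -c 'ls -d /home/[0-9]*'"
      }
    }

Because the list is rebuilt per run, newly created top-level directories
are picked up automatically, and as long as the globs in the generating
commands are disjoint no directory lands in two slices.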