SIGSTOP on the client side will cause the rsync job to fail, but SIGSTOP on
the server side should not cause a failure.
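
The signal mechanics themselves are easy to try from a shell. Here is a minimal sketch, using `sleep` as a stand-in for the client-side rsync process (a real attempt would target the actual rsync PID, e.g. found via pgrep):

```shell
# Sketch: pausing and resuming a long-running process with job-control
# signals. `sleep` stands in for the client-side rsync process here.
sleep 30 &
PID=$!

kill -STOP "$PID"        # freeze the process; it keeps its state in memory
ps -o stat= -p "$PID"    # process state now shows 'T' (stopped)

kill -CONT "$PID"        # resume exactly where it left off
ps -o stat= -p "$PID"    # state returns to normal (e.g. 'S', sleeping)

kill "$PID"              # clean up the stand-in
```

The catch, as reported below, is that the peer that keeps running can time the connection out while its partner is stopped, which would explain the job dying after SIGCONT.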
On Thu, Jun 19, 2008 at 4:38 AM, mark k <[EMAIL PROTECTED]> wrote:
> Unfortunately, SIGSTOP and SIGCONT will not work; after a SIGCONT the
> rsync job just dies.
>
> I did, however, have success splitting the job on the large volume
> into 15 smaller jobs, and was able to complete the backup well within
> the backup window.
>
> MSK
>
> On Wed, Jun 18, 2008 at 11:36 AM, mark k <[EMAIL PROTECTED]> wrote:
> > I have set up a backup job that way and split it into 15 individual
> > jobs; it is still a lot of files, though.
> >
> > I am going to try sending a SIGSTOP and a SIGCONT to the rsync job
> > on the client and see if it works. If it does, I don't think it
> > would be very hard to incorporate into the BackupPC interface.
> >
> >
> >
> > On Wed, Jun 18, 2008 at 11:26 AM, Les Mikesell <[EMAIL PROTECTED]> wrote:
> >> mark k wrote:
> >>>
> >>> Wondering if anyone has found a way to pause or suspend a backup
> >>> job instead of stopping it.
> >>>
> >>> I am backing up several servers with large LUNs, 750 GB to 2 TB in
> >>> size, containing millions of tiny files.
> >>>
> >>> So every time a new rsync job kicks off it has to rebuild the file
> >>> list, and the only time the backup job completes is on the weekends,
> >>> when the jobs can run 24/7; on weekdays they can only run about
> >>> 8 hours a day.
> >>>
> >>> Is there a way to pause the rsync on the client and then resume it
> >>> later, so that it can at least get a base full backup over a couple
> >>> of nights? Incrementals should work normally after that.
> >>
> >> I don't think this is possible, but you might want to look at the
> >> file distribution on the target. If you could split this into some
> >> number of directories that are backed up separately, and perhaps a
> >> catch-all run that excludes the ones backed up individually, it
> >> might go a lot faster. Also, it might help a lot to add RAM to the
> >> server, or to run fewer concurrent jobs if it is swapping due to the
> >> size of the directory. If you have a current version of BackupPC, it
> >> should save partial full runs and accumulate parts until one
> >> completes. However, even incrementals will have to transfer the
> >> entire directory structure before starting, so this may continue to
> >> be a problem.
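
A rough sketch of that split-plus-catch-all layout, from the client side. The paths, directory names, and destination below are invented for illustration; under BackupPC the same effect would presumably be expressed with multiple share definitions plus $Conf{BackupFilesExclude} rather than raw rsync calls:

```shell
# Hypothetical layout: back up a few big subtrees as separate runs, then
# do a catch-all run that excludes them. SRC and DEST are placeholders.
SRC=/srv/bigvolume
DEST=backupserver:/backups/bigvolume

# Each large subtree gets its own (smaller, faster) run.
for dir in data1 data2 data3; do
    rsync -a "$SRC/$dir/" "$DEST/$dir/"
done

# Catch-all run for everything the runs above did not cover.
rsync -a --exclude=/data1 --exclude=/data2 --exclude=/data3 \
    "$SRC/" "$DEST/"
```

Each run then has a much smaller file list to build, which is the point of the split.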
> >>
> >> --
> >> Les Mikesell
> >> [EMAIL PROTECTED]
> >>
> >>
> >>
> >
> >
> >
> > --
> > Walt Disney - "I love Mickey Mouse more than any woman I have ever known."
> >
>
>
>
> --
> Fran Lebowitz - "You're only as good as your last haircut."
>
-------------------------------------------------------------------------
Check out the new SourceForge.net Marketplace.
It's the best place to buy or sell services for
just about anything Open Source.
http://sourceforge.net/services/buy/index.php
_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List: https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki: http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/