On 12/04/16 15:41, Douglas K. Rand wrote:
> For disaster recovery I'm pushing our catalog (along with Bareos configs
> & keys) out to S3. I'm doing this as a Run After Job script and pushing
> the catalog to S3 via s3cmd.
> 
> My problem is that the catalog is large and takes about 10 hours to push
> to S3. What I want to do is have the push to S3 happen in the
> background and release the Director to move on to other things. But it
> doesn't.

> But the director gets stuck:
> 
> Running Jobs:
> Console connected at 01-Dec-16 15:29
> Console connected at 03-Dec-16 16:07
> Console connected at 04-Dec-16 14:33
>  JobId Level   Name                       Status
> ======================================================================
>   5792 Full    catalog-offsite.2016-12-04_09.00.11_07 has terminated
> with warnings
> ====

The magic to get a "Run After Job" command to happen in the background
is to close the file descriptors. In languages like Perl and Python this
is easy, but for shell scripts it takes a bit of digging to find out
that the way to close a file descriptor is with "exec ${fd}>&-".
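You can see the effect outside Bareos, too: a parent that reads the
script's stdout only sees EOF once every process holding that descriptor
has exited or closed it, and a backgrounded child inherits it. A minimal
sketch (the 3-second sleep is just a stand-in for a long-running push):

```shell
#!/bin/sh

# $(...) reads the child's stdout until EOF. The backgrounded sleep
# inherits that descriptor, so the substitution would otherwise block
# for the full 3 seconds; closing fd 1 first releases it immediately.
t0=$(date +%s)
out=$(sh -c 'exec 1>&-; sleep 3 &')
t1=$(date +%s)
echo "reader released after $((t1 - t0))s"
```

The Director is in the same position as the $(...) reader here: it hangs
on the pipe until the last holder of the write end goes away.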

What I did was move the real work into a script called foo-real; the foo
script that Bareos runs then becomes:


#!/bin/sh

## First close all file descriptors, because I think Bareos is waiting
## on them.

exec 0>&-
exec 1>&-
exec 2>&-

foo-real &
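If you'd rather keep the script's output for debugging, redirecting the
descriptors should work just as well as closing them: EOF still reaches
the Director's end of the pipe, but you get a log. A variant sketch —
the ./foo.log path and the sleep are stand-ins of my own choosing for a
real log location and for foo-real:

```shell
#!/bin/sh

# Detach by redirecting, rather than closing, the inherited
# descriptors; the Director's pipe gets EOF just the same.
# ./foo.log and the sleep are stand-ins for a real log path and
# for foo-real.
exec </dev/null >>./foo.log 2>&1

sleep 3 &   # stand-in for: foo-real &
```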


I'm posting this on the odd chance there is anybody else out there with
a Run After Job that they want to put in the background.

-- 
You received this message because you are subscribed to the Google Groups 
"bareos-users" group.