jobs, it only shows the
first two jobs configured in the client config.
To be safe, I've commented out each job, done a reload and showed the status
again, and it continues to show the first two jobs configured.
I don't know if the jobs will actually run or not, looking for some info on
this.
Jo
is then configured to back up to the s3fs
I'm sure there are more details, and I am eager to learn about this.
I'm going to have a few weeks to learn about it while I recover from
surgery next week (donating a kidney), so will be watching for your reply.
Thanks
Jonathan Bayer
On 6/10/2015 6:32 PM
I have a need to change the retention times of a number of jobs, both
completed and new ones. New ones as in: the jobs have been defined for a
long time, and I need to change the config to reduce the retention time.
I recall that if I just change the config of an already executed job, it
won't
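(For the archive: if memory serves, retention on already-written volumes can be lowered from bconsole with the update command; the volume name below is hypothetical and exact keyword syntax may vary slightly between versions.)

```conf
# In bconsole: lower the retention recorded on an existing volume
update volume=Internal-0042 volretention=30days

# Changing File/Job Retention in the Client resource only affects
# pruning from now on; volumes already written keep the retention
# recorded on them until updated as above.
```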
Thank you, that seemed to do it.
JBB
On 3/16/15 8:39 AM, John Drescher wrote:
On Mon, Mar 16, 2015 at 8:30 AM, Jonathan Bayer
linuxguruga...@gmail.com wrote:
I have a need to change the retention times of a number of jobs, both
completed and new ones. New ones as in the jobs have been
Hi,
Have a problem with an existing setup.
We do backups to disk, here is one of the device definitions:
Device {
Name = FileStorage
Media Type = File
Archive Device = /mnt/baculaStorage
LabelMedia = yes; # lets Bacula label unlabeled media
Random Access = Yes;
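(The excerpt cuts off mid-resource; for reference, a complete File-type Device usually looks something like the sketch below. The extra directives are typical values, not necessarily the poster's actual config.)

```conf
Device {
  Name = FileStorage
  Media Type = File
  Archive Device = /mnt/baculaStorage
  LabelMedia = yes          # lets Bacula label unlabeled media
  Random Access = yes
  AutomaticMount = yes      # mount the device when it is opened
  RemovableMedia = no
  AlwaysOpen = no
}
```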
Correct me if I'm wrong, but this is merely a reporting tool. It
doesn't have any control mechanisms? No way to start, stop, edit, etc. jobs?
JBB
On 11/7/2014 10:46 AM, oliveiraped wrote:
Hi To All Please check the new web gui for Bacula.
Reportula is a php based web program that provides
Oh, I guess it helps to read.
So it can update config files, but what about included files?
Our configuration has two directories inside the /etc/bacula directory,
one for client configs and one for general configs for various filesets,
pools, schedules, etc.
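(In case it helps later readers: Bacula's own config parser can pull in external files with the @ directive, and whole directories via a pipe. A sketch; the exact paths are assumptions based on the layout described above.)

```conf
# In bacula-dir.conf: include a single file
@/etc/bacula/general/filesets.conf

# Include every file in a directory by piping through a shell
@|"sh -c 'cat /etc/bacula/clients/*.conf'"
```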
JBB
On 11/7/2014 2:35 PM, Jonathan
This happened to me the other day.
Make sure that ALL the bacula processes are stopped. My problem was
that one of them was continuing to hang around with the old config.
Drove me up the wall until I figured it out.
JBB
On 10/7/14, 2:00 PM, Bryn Hughes wrote:
I've been having some issues
it wasn't an old process hanging about. Plus starting everything
manually in debug mode rules that out.
Bryn
On 14-10-07 12:01 PM, Jonathan Bayer wrote:
This happened to me the other day.
Make sure that ALL the bacula processes are stopped. My problem was
that one of them was continuing to hang
We have a situation where we have a system which is running Ubuntu
12.04. I have bacula 5 client installed on it.
The bacula server is 7.0.5
Can this server talk to the client?
I'm having a problem getting the server to authenticate with the file
daemon:
6-Oct 14:26 baculaOffice-dir JobId
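(For the archive: one common cause of this authentication error is a password mismatch between the Director's Client resource and the FD's Director resource. A minimal sketch of the pairing; all names and the password here are hypothetical.)

```conf
# bacula-dir.conf on the server
Client {
  Name = ubuntu1204-fd
  Address = ubuntu1204.example.com
  FDPort = 9102
  Catalog = MyCatalog
  Password = "fd-secret"    # must equal the FD's Director password
}

# bacula-fd.conf on the client
Director {
  Name = baculaOffice-dir
  Password = "fd-secret"    # must equal the Director's Client password
}
```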
Thanks
JBB
On 10/6/14, 3:10 PM, John Drescher wrote:
On Mon, Oct 6, 2014 at 2:26 PM, Jonathan Bayer linuxguruga...@gmail.com
wrote:
We have a situation where we have a system which is running Ubuntu
12.04. I have bacula 5 client installed on it.
The bacula server is 7.0.5
Can
The documentation talks about using IP addresses a lot.
Can hostnames be used instead of IP addresses? Specifically, the
Address field in the Client config, as well as others.
Thanks in advance
JBB
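(For anyone searching the archive: a Client resource using a hostname instead of an IP might look like the sketch below; the name, host, and password are hypothetical.)

```conf
Client {
  Name = web1-fd
  Address = web1.example.com   # a resolvable hostname works here
  FDPort = 9102
  Catalog = MyCatalog
  Password = "fd-secret"
}
```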
--
Great, thanks
JBB
On 8/19/2014 8:12 AM, John Drescher wrote:
On Tue, Aug 19, 2014 at 8:03 AM, Jonathan Bayer
linuxguruga...@gmail.com wrote:
The documentation talks about using IP addresses a lot.
Can hostnames be used instead of IP addresses? Specifically, the
Address field
We have a small cluster running Alfresco. The web software is on
server1 and the database is on server2.
We have Bacula community version 7.0.5
Bacula client is installed on server1 for now. Obviously, server1 can
access server2's database.
I'm wondering how others have solved this problem.
I finally traced my problem with Webacula down to the following lines in
the index.php:
if ( APPLICATION_ENV == 'production') {
Zend_Session::regenerateId();
}
This is on a CentOS 6 system, with the full php-ZendFramework installed:
php-ZendFramework-full-1.12.7-1.el6.noarch
PHP 5.3:
Figured it out.
The new installation put the config.ini into a different directory, and
I was updating the wrong one.
Once I fixed that, it works.
JBB
On 8/8/2014 1:40 AM, Oschwald Robert wrote:
I'm using Webacula 5.5.1 with Bacula 7.0.4 on CentOS 6.5 without any
problem, so I think
, if you know how to fix this problem, that would be great as well.
Thanks in advance
Jonathan Bayer
--
turned off.
Any ideas?
JBB
On 7/31/2014 7:28 PM, Brady, Mike wrote:
On 2014-08-01 08:03, Jonathan Bayer wrote:
Has anyone developed a working spec file for Bacula 7?
I've taken what is shipped, but had to modify it extensively.
I'll post it here once I've gotten it working nicely
Silly mistake on my part.
I needed to create the database :-)
JBB
On 8/1/2014 8:56 AM, Jonathan Bayer wrote:
Only that whenever I try to go into bconsole, bacula-dir exits
Your files are good, but I don't know if this is a problem with 7.0.5
or the config files.
I'm using the files
Has anyone developed a working spec file for Bacula 7?
I've taken what is shipped, but had to modify it extensively.
I'll post it here once I've gotten it working nicely, but was hoping to
find a better one.
JBB
--
Hi,
We have a few users who, for various reasons, constantly create and
delete huge files (hundreds of gigs). I'd like to exclude these from
the backup process.
How can I do that, since I don't know where they can appear or what
their names are?
Thanks in advance.
JBB
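(One approach, if the users can be asked to cooperate: Bacula's Exclude Dir Containing option skips any directory holding a marker file, so users drop a tag file next to their huge scratch data. A sketch; the fileset name and paths are made up.)

```conf
FileSet {
  Name = "HomeDirs"
  Include {
    Options {
      signature = MD5
      Exclude Dir Containing = .nobackup   # skip any dir users tag
    }
    File = /home
  }
}
```

A user would then run, e.g., `touch /home/alice/scratch/.nobackup` and that whole tree is skipped.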
Well, if it has been more than 30 days, the entries for each file have
been pruned from the database. However, the job is being retained for 6
months. Even if the file entries have been pruned, you still have the
option to restore ALL files from a job, until the job itself is pruned.
JBB
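(Those two retention windows live in the Client resource; a sketch with the same numbers. The client name, address, and password are hypothetical.)

```conf
Client {
  Name = fileserver-fd
  Address = fileserver.example.com
  Catalog = MyCatalog
  Password = "fd-secret"
  File Retention = 30 days   # file entries pruned after 30 days
  Job Retention = 6 months   # job records kept for 6 months
  AutoPrune = yes
}
```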
On 6/14/2013
Seems like you asked for a full restore, and one or more of the jobs
which were in the list were more than 30 days old.
Is it possible that yesterday's job was an incremental or differential,
and the last full was more than 30 days old?
JBB
On 6/14/2013 2:00 PM, Jeff MacDonald wrote:
Yes,
The manual states the following:
Run Before Job = command
The specified command is run as an external program prior to running
the current Job. This directive is not required, but if it is defined,
and if the exit code of the program run is non-zero, the current Bacula
job will be canceled.
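(As a concrete illustration of that directive; the job name and script path are hypothetical.)

```conf
Job {
  Name = "NightlyBackup"
  JobDefs = "DefaultJob"
  # If this script exits non-zero, the job is canceled
  # before any data is moved.
  RunBeforeJob = "/usr/local/bin/pre-backup-check.sh"
}
```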
I would like to be able to monitor the number of running jobs with our
monitoring system (Zabbix). To do this I need a script which will
return the number without any additional text.
Is this available anywhere? I don't see an easy way to do this.
Thanks
JBB
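(Lacking a ready-made script, a direct catalog query does the trick; in Bacula's catalog, running jobs carry JobStatus 'R'. A sketch against a PostgreSQL catalog; the database name is an assumption.)

```sql
-- Count currently running jobs in the Bacula catalog
SELECT COUNT(*) FROM Job WHERE JobStatus = 'R';
```

Wrapped as something like `psql -t -A -d bacula -c "…"`, this returns a bare number that Zabbix can consume directly.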
Thanks. The SQL is what I needed, I already have scripts which are
querying the database for other info.
JBB
On 6/11/2013 5:26 PM, Marcin Haba wrote:
Dnia 2013-06-11, wto o godzinie 16:27 -0400, Jonathan Bayer pisze:
I would like to be able to monitor the number of running jobs with our
Hi,
We are using Bacula to backup a number of VMs. The backup is being done
to disk volumes.
The configuration I've set up is still being tuned. In the meantime, I
found that there are a number of disk volumes where, although the file(s)
themselves are rather large, all the files on the jobs
If you were trying to use this, the set -u was misplaced. See below
for the correct location
JBB
On 6/7/2013 11:07 AM, Jonathan Bayer wrote:
Hi,
We are using Bacula to backup a number of VMs. The backup is being
done to disk volumes.
The configuration I've set up is still being tuned
wrote:
On 05/29/13 08:12, Jonathan Bayer wrote:
A question.
If the pool is specified both in the schedule and the job resource, which
takes priority if they are different?
Thanks
JBB
The Pool specified in the Schedule resource will override the Pool in the Job
or JobDef resource.
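(Which is what makes per-run pool overrides in the Schedule useful; a sketch, with hypothetical pool and schedule names.)

```conf
Schedule {
  Name = "WeeklyCycle"
  # The Pool= here overrides the Pool in the Job resource
  Run = Level=Full Pool=FullPool 1st sun at 23:05
  Run = Level=Differential Pool=DiffPool 2nd-5th sun at 23:05
  Run = Level=Incremental Pool=IncrPool mon-sat at 23:05
}
```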
FYI: I
To both Bill and John:
Thank you, that is exactly what I needed!
JBB
On 5/26/2013 11:52 AM, Bill Arlofski wrote:
On 05/26/13 08:42, Jonathan Bayer wrote:
Hello all,
Our system is doing backups to a local hard disk (very big). I'd like
to have three devices set up, one for a full backup
Hello all,
Our system is doing backups to a local hard disk (very big). I'd like
to have three devices set up, one for a full backup, one for a
differential backup and one for incremental.
I've gotten the devices set up, and the pools set up, along with the
appropriate media types. What I
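(One way to route each backup level to its own device is the per-level pool directives in the Job resource; all names below are hypothetical, and each Pool's Storage directive would point at the matching device.)

```conf
Job {
  Name = "LocalDiskBackup"
  JobDefs = "DefaultJob"
  Full Backup Pool = FullPool          # its Storage -> full device
  Differential Backup Pool = DiffPool  # its Storage -> differential device
  Incremental Backup Pool = IncrPool   # its Storage -> incremental device
}
```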
22, 2013 at 4:38 PM, Jonathan Bayer
linuxguruga...@gmail.com mailto:linuxguruga...@gmail.com wrote:
I remember that a while ago, if a directory was included 2 times (such
as the following):
Include {
File = /
File = /usr
}
and /usr was on the same filesystem
Running Bacula 5.2
I'm trying to back up a PostgreSQL database using FIFOs. The following is
the fileset:
# List of files to be backed up
FileSet {
Name = "Postgresql dump"
Include {
Options {
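(The excerpt cuts off there; for reference, a FIFO-based dump fileset generally needs readfifo turned on, plus a RunBeforeJob that starts the writer. A sketch; the paths are assumptions, not the poster's actual config.)

```conf
FileSet {
  Name = "Postgresql dump"
  Include {
    Options {
      signature = MD5
      readfifo = yes   # read data from the FIFO, not the node itself
    }
    File = /var/lib/bacula/pgdump.fifo
  }
}
```

Something must be writing into the FIFO while the job reads, e.g. a RunBeforeJob script backgrounding `pg_dumpall > /var/lib/bacula/pgdump.fifo &`.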
I remember that a while ago, if a directory was included 2 times (such
as the following):
Include {
File = /
File = /usr
}
and /usr was on the same filesystem, it would be backed up two times.
Is this still the case?
I'm asking because I have a fileset definition which lists a number
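(This behavior ties to the onefs option: with the default onefs = yes Bacula won't cross mount points, so only separate filesystems need their own File= line. If overlapping includes are still saved twice, the safe shape is a sketch like this.)

```conf
FileSet {
  Name = "RootFS"
  Include {
    Options {
      signature = MD5
      onefs = yes      # default: do not cross mount points
    }
    File = /           # /usr on the same filesystem is already covered
  }
}
```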
I've seen conflicting comments online about this.
I know this used to work.
Assuming only Linux systems, and a further assumption that they will all
be RHEL based systems, does Bacula support any way of bare-metal
restores? And to be even more specific, these would all be VMs.
Thanks in
We are using Zabbix for our monitoring, and I have a script which
updates a template whenever a new bacula job is created.
Once I'm done cleaning it up, I'll be posting it on my blog, and will
post a notice here.
JBB
On 4/29/2013 6:38 PM, Bill Arlofski wrote:
On 04/29/13 18:13, Joseph
, Jonathan Bayer wrote:
Hi,
I'm looking for a way to get daily and weekly automated report. I've
found a few different projects:
send_bacula_backup_report-0.4 - sends a list of all jobs in the past x days
bacula-reports - works by consolidating emails; I'll know if it works
tomorrow.
breport
Hi,
I'm looking for a way to get daily and weekly automated report. I've
found a few different projects:
send_bacula_backup_report-0.4 - sends a list of all jobs in the past x days
bacula-reports - works by consolidating emails; I'll know if it works
tomorrow.
breport - Java based, but
Hi,
We are using Bacula for local backup, to local hard disks. I am looking
into some sort of remote system, where I can copy files there for
backup.
I'd like to be able to have Bacula open a new disk file, and
automatically label it with a sequential number for every backup. This
way I can
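(The automatic naming described above is what Label Format in the Pool resource is for; combined with LabelMedia = yes on the Device, Bacula generates volume names itself. A sketch; the pool name is hypothetical, and my recollection is that a bare prefix gets a unique number appended.)

```conf
Pool {
  Name = Remote
  Pool Type = Backup
  Label Format = "Remote-"   # Bacula appends a number per volume
  Maximum Volume Jobs = 1    # force a fresh volume for every backup
}
```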
Hi,
Running Bacula 3.0.1 on an Ubuntu 9.04 system.
Main backups are done to an internal hard disk. Pool is called
Internal, volume name is Internal
VirtualFull backups are done to an external hard disk. Pool is called
External, volume name is D1
The internal pool is mounted in:
at the storage on the job's log. You may want to choose
to put the storage= in the schedule resource just to make sure.
Dirk
On Mon, 2009-06-15 at 10:52 -0400, Jonathan Bayer wrote:
Hi,
Running Bacula 3.0.1 on an Ubuntu 9.04 system.
Main backups are done to an internal hard disk. Pool
On Mon, 2009-06-15 at 21:09 +0200, Arno Lehmann wrote:
Hello,
15.06.2009 19:49, Dirk Bartley wrote:
OK, just re-read the storage= option under pool and it is a little bit
confusing to me because I always thought the storage came from the job
defaults, then the schedule resource for that
On Mon, 2009-06-15 at 16:17 -0400, John Drescher wrote:
On Mon, Jun 15, 2009 at 3:38 PM, Jonathan Bayerjba...@regiscope.com wrote:
On Mon, 2009-06-15 at 21:09 +0200, Arno Lehmann wrote:
Hello,
15.06.2009 19:49, Dirk Bartley wrote:
OK, just re-read the storage= option under pool and it
I got excited when I read the description of the VirtualFull option. In
my excitement I neglected to re-read the description of the storage
types.
My mistake, thank you all for your comments.
JBB
On Mon, 2009-06-15 at 16:59 -0400, John Drescher wrote:
That is fine if a user can insert the
My Bacula 3.0.1 system (installed on Ubuntu 9.04) is doing File backups
to an internal, terabyte volume. Bacula is using PostgreSql as a
database backend.
The system is an Ubuntu 9.04 desktop system, with a terabyte internal
volume for the backups, and an external terabyte volume for VirtualFull
Additional info (trouble with my original email address, so am using
this one for now)
I purged the volume on the external pool, and tried restarting one of
the jobs which were failing. This time it is working.
JBB
On Fri, 2009-06-05 at 11:35 -0400, Jonathan B Bayer wrote:
Hello