Hi, I am wondering if I can set up a custom schedule that runs my backups
between 1:00 am and 5:00 am. I have a few processes that I just need to
back up at these times on a daily basis, and I don't want to run them all day
incrementally. Will I basically have to do this:
Schedule {
Name =
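A minimal sketch of such a Schedule resource, assuming one run per day at the start of the 01:00–05:00 window (the resource name and the full/incremental split are illustrative assumptions, not from the post):

```
Schedule {
  Name = "Nightly-Window"            # hypothetical name
  # Start every job at 01:00 so it can finish inside the
  # 01:00-05:00 window; weekly full, daily incrementals.
  Run = Level=Full sun at 1:00
  Run = Level=Incremental mon-sat at 1:00
}
```

The director only controls the start time; the jobs themselves simply have to be small enough to finish before 05:00.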
Hello,
For reasons of space and time I would like to limit the size. Is there a
way to make a job fail if the stored size is more than xx bytes? (Note: I
store the backups on disk.)
Matthieu.
Hi,
I'm having a problem with copy jobs. When a copy job starts, it correctly
chooses the jobs to be copied, but then all but the last job stop with
errors. The last job runs perfectly fine. When I then start the copy job again,
the chosen JobIDs are the ones from the run before minus the
On Fri, 25 Jun 2010 16:06:00 +0400, Matthieu Patou wrote:
> Hello,
> For reason of space and time I would like to limit the size, is there a
> way to make a job fail if the stored size is more than xx bytes (note: I
> do the storage on disks).
you could limit the pool size or the job run time (see the Bacula
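Thomas's two suggestions map onto standard director directives; a rough sketch, with names and values as illustrative assumptions:

```
# Cap total disk usage via the pool: fixed-size volumes,
# and a fixed number of them.
Pool {
  Name = "DiskPool"                # hypothetical name
  Pool Type = Backup
  Maximum Volume Bytes = 50G       # per-volume size cap
  Maximum Volumes = 10             # pool total of about 500G
}

# Or cancel any job that runs too long:
Job {
  Name = "LimitedJob"              # hypothetical name
  Max Run Time = 4h                # canceled after four hours
}
```

Once the pool is full, a job will wait for media (or error out) rather than keep growing, which is close to the "fail past xx bytes" behaviour asked for.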
Hello,
I've been trying to set up a backup of our Oracle database using Bacula,
following the steps at the link below:
http://wiki.bacula.org/doku.php?id=application_specific_backups:oracle_rdbms
However, I encounter the following errors during the backup process:
25-Jun 08:55 tapti18522-fd
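Recipes like the one on that wiki page generally dump the database to disk before the file backup runs; the usual shape is a RunBeforeJob hook (the script path below is a made-up placeholder, not from the wiki):

```
Job {
  Name = "Oracle-Backup"                    # hypothetical name
  # Export the database to a staging area first; if the
  # dump script exits non-zero, the backup job fails.
  RunBeforeJob = "/usr/local/bin/oracle_dump.sh"
}
```

Errors at this stage usually mean the RunBeforeJob script itself failed on the client, so it is worth running it by hand as the FD's user first.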
nagaraj mogaroy wrote:
> Hello,
> I've been trying to set up a backup of our Oracle database using Bacula,
> following the steps in the below link:
> http://wiki.bacula.org/doku.php?id=application_specific_backups:oracle_rdbms
> However, I encounter the following errors during the backup process:
Thomas Mueller tho...@chaschperli.ch wrote in message
news:i026rb$tr...@dough.gmane.org...
> On Fri, 25 Jun 2010 16:06:00 +0400, Matthieu Patou wrote:
>> Hello,
>> For reason of space and time I would like to limit the size, is there a
>> way to make a job fail if the stored size is more than xx
Hi!
I'm planning a remote backup between two sites in my company
that are connected by a 10 Mbps link. To be sure I will not have
problems with disaster recovery, I need to back up all the data in
location A to location B. Up to now we have had a local backup
working like a charm
Massimiliano Perantoni wrote:
> How to solve this problem?
Why not use rsync to keep B synchronised with A on a local (B)
array and perform your Bacula backups from this?
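A rough sketch of that rsync step, with host names and paths as made-up placeholders:

```
# From location A: mirror the data onto a local array at B
# over SSH. -a preserves metadata, --delete keeps the copy
# an exact mirror of the source.
rsync -a --delete /data/ backup-b.example.com:/mirror/data/
```

Bacula at B then backs up /mirror/data as a purely local FileSet, so only rsync's delta traffic crosses the 10 Mbps link.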
Regards,
Richard
I'm moving some servers across an untrusted network and am trying to enable
TLS, and also to enable Accurate backups so I don't have to do fulls over the
network. There is one specific client whose backup I'm not able to get to
complete (over a week now, at least 20 attempts); it always contacts the SD
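For reference, TLS between the daemons needs matching directives on both ends; a minimal sketch of the client half in bacula-fd.conf, with every name and path an assumption:

```
FileDaemon {
  Name = client1-fd                          # hypothetical name
  TLS Enable = yes
  TLS Require = yes                          # refuse plaintext peers
  TLS CA Certificate File = /etc/bacula/ca.pem
  TLS Certificate = /etc/bacula/client1.pem
  TLS Key = /etc/bacula/client1.key
}
```

The Director's Client resource (and the Storage resource, for the FD-to-SD data connection) needs the mirror-image TLS directives, or the session will hang or drop exactly as described.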
Massimiliano,
I would handle this with some clever scheduling. Create a few schedules.
Schedule {
  Name = Weekly-incremental
  Run = Level=Incremental mon-fri at 20:00
}
Schedule {
  Name = Weekly-full
  Run = Level=Full fri at 21:00
}
Schedule {
  Name = Catalog-backup
  Run = Full
Massimiliano Perantoni wrote:
>> How to solve this problem?
> Why not use rsync to keep B synchronised with A on a local (B)
> array and perform your bacula backups from this?
Assuming it's Linux data, and assuming they have an additional 4 TB (and
growing) of space at B, then rsync
I finished my conversion, and everything seems OK after a couple of
days of backups. I wrote up my experiences, as I did have to go through a
couple of different web pages to get all of the right commands and
needed some help from the list (thanks to all who replied); plus I
needed to modify
Is there an app to install the FD? Has anyone tried compiling Bacula in an ARM environment?
Mehma
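For what it's worth, a client-only build is the usual starting point on ARM; a rough sketch, assuming a native toolchain on the device (the prefix is an illustrative choice):

```
# Build only the File Daemon; the director and storage
# daemon are skipped entirely.
./configure --enable-client-only --prefix=/opt/bacula
make
make install
```

Cross-compiling is also possible but considerably fussier, since configure must be pointed at the target toolchain and sysroot.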