Greets - I have a new installation of Bacula on Debian Jessie
(5.2.6+dfsg-9.3) with a PostgreSQL backend. This is not my first Bacula
installation (I have five other separate instances running on different
tape units, some single TL2000 drives, some library changers, all
happy) -- except, maybe, for this new one.
I've never noticed exactly when Bacula writes data to the file and
filename tables - I assumed it did it on the fly.
In this current instance, I am running a 4TB+ job on a single TL2000 SAS
drive, and I'm currently on tape 2 of probably 5 or 6 (depending on
compression). However, there are only 7 entries in the "file" table:
1;1;5;4;1;0;0;"GgB ZgAI EHt C A A A BAA BAA I BaB4NI BaB4NI BaB4NI A A C";"0"
2;2;5;7;1;0;0;"gC B EHt B A A A BAAA BAA IA A A A A A C";"0"
3;3;5;3;1;0;0;"gy B EHt B A A A BAAA BAA IA A A A A A C";"0"
4;4;5;6;1;0;0;"gR B EHt B A A A BAAA BAA IA A A A A A C";"0"
5;5;5;5;1;0;0;"gi B EHt B A A A BAAA BAA IA A A A A A C";"0"
6;6;5;2;1;0;0;"GgB ZgAH EHt C A A A BAA BAA I BaB4NE BaB4NE BaB4NE A A C";"0"
7;7;5;1;1;0;0;"GgB ZgAC EHt I A A A BAA BAA I BaB4eL BaB4NI BaB4NI A A C";"0"
and only one entry in the "filename" table:
1;"''"
Which I find rather odd.
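For anyone wanting to reproduce the check, a rough count from psql
looks something like this (the lowercase table names assume the stock
catalog schema created by Bacula's make_postgresql_tables script):

    -- total rows in the two attribute tables
    SELECT COUNT(*) AS file_rows     FROM file;
    SELECT COUNT(*) AS filename_rows FROM filename;

    -- and broken down per job, to see which jobs have attributes
    SELECT jobid, COUNT(*) AS files
      FROM file
     GROUP BY jobid
     ORDER BY jobid;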
I compared against my other sites, and they have plenty of entries, all
as expected (filenames, hashes, etc).
The job is a simplified one: it references a pool of pre-labeled tapes
("Default"), the catalog "MyCatalog", and five specific folders backed
up at Full level, and it was started manually from bconsole.
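For reference, the catalog side of the running job can also be watched
with a quick query against the job table, something like the following
(again assuming the stock PostgreSQL schema):

    -- job status, file count and byte count for each job, newest last
    SELECT jobid, name, level, jobstatus, jobfiles, jobbytes
      FROM job
     ORDER BY jobid;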
The database as a whole is functioning: there are plenty of log
entries, and the media and status tables (my favorite) are being
updated, but there is no data of value in file and filename.
The tests listed in the FAQ show that the database and sequences (the
auto-increment counters, for the MySQL crowd) are in OK shape and all
exist. Neither the database log nor the verbose log shows any database
errors or errors related to "MyCatalog".
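For completeness, the sequence check boils down to something like this
in psql (the sequence names below assume the default schema and may
differ on a customized install):

    -- list the sequences, then peek at the counters behind the
    -- file and filename primary keys
    \ds
    SELECT last_value FROM file_fileid_seq;
    SELECT last_value FROM filename_filenameid_seq;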
There are no other tables or databases visible that seem to hold
temporary storage info. Is this normal behavior, or is my large backup
slated to be an orphan?
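In case it helps, "no other tables visible" just means a plain listing
of the public schema, e.g.:

    -- list every table in the catalog database (run from psql)
    SELECT tablename
      FROM pg_tables
     WHERE schemaname = 'public'
     ORDER BY tablename;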
If a fix is needed, I'd prefer to make it before running another large
backup or rolling into the scheduled daily backups (I'd rather not have
to bextract every job afterwards).
I have another day-ish of this job running (hey, it's 4TB of
USB-attached storage going into the vault, no speed records to be
broken), so any thoughts would be appreciated.
Regards,
Ted.