I'm testing Bacula, so this is not a production setup, and I've run into something (an issue, a problem, a bug, I'm not sure which). During my testing I'm backing up a file system of about 700 GB to disk. A Full backup took around 21 hours with compression turned on. While this was happening, the remaining backup jobs were queued waiting for the big job to finish. Once the big 700 GB job was done, the remaining jobs ran and all was well.
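In case it helps with diagnosis, I believe the run history (including the level Bacula actually recorded for each run) can be inspected from bconsole like this (BigFS-Backup is just a placeholder for my real job name):

```
*list jobs
*llist jobs job=BigFS-Backup
```

The second command should show the JobLevel (F/I/D) and termination status for each run of that job.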
The problem came when it was time to run the big job again. I thought it would have run an Incremental, as all the other backup jobs have done, but Bacula decided to run a Full job again. I'm not sure why, or where to look to figure out why. Does anyone have any thoughts?

Some other minor questions:

- Does Bacula use the /etc/dumpdates file? (I don't think so.)
- Are there any limits on the file, folder, or file system sizes that Bacula can handle?

Thank you!
mike

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users