Hello,

On Monday 29 January 2007 21:19, Alan Davis wrote:
> Kern,
>
> Thanks for the fast response. To clarify a bit - the file list that I
> would be using would consist of individual files, not directories. There
> would be no exclude list, as only the files that I need backed up would
> be listed.
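As an aside for anyone trying this: one way to hand a pre-generated list of individual files to Bacula is the FileSet "read names from a file" syntax. The sketch below is illustrative only - the FileSet name and the path to the list file are made up, not from this thread - and is based on the documented `<` prefix on the `File` directive, which tells the Director to read one file name per line from the named file at job start:

```
# Hypothetical FileSet; names and paths are examples only.
FileSet {
  Name = "db-generated"
  Include {
    Options {
      signature = MD5
    }
    # The "<" prefix makes the Director read the backup list
    # (one absolute path per line) from this file when the job runs.
    File = "</var/lib/bacula/release-files.list"
  }
}
```

The `|program` form (run a program and read its stdout) may be an alternative if the list should be regenerated from the db at job time; check the FileSet resource documentation for the exact semantics on your version.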
Yes, my answer was based on that assumption.

> I have about 30TB of data files spread over several hundred directories.
> A true incremental backup will spend large amounts of time determining
> which files have been changed or added. The information about the
> modified or new files is stored in a db as a side-effect of processing
> the files for release to production, so building a file list is trivial.
> The only problem would be the FD's ability to handle a file list of
> 10K+ entries.

All I can say is to try it, but I won't be surprised if it chews up a lot
of CPU. However, doing the equivalent of an incremental backup by means of
an exclusion list doesn't seem possible to me. Bacula is really quite fast
at traversing a very large filesystem during an incremental backup.

> Thanks.
>
> ----
> Alan Davis
> Senior Architect
> Ruckus Network, Inc.
> 703.464.6578 (o)
> 410.365.7175 (m)
> [EMAIL PROTECTED]
> alancdavis AIM
>
> > -----Original Message-----
> > From: Kern Sibbald [mailto:[EMAIL PROTECTED]
> > Sent: Monday, January 29, 2007 2:47 PM
> > To: bacula-users@lists.sourceforge.net
> > Cc: Alan Davis
> > Subject: Re: [Bacula-users] Experience with extremely large fileset
> > include lists?
> >
> > On Monday 29 January 2007 18:17, Alan Davis wrote:
> > > I understand that one of the projects is to incorporate features
> > > that will make very large exclude lists feasible, but does anyone
> > > have experience, good or bad, with very large include lists in a
> > > fileset?
> > >
> > > I'm looking at the possibility of building a backup list from a db
> > > query that has the potential to return tens of thousands of files
> > > stored in hundreds of directories.
> >
> > For each file in the directories you specify (normally your whole
> > filesystem), Bacula will do a linear search through the exclude list.
> > Thus it could be extremely CPU intensive.
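The cost Kern describes can be sketched outside Bacula. The Python below is illustrative only - these are not Bacula internals, and the function and path names are invented - but it shows the shape of the problem: a linear scan costs one pass over the whole list for every file examined, while a hashed structure makes each membership check O(1) on average after a one-time build:

```python
# Illustrative sketch, not Bacula code: per-file linear scan vs. hash lookup.
exclude_list = [f"/data/dir{i}/file{i}.dat" for i in range(10_000)]

def excluded_linear(path, excludes):
    # O(len(excludes)) comparisons for every file examined.
    for pattern in excludes:
        if path == pattern:
            return True
    return False

# Built once, O(N); afterwards each lookup is O(1) on average.
exclude_set = set(exclude_list)

def excluded_hashed(path, excludes):
    return path in excludes

# Both agree; only the per-file cost differs.
assert excluded_linear("/data/dir42/file42.dat", exclude_list)
assert excluded_hashed("/data/dir42/file42.dat", exclude_set)
```

With 10K+ list entries checked against every file in a 30TB tree, the difference between the two per-file costs is exactly the CPU overhead being discussed.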
> > For a large list (more than 1000 files), I believe the list needs to
> > be put into a hash tree, which is code that does not exist.
> >
> > > Thanks

-------------------------------------------------------------------------
Take Surveys. Earn Cash. Influence the Future of IT
Join SourceForge.net's Techsay panel and you'll get the chance to share
your opinions on IT & business topics through brief surveys - and earn cash
http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users