Justin,

Not sure what your timeline is, but we run a script to pull the status daily and compile it into a DB. That way there is not much overhead, since it is only the last 24 hours of backups, but the DB gives us the flexibility to run reports, etc.

Thanks,
Phil
456-3136
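A minimal sketch of the daily-compile step Phil describes, assuming the last 24 hours of backed-up paths have already been dumped, one path per line, by whatever catalog listing command you use; the file locations and table layout here are made up:

#!/usr/bin/env python
# Minimal sketch: fold the nightly dump of backed-up paths into a
# SQLite table. LISTING, DB, and the table layout are hypothetical.
import sqlite3

LISTING = "/var/tmp/bplist_last24h.txt"   # hypothetical nightly dump
DB      = "/var/tmp/backup_catalog.db"    # hypothetical catalog DB

conn = sqlite3.connect(DB)
conn.execute("CREATE TABLE IF NOT EXISTS backed_up ("
             "path TEXT PRIMARY KEY, "
             "first_seen TEXT DEFAULT CURRENT_TIMESTAMP)")
fh = open(LISTING)
for line in fh:
    path = line.rstrip("\n")
    if path:
        # INSERT OR IGNORE keeps the nightly rerun idempotent
        conn.execute("INSERT OR IGNORE INTO backed_up (path) VALUES (?)",
                     (path,))
fh.close()
conn.commit()
conn.close()

With the path column as the primary key, checking whether a given file was ever backed up becomes a single indexed SELECT instead of a catalog scan.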
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Justin Piszcz
Sent: Monday, March 26, 2007 4:27 PM
To: Veritas-bu@mailman.eng.auburn.edu
Subject: [Veritas-bu] Checking to see if millions of files are backed up?

If one is to create a script to ensure that the files on the filesystem are backed up before removing them, what is the best data-store model for doing so? Obviously, if you have > 1,000,000 files in the catalog and you need to check each of those, you do not want to run bplist -B -C -R 999999 /path/to/file/1.txt for each file. However, you do not want to grep "1" one_gigabyte_catalog.txt either; there is too much overhead in either case. I have a few ideas that involve neither of these, but I was wondering if anyone out there had already done something similar that was high performance?

Justin.

_______________________________________________
Veritas-bu maillist - Veritas-bu@mailman.eng.auburn.edu
http://mailman.eng.auburn.edu/mailman/listinfo/veritas-bu
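A minimal sketch of one low-overhead answer to Justin's question, assuming the catalog can be dumped once to a flat file with one backed-up path per line (the file name is hypothetical): load the dump into an in-memory hash set a single time, then each per-file check is an O(1) lookup rather than a bplist call or a grep over the full dump.

#!/usr/bin/env python
# Minimal sketch: one-time catalog dump -> in-memory set -> O(1)
# membership test per candidate file. CATALOG_DUMP is hypothetical.
import sys

CATALOG_DUMP = "one_gigabyte_catalog.txt"

backed_up = set()
fh = open(CATALOG_DUMP)
for line in fh:
    backed_up.add(line.rstrip("\n"))
fh.close()

# Candidate files to verify arrive on stdin, one path per line
for line in sys.stdin:
    path = line.rstrip("\n")
    if path in backed_up:               # O(1) hash lookup per file
        print("OK-to-remove\t%s" % path)
    else:
        print("NOT-backed-up\t%s" % path)

Loading a gigabyte of paths into RAM costs several gigabytes of memory; if that is too much, the same one-pass idea works against a SQLite table like the one sketched above instead of an in-memory set.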