If one is to create a script to ensure that files on the filesystem are backed up before removing them, what is the best data-store model for doing so?
Obviously, if you have > 1,000,000 files in the catalog and you need to check each of them, you do not want to run bplist -B -C -R 999999 /path/to/file/1.txt once per file. But you do not want to grep "1" one_gigabyte_catalog.txt for every file either; both approaches carry too much overhead. I have a few ideas that involve neither of these, but I was wondering whether anyone out there had already done something similar that performs well?

Justin.
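P.S. For illustration only, here is a rough sketch of the kind of thing I mean: dump the catalog to a flat file once (e.g. with a single recursive bplist run), load the paths into an in-memory hash, and test each removal candidate against that, so every lookup is constant-time instead of a scan of the whole dump. The file names catalog_paths.txt and files_to_remove.txt below are hypothetical placeholders.

#!/usr/bin/env python
# Build a hash of backed-up paths with one linear pass over the
# catalog dump; afterwards each membership test is an O(1) lookup
# rather than a grep over a one-gigabyte file.

def load_catalog(path):
    backed_up = set()
    with open(path) as fh:
        for line in fh:
            backed_up.add(line.rstrip("\n"))
    return backed_up

def main():
    catalog = load_catalog("catalog_paths.txt")   # one-time catalog dump
    with open("files_to_remove.txt") as fh:       # candidates for removal
        for line in fh:
            candidate = line.rstrip("\n")
            if candidate in catalog:
                print("backed up, safe to remove: %s" % candidate)
            else:
                print("NOT in catalog, keep:      %s" % candidate)

if __name__ == "__main__":
    main()

If the dump is too big to hold in memory, the same idea works with a disk-backed index (a dbm file or an indexed SQLite table) in place of the in-memory set.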