My _tiny_ little nook: probably not undocumented, but it sure took me a long
while to find, and when I did, it solved all my problems!
The option to "Set source files' backup time" in a script was very useful
for me... you can then build a selector that only backs up files modified
since their individual backup time...
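The idea behind that selector, as a rough sketch in Python (the function and the per-file timestamp dictionary are made up for illustration; Retrospect itself keeps backup times in its own catalog, not like this):

```python
import os

def needs_backup(path, backup_times):
    """Select a file only if it was modified after its own
    individual backup time; files never backed up always match.

    backup_times is a hypothetical mapping of path -> timestamp
    of that file's last backup (stand-in for Retrospect's catalog).
    """
    backed_up = backup_times.get(path)
    if backed_up is None:
        return True  # never backed up before
    return os.path.getmtime(path) > backed_up
```

Nothing more than the ordinary incremental rule, but applied against each file's *recorded* backup time rather than against the contents of the current storage set, which is what matters once the set has been discarded.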
Yes, I hear you say, but WHY? Retrospect already does incremental backups!
Tricky to explain, but what my backup script essentially does is COLLECT
(copies of) data from machines to a file storage set... When this reaches
~600MB, we burn it to CD, discard the file storage set, and start a new
one. Now we don't want the NEW storage set to immediately fill up with
stuff that's already been backed up and burned...
If you understand the explanation, you probably could have solved the
problem yourself! And no, we couldn't use an "Archive" script to remove the
data as it was backed up, because the users need it to be left on their
machines.
Now, if anyone could tell me how to get Retrospect to start a new (file)
storage set AUTOMATICALLY when a size limit is exceeded, that would be
great! (And no, sorry, we don't want to trust packet-written CDs, written
"live" as the backup proceeds; we like our file storage sets!)
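What I'd like Retrospect to do automatically looks something like this sketch in Python (the class, names, and the 600MB threshold are purely illustrative; there is no such Retrospect scripting API that I know of):

```python
SIZE_LIMIT = 600 * 1024 * 1024  # ~600MB, roughly one CD's worth

class StorageSetRotator:
    """Hypothetical rotation logic: append items to the current
    storage set until the size limit would be exceeded, then
    start a fresh set (i.e. the point at which we burn a CD)."""

    def __init__(self, limit=SIZE_LIMIT):
        self.limit = limit
        self.sets = [[]]        # each inner list is one storage set
        self.current_size = 0   # bytes in the current set

    def add(self, name, size):
        """Add a backed-up item, rolling over to a new set
        when the current one would overflow the limit."""
        if self.current_size + size > self.limit and self.current_size > 0:
            self.sets.append([])    # time to burn and start anew
            self.current_size = 0
        self.sets[-1].append(name)
        self.current_size += size
```

That's the whole ask: the check-and-roll-over step done by the backup software itself, rather than by a human noticing the set has grown past 600MB.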
Anyway, that's my two ha'pence worth (nearly equivalent to 2 cents).
Tom Lawton Senior Instrumentation Technician
EnFlo (Environmental Flow Research Centre)
School of Mechanical and Materials Engineering
University of Surrey
GUILDFORD Surrey GU2 7XH UK
Tel: +44 (0)1483 87 9687
Fax: +44 (0)1483 87 9546
Mobile: +44 (0)7715 126 377
email: [EMAIL PROTECTED]
SMS: [EMAIL PROTECTED]