Of course, you could always try simulating the test on your own workstation or another test machine. I mention this only because I usually encourage the "try it and see" method*, as that is usually the most educational.
(*) Of course, this method should not be used for obviously risky tests, like "what happens if I delete all my database and recovery log volumes?"

With that said, I see some problems:

1) If the file specification contains blank spaces, it should be enclosed in quotes, e.g.:

   OBJECTS='"\\hmpg1027\dept0162\Public\Grin\To Archive"'

   Note the double quotes around the file spec, surrounded in turn by single quotes around the double-quoted string. This is discussed under the DEFINE SCHEDULE command in the TSM Administrator's Reference.

2) "To Archive" would appear to be a directory, so it should be terminated with "\":

   OBJECTS='"\\hmpg1027\dept0162\Public\Grin\To Archive\"'

3) Make sure the account used to run the schedule has access to network resources. The local system account does not have this access, so you will need to use another account that can access the network resource at the time the scheduled event runs.

4) For the best chance of success the first time out, I suggest testing in a minimum of two phases:

   a) After correcting the schedule definition, update the start time so it will run now. Then run "dsmc schedule" from the client machine to verify that it picks up the schedule correctly and starts processing the data (log in as the same account that will be used to perform the regularly scheduled archives). You don't need to let it run to completion -- a minute or two is enough; this test acts as a "sanity check". If you use the QUIET option, disable it until you can confirm the schedule behaves correctly. If the schedule doesn't run correctly, make any necessary corrections and retest until it works.

   b) Once test (a) runs correctly, you can delete the archived data (assuming it contains only archive data from the test -- be careful not to delete any "real" archive data).
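Putting items 1 and 2 together, the corrected definition might look something like the sketch below, entered from an administrative client (dsmadmc) macro. This is only an illustration -- the domain and schedule names are taken from your Q SCHEDULE output, but verify the exact syntax (including whether STARTTIME=NOW is appropriate for your test) against the Administrator's Reference for your TSM server level:

```
/* Sketch of the corrected schedule; "-" continues a line in an admin macro. */
update schedule desktop hmpg1027_archives -
    options="-archmc=arc365 -subdir=yes -deletefiles" -
    objects='"\\hmpg1027\dept0162\Public\Grin\To Archive\"' -
    starttime=now
```

After updating the schedule, run "dsmc schedule" in the foreground on the client (as described in step 4a) so you can watch it pick up and start the archive.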
Unless you already do a lot of archiving of this data on a daily basis, you should be able to identify the test data via the archive DESCRIPTION field, which (based on your definition) will default to "Archive Date: mm/dd/yyyy".

Then update the schedule's start time to when you actually want it to run. Start the scheduler service on the client, and check the dsmsched.log to make sure it picked up the archive schedule. Then see how it runs.

Regards,

Andy

Andy Raibeck
IBM Software Group
Tivoli Storage Manager Client Development
Internal Notes e-mail: Andrew Raibeck/Tucson/[EMAIL PROTECTED]
Internet e-mail: [EMAIL PROTECTED]

The only dumb question is the one that goes unasked.
The command line is your friend.
"Good enough" is the enemy of excellence.

"ADSM: Dist Stor Manager" <[EMAIL PROTECTED]> wrote on 07/13/2004 06:42:00:

> Hello everyone!
>
> I am beginning to automate the archive processes for the servers. I was
> wondering if I have the correct parameters to archive all directories,
> subdirectories and files for \\hmpg1027\dept0162\Public\Grin\To Archive
> with a management class of arc365 and then delete all of the files after
> they were successfully archived to TSM? I will be testing this with a
> subdirectory, but the users have not provided me with test data to see if
> this works... Thank you in advance for any advice!
>
>             Policy Domain Name: DESKTOP
>                  Schedule Name: HMPG1027_ARCHIVES
>                    Description:
>                         Action: Archive
>                        Options: -archmc=arc365 -subdir=yes -deletefiles
>                        Objects: \\hmpg1027\dept0162\Public\Grin\To Archive
>                       Priority: 5
>                Start Date/Time: 07/13/2004 23:00:00
>                       Duration: 1 Hour(s)
>                         Period: 1 Month(s)
>                    Day of Week: Any
>                     Expiration:
> Last Update by (administrator): LIDZR8V
>
> ********************************
> Joni Moyer
> Highmark
> Storage Systems
> Work:(717)302-6603
> Fax:(717)302-5974
> [EMAIL PROTECTED]
> ********************************