I would like to hear about others' experiences with setting schedule randomization and maximum scheduled sessions. What I am looking for is the best compromise for efficiency. We currently run a 6-hour start window for our nightly backups with maximum scheduled sessions set to 36, backing up approximately 74 nodes per night. This works well for us most of the time, but I am looking to improve efficiency for the occasional nights when a few nodes back up a larger-than-normal amount of data and the backups run into the daytime.

In the past we used a lower randomization percentage; however, on occasion all the nodes hitting the server at once brought the recovery log alarmingly close to filling. (We are in normal mode.) A couple of times the log filled and caused the operator system console com buffers to overflow. The log size was increased, as was the schedule randomization percentage, to reduce the parallel load on the server.

My question is: would I gain efficiency by reducing the maximum scheduled sessions to a level that doesn't stress the server and dropping randomization to 0? My thought is that this gives maximum efficiency, since if I set max sched sessions to, say, 25, there would always be 25 sessions running in parallel, and as soon as one completed another would immediately kick in rather than waiting a random amount of time as set by the schedule randomization percentage. Is my thinking correct, or do you see a flaw in my logic?

Thanks,
    Al

Alan Davenport
[EMAIL PROTECTED]
Selective Insurance
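For what it's worth, the reasoning in the last paragraph can be checked with a toy queueing sketch. This is not TSM itself, just a greedy scheduler where each node becomes eligible at a start offset and runs as soon as a session slot is free; the node count, durations, and offsets below are made-up round numbers for illustration, not measurements:

```python
import heapq

def simulate(start_times, durations, max_sessions):
    """Greedy schedule: each node is eligible at its start offset (hours)
    and begins as soon as a session slot frees up. Returns the hour at
    which the last backup finishes."""
    running = []  # min-heap of finish times for in-flight sessions
    last_finish = 0.0
    for start, dur in sorted(zip(start_times, durations)):
        # release slots for sessions already finished by this node's start
        while running and running[0] <= start:
            heapq.heappop(running)
        if len(running) < max_sessions:
            begin = start                   # slot free: start immediately
        else:
            begin = heapq.heappop(running)  # wait for the earliest finisher
        last_finish = max(last_finish, begin + dur)
        heapq.heappush(running, begin + dur)
    return last_finish

# 74 nodes of 1 hour each, 25 sessions, randomization 0:
# three back-to-back waves (25 + 25 + 24), done in 3 hours.
print(simulate([0.0] * 74, [1.0] * 74, 25))  # 3.0

# Same nodes with starts spread across the window (one every 12 minutes):
# slots sit idle waiting for start times, so the run stretches out.
print(simulate([i * 0.2 for i in range(74)], [1.0] * 74, 25))
```

Under these toy assumptions the logic holds: with randomization at 0 and a session cap the server can tolerate, every freed slot is refilled immediately, while spreading the starts trades elapsed time for a gentler peak load.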
