Austin wrote:
>
> On 2003.05.28 20:12, David Walser wrote:
>> > I use David's script. Works great!
>> Glad to hear it.
>
> David,
> I use your script daily (on the command line) and it works great.
> One problem I'm worried about though...
> I want to switch it to a cron job, but I noticed that it just hangs on rsync
> errors.
> Like a "too many connections" error, or something similar.
I noticed that. The Perl code basically says: open an rsync process whose
stdout becomes a filehandle, and if rsync exits with an error, die (i.e., the
script should exit). That isn't happening, which leads me to believe that
rsync is failing to die. I think rsync complains about the too many
connections (on stderr) and then hangs, so rsync appears to be the problem.
If someone can confirm that, we need to notify the rsync authors so they can
fix it. If rsync is dying properly, then I need some help figuring out why
the Perl isn't working right. (There's a rough sketch of the pattern at the
bottom of this mail.)

> I'm afraid of stale instances piling up.
> Does your script have a timeout that I've missed, or can one be added to the
> rsync call?

The script doesn't currently use the timeout option (one could be added to
the rsync call; see the second sketch below). I hope it doesn't prove
necessary. If you figure anything more out about this, let me know.

One thing I don't remember is what happened the one time the first rsync
(line 58) worked but didn't hit the "too many connections" error until the
next one. Usually, if it gets the error at all, it's on the first rsync,
because between the later ones it usually reconnects fast enough to reclaim
the slot it was using.

> Austin
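
P.S. For anyone following along who hasn't read the script, here's roughly
the pattern I mean. This is a from-memory sketch, not a paste from the
script, and the server/module names are made up:

    #!/usr/bin/perl -w
    use strict;

    # Open rsync on a pipe; its stdout becomes the RSYNC filehandle.
    # Note: this open() only fails if the fork itself fails -- it says
    # nothing about whether rsync later exits with an error.
    open(RSYNC, "rsync rsync://example.org/some-module/ |")
        or die "couldn't start rsync: $!\n";

    while (my $line = <RSYNC>) {
        # ... parse the listing here ...
    }

    # close() waits for rsync to exit and leaves its status in $?,
    # so a non-zero exit is caught here.
    close(RSYNC);
    die "rsync exited with status " . ($? >> 8) . "\n" if $?;

If rsync prints the "too many connections" complaint to stderr and then just
sits there instead of exiting, the read in the while loop blocks forever and
the die at the bottom is never reached -- which would match the hang you're
seeing.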
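
And if a timeout does turn out to be necessary, the obvious place to start
would be rsync's own --timeout option (an I/O timeout in seconds), since that
only means changing the command line. Again just a sketch, with the number
picked out of the air:

    # Ask rsync itself to give up if the connection goes idle for 5 minutes.
    open(RSYNC, "rsync --timeout=300 rsync://example.org/some-module/ |")
        or die "couldn't start rsync: $!\n";

The other option would be an alarm() in the Perl around the whole run, so a
wedged rsync can't leave stale cron jobs piling up, but I haven't tried that
and I'm not sure the alarm would actually interrupt a read blocked on the
pipe on every Perl out there.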
