Hello, maybe you can use flock; see perldoc -f flock. I have never used
this and don't know if it works in your case, but a rough sketch might
look like the following.
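This is untested, and the lock-file path and the third-party program
path are made-up placeholders -- any agreed-upon file works as the
lock, as long as every instance opens the same path:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    my $lockfile = '/tmp/thirdparty.lock';    # placeholder path

    open my $lock, '>', $lockfile
        or die "Cannot open $lockfile: $!";

    # LOCK_EX blocks until no other instance holds the lock, so only
    # one copy of the third-party program can ever run at a time.
    flock $lock, LOCK_EX
        or die "Cannot lock $lockfile: $!";

    # Run the single-threaded third-party program here.
    system('/path/to/thirdparty') == 0    # placeholder path
        or warn "third-party program failed: $?";

    # Release the lock so the next waiting instance can proceed.
    flock $lock, LOCK_UN;
    close $lock;

The nice part is that the operating system does the queueing for you:
every instance that reaches the flock call simply sleeps until the
current holder releases the lock.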
/Stefan

> I'm posed with a problem and am looking for suggestions for a possible
> resolution. I have a script that has many steps in it, including telnet
> and ftp sessions, database unloads, and other routines. This script will
> run on a server, accessing a remote server. This works fine. I will
> likely have several dozen (maybe as many as 100) iterations of this
> script running simultaneously. The problem is that there is a
> "bottleneck" towards the end of my script -- I have to call a 3rd party
> process that is single-threaded. This means that if I have ~100 versions
> of my script running, I can only have one at a time execute the 3rd
> party software. It is very likely that multiple versions will arrive at
> this bottleneck junction at the same time. If more than one calls the
> third party program, one will run, one will lose, and die.
>
> So I am looking for suggestions on how I might attack this problem. I've
> thought about building some sort of external queue (like a simple hash
> file). The servers have names like server_01, server_02, etc. When an
> iteration of the script completes, it writes out its server name to the
> file, pauses, then checks if any other iteration is running the third
> party software. If one is running, it waits, with its server name at the
> top of the file queue. A problem, again, might be that two or more
> versions want to update this queue file at the same time, so I thought
> maybe a random-wait period before writing to the file-queue.
>
> I'm open to other ideas. (Please don't suggest we rename or copy the
> third party software; it just isn't possible.) I'm not looking for code,
> per se, but ideas I can implement that will guarantee I will always have
> only one copy of the external third party software running (including
> pre-checks, queues, etc.)
>
> Thanks,
>
> Jeff
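One more thought on the queue-file idea above: the random-wait period
should not be necessary, because a blocking flock already serializes
the writers for you. If you want the "check first, then wait" behaviour
you describe, a non-blocking attempt would look roughly like this
(again untested, same placeholder path):

    use Fcntl qw(:flock);

    open my $lock, '>', '/tmp/thirdparty.lock'    # placeholder path
        or die "Cannot open lockfile: $!";

    # Try the lock without blocking; if another instance holds it,
    # report, sleep, and try again instead of dying.
    until (flock $lock, LOCK_EX | LOCK_NB) {
        print "Another instance is running the third-party step...\n";
        sleep 5;
    }

    # ... run the third-party program and release the lock as before ...

One caveat: flock makes no promise about which waiter gets the lock
next, so if strict first-come-first-served order between the servers
matters, your queue file may still be worth keeping -- but protect it
with flock as well, rather than with random waits.

/Stefan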