Bob,

Many thanks - I'll look into it.

Mark

> -----Original Message-----
> From: Bob Showalter [mailto:[EMAIL PROTECTED]]
> Sent: Tuesday, June 04, 2002 9:22 AM
> To: 'HENRY,MARK (HP-Roseville,ex1)'; '[EMAIL PROTECTED]'
> Subject: RE: avoiding 2nd process
> 
> 
> > -----Original Message-----
> > From: HENRY,MARK (HP-Roseville,ex1) [mailto:[EMAIL PROTECTED]]
> > Sent: Tuesday, June 04, 2002 12:15 PM
> > To: '[EMAIL PROTECTED]'
> > Cc: HENRY,MARK (HP-Roseville,ex1)
> > Subject: avoiding 2nd process
> > 
> > 
> > All,
> > 
> > Wondering what the best approach would be to the following.
> > 
> > I have a script that copies files and sends mail notifications as
> > appropriate.
> > 
> > I want the script to check for new source files every 15
> > minutes. However, when new files are present, the copy takes
> > long enough that a second instance of the script starts while
> > the first is still executing - obviously I don't want two
> > copies running simultaneously.
> > My first thought was to create a dummy file on startup so that
> > a second instance wouldn't begin if the file exists, then delete
> > the file when the operation is complete.
> > 
> > Is there a better way of doing this?
> 
> Use a lock instead of just the presence of a file. If your script dies
> without removing the file, all subsequent runs are blocked until you
> manually remove the file. If you use a lock, the kernel will
> automatically release it when your process ends, even if it terminates
> abnormally.
> 
> See: <http://www.stonehenge.com/merlyn/WebTechniques/col54.html>
> 
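For reference, a minimal sketch of the flock-based locking Bob describes, assuming the script is Perl (the lock file path and the copy/notify step are placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Fcntl qw(:flock);

    # Illustrative lock file path - any writable location will do.
    my $lockfile = '/tmp/copy_job.lock';

    open my $lock, '>', $lockfile or die "Cannot open $lockfile: $!";

    # Request an exclusive, non-blocking lock. If another instance
    # already holds it, exit quietly instead of starting a second copy.
    exit 0 unless flock $lock, LOCK_EX | LOCK_NB;

    # ... copy the new files and send the mail notification here ...

    # No cleanup needed: the kernel releases the lock when the process
    # exits, even if it dies partway through the copy.

Run it from cron every 15 minutes; if a previous run is still copying, the new one simply exits.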
