Hi Rob,

My requirement is this:
There is a parent directory where the input files are stored, and new files keep arriving there. There is a child directory where each of these input files is just touched (a 0-byte marker file). I have to read the file names from the child directory and then go to the parent directory to process the corresponding files. Only those files in the parent directory that have been touched in the child directory should be processed. Because the input files keep arriving from some source, my Perl script has to run continuously and pick up each file as it comes. Suggestions on how to do this will be highly appreciated.

Thanks,
Mihir

On 8/1/07, Rob Dixon <[EMAIL PROTECTED]> wrote:
>
> Mihir Kamdar wrote:
> >
> > I have a requirement to read each of the files from a directory and pass
> > each one of them as a parameter to my perl script, one by one. The
> > requirement is such that my perl script should keep running in the
> > background; it should take each of the files as they enter the
> > target directory, pass that as a parameter to my perl script, process it,
> > and write the output to a different directory.
> >
> > Ex:-
> > $ cd success
> > success> $ ls
> > success> file1 file2 file3 file4 file5
> >
> > $ cd
> > home> perl test.pl /home/success/file1
> > home> perl test.pl /home/success/file2
> > ...
> > ...
> >
> > Please suggest how to go about this. Are there any Perl modules to scan
> > through a directory and pick each of the files to process?
> >
> > Once a file is taken from the input directory, processed, and the output
> > written to a different output directory, I can delete that file from the
> > input directory.
>
> A Perl program can easily find what files are in a directory and process
> and delete each of them. I think that is the way to go rather than having
> an external program calling a Perl script to process the files one by one.
> Is there any reason why you can't do this?
>
> Rob
>
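To make the question concrete, here is a minimal sketch of the polling approach described above, done entirely inside one Perl script as Rob suggests. The paths `$parent_dir` and `$child_dir` are assumptions (substitute your real directories), and the commented-out `process($file)` stands in for your existing processing code. The script shows a single scan pass; wrapping that pass in `while (1) { ... sleep 5; }` is what keeps it running continuously.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Spec;

# Hypothetical layout -- adjust to your setup: real input files land in
# $parent_dir; zero-byte marker files are touched in $child_dir.
my $parent_dir = '/home/success';
my $child_dir  = '/home/success/touched';

# Return the parent-directory paths of every file whose name appears
# as a marker in the child directory.
sub files_to_process {
    my ($parent, $child) = @_;
    opendir my $dh, $child or die "Cannot open $child: $!";
    my @names = grep { $_ ne '.' && $_ ne '..' } readdir $dh;
    closedir $dh;
    return map { File::Spec->catfile($parent, $_) } sort @names;
}

# One scan pass. In production, wrap this block in:
#   while (1) { ... ; sleep 5; }
# so the script keeps watching the directory.
if (-d $child_dir) {
    for my $file ( files_to_process($parent_dir, $child_dir) ) {
        next unless -f $file;    # marker may arrive before the data file
        # process($file);        # <-- your existing processing code here
        my $name = (File::Spec->splitpath($file))[2];
        unlink File::Spec->catfile($child_dir, $name);  # remove marker
        unlink $file;            # done: remove the processed input file
    }
}
```

The `next unless -f $file` guard matters: the marker and the real file come from an external source, so a marker can briefly exist before its data file does; skipping it leaves the marker in place for the next pass.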