This seems to me like one of those "more than one way to do it" things - you 
could create a daemon or a cron job to handle this for you, depending on your 
needs (speed, efficiency, and acceptable delay are the important factors that 
come to mind off the top of my head).

Anyway, whichever you end up choosing, you can use Perl's readdir() function 
(assuming you're running the program locally on the FTP server) to read in 
all the filenames (fyi: it also returns '.' and '..', at least on UNIX) and 
check each one with stat(). If any files are more recent than the last time 
the directory was checked (with a cron job, that's the recurrence interval; 
with a daemon, the sleep value), then those are the ones you want to download 
- you could feed them into a system call to ncftpput or ncftpget, and the 
Net::FTP module may also be of use.
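
For instance, here's a rough sketch of the local check - the directory path 
and the interval are just placeholders, so adjust them for your setup (match 
$interval to your cron recurrence or your daemon's sleep value):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Placeholders: which directory to watch, and how often we check.
    my $dir      = '/home/ftp/incoming';
    my $interval = 300;
    my $cutoff   = time() - $interval;

    opendir(my $dh, $dir) or die "Can't open $dir: $!";
    while (defined(my $name = readdir($dh))) {
        next if $name eq '.' || $name eq '..';   # readdir returns these too
        my $path = "$dir/$name";
        next unless -f $path;                    # plain files only
        my $mtime = (stat($path))[9];            # element 9 is the mtime
        print "$path\n" if $mtime > $cutoff;     # candidate for transfer
    }
    closedir($dh);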
If you're running the program remotely, my hunch is that Net::FTP can handle 
most of this for you (listing the files in a directory, asking the server for 
per-file timestamps, and so on).
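
Something like this, roughly - the host, login, and path are made up, and 
note that mdtm() asks the server for a file's modification time, which not 
every FTP server supports:

    use strict;
    use warnings;
    use Net::FTP;

    # Placeholders: substitute your own host, credentials, and directory.
    my $ftp = Net::FTP->new('ftp.example.com') or die "Can't connect: $@";
    $ftp->login('user', 'password') or die 'Login failed: ' . $ftp->message;
    $ftp->cwd('/incoming')          or die 'cwd failed: '   . $ftp->message;
    $ftp->binary;

    my $cutoff = time() - 300;            # same idea as the local version
    for my $name ($ftp->ls) {
        my $mtime = $ftp->mdtm($name);    # undef if the server won't say
        next unless defined $mtime and $mtime > $cutoff;
        $ftp->get($name) or warn "get $name failed: " . $ftp->message . "\n";
    }
    $ftp->quit;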

Hope that helps ... just some of my random thoughts on your problem. ;)

Jason

If memory serves me right, on Monday 11 March 2002 12:03, Craig Sharp wrote:
> Ok, here is a good one.
>
> I need to be able to monitor a directory on an ftp server and then download
> the files when they have been uploaded to the server.
>
> The problem is how to determine when a file is there and if a file is
> there, that the file is done being uploaded from the source.
>
> I was going to look at a bot but I am sure that there is a way to
> automatically do this with Perl.  I just don't know where to start.
>
> Thanks,
>
> Craig