Alvarez, Angelo CIV NAVPACMETOCCEN JTWC wrote:
Classification: UNCLASSIFIED
Aloha,
I am looking for a script (perl, etc.) or application which can retrieve files
using https, monitor a server for new files, and execute commands for each
file which was retrieved. We currently receive at least 1.3GB of data each
day, so it should be robust enough to handle that load.  Thanks.

Angelo,

Wget supports grabbing files over https. As for monitoring a server for new files, the -N (--timestamping) option makes wget compare the server's timestamps against your local copies and download only the files that are new or have changed.
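For example, assuming the server publishes a directory listing (the URL and path here are just placeholders):

    wget -r -np -nd -N -P /data/staging https://example.server/incoming/

-r/-np/-nd walk the listing without climbing above it or recreating the directory tree locally, and -N skips anything that hasn't changed since the last run.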

To execute commands on the downloaded files, just wrap wget in a script: download the files into a "staging" area, then do whatever you need to each file that landed there.
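Something along these lines would do it. This is only a rough sketch: the URL, the paths, and the process_file command are placeholders (not anything from our setup), and the -cnewer test assumes GNU find.

    #!/bin/sh
    # Rough sketch only -- URL, paths, and process_file are placeholders.
    STAGING=/data/staging
    MARKER=/data/.lastrun
    mkdir -p "$STAGING"
    [ -f "$MARKER" ] || touch "$MARKER"

    # Grab anything new or updated; -N (timestamping) skips files that
    # haven't changed on the server, -R drops the index pages.
    wget -r -np -nd -N -R 'index.html*' -P "$STAGING" \
        https://example.server/incoming/

    # -N keeps the server's modification times on the files, so use
    # ctime (set when the file lands on disk; -cnewer is GNU find)
    # to spot what this run actually downloaded.
    find "$STAGING" -type f -cnewer "$MARKER" | while read -r f; do
        process_file "$f"   # placeholder for whatever you run per file
    done

    touch "$MARKER"

Run it out of cron at whatever interval suits you; 1.3GB a day is no problem for wget itself, since the heavy lifting is just the transfers.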

And it works on NMCI or legacy. No installation required... so you or your clients won't get into trouble by running it on NMCI seats.

If you need an example, I can show you the script I use to populate the SIPR webpage here... it is rudimentary, but will give you a basic idea...


Best regards,
Bryan Brake
Webmaster / Unix Admin
NAVPACMETOC Center San Diego
