It's not ugly to implement - I've done it, and it's one line of code. You just
need to pass in the name of a script to call on the command line, plus
perhaps another flag to say whether to run the script in the background or not.
How you'd implement that on Windows I don't know.
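On Unix it comes down to a fork/exec, something like this rough sketch (not
the actual patch - run_post_processing and the hard-coded ./myprocessor.pl and
index.html are just placeholders for illustration):

#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

/* Run 'script' with the just-downloaded file as its only argument.
   If 'background' is non-zero, don't wait for it to finish. */
static void run_post_processing(const char *script, const char *file,
                                int background)
{
    pid_t pid = fork();
    if (pid == 0) {
        /* child: exec the script with the downloaded file as argument */
        execlp(script, script, file, (char *) NULL);
        _exit(127);                 /* only reached if the exec failed */
    }
    if (pid > 0 && !background)
        waitpid(pid, NULL, 0);      /* foreground: block until it's done */
}

int main(void)
{
    /* pretend index.html has just been downloaded successfully */
    run_post_processing("./myprocessor.pl", "index.html", 0);
    return 0;
}

Whether something that simple holds up on every platform wget supports is
another question, of course.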
K.
Jonathan wrote:
Now I see what you are trying to do... we do something similar in that
we have an external process which wakes up every n minutes, processes
whatever files wget has downloaded, then goes back to sleep. This is
not the same as having wget invoke a separate script/routine after each
file download (or download "attempt" if the download failed).
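The external process mentioned above is really just a loop along these lines
(a simplified sketch - the "downloads" directory, the five-minute interval and
the process_file stub are placeholders; the real thing also moves files aside
once handled so they aren't processed twice):

#include <dirent.h>
#include <stdio.h>
#include <unistd.h>

static void process_file(const char *name)
{
    printf("processing %s\n", name);    /* real work goes here */
}

int main(void)
{
    for (;;) {
        DIR *d = opendir("downloads"); /* directory wget writes into */
        if (d) {
            struct dirent *e;
            while ((e = readdir(d)) != NULL)
                if (e->d_name[0] != '.')
                    process_file(e->d_name);
            closedir(d);
        }
        sleep(5 * 60);                 /* "every n minutes" - here n = 5 */
    }
    return 0;
}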
Having wget invoke a separate script/routine after each file download
would be fairly ugly to implement on a wide variety of o/s's - do you
want wget to spawn a new process after each download so that wget can carry
on, or would you want to halt wget processing while the post-processing
routine runs? Both options have drawbacks, and could be very ugly to
implement/support across o/s's.
I can't see the wget developers getting really excited about this
feature, but you never know...
J.
It's better to have that functionality inside wget because wget may
not successfully download the file. Also, wget may be downloading
multiple files (usually the case), so wrapping wget up in a script
will not let you process each file individually as it is downloaded.
K.
Jonathan wrote:
Any thoughts on this new feature for wget:
when a file has been downloaded successfully, wget calls a script
with that file as an argument.
e.g. wget -r --post-processing='myprocessor.pl' URL
would call myprocessor.pl with each file which has been downloaded.
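The script itself can be anything executable that takes the downloaded file's
path as its first argument - myprocessor.pl is just a name I picked. The bare
minimum, written in C for illustration, would be no more than:

#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s downloaded-file\n", argv[0]);
        return 1;
    }
    /* whatever post-processing you want, with argv[1] as the file path */
    printf("post-processing %s\n", argv[1]);
    return 0;
}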
I've already hacked the source to get wget to do this for me, but I
think many people could benefit from this.
It's much easier to throw wget into a script or other program and
then have that script/program do the post-processing (why mess around
with wget itself?).
Jonathan