Hi all,

I have a client who wants to download and cache all the files on
certain websites for archival purposes (e.g. before purchases, or
anywhere a record of exactly what was on a given page is needed).
They would like this to run on the web server, so employees can log
in and submit sites for caching, and a cron job can then fetch the
pages at night.

Is there an open-source program that will take a URL and crawl/cache
all the links on it?

If so, I can easily build the interface, job queuing, and cron job setup.
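
For context, here is a rough sketch of what I have in mind for the
cron side, using wget to do the actual mirroring (the queue file and
archive paths below are just placeholders, not a real setup):

<?php
// Nightly cron script (sketch): read queued URLs from a text file
// and mirror each site with wget. Paths are hypothetical.
$urls = file('/var/spool/sitecache/queue.txt',
             FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($urls as $url) {
    // --mirror recurses and timestamps, --page-requisites grabs
    // images/CSS/JS, --convert-links makes the copy browsable
    // offline, --no-parent keeps the crawl inside the given path.
    $cmd = 'wget --mirror --page-requisites --convert-links --no-parent'
         . ' -P /var/archive/sitecache ' . escapeshellarg(trim($url));
    exec($cmd, $output, $status);
    if ($status !== 0) {
        error_log("wget failed for $url (exit code $status)");
    }
}
?>

The crontab entry would then just be something along the lines of:

0 2 * * * php /usr/local/bin/cache_sites.php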

Thanks,

-- 
Leonard Burton, N9URK
http://www.jiffyslides.com
[EMAIL PROTECTED]
[EMAIL PROTECTED]

"The prolonged evacuation would have dramatically affected the
survivability of the occupants."
