On Tue, 22 Aug 2000 at 22:28, Ina Patricia Lopez wrote:
> I am trying to get a document (a manual) from a website, but it has too
> many links. There is no PDF or single big-file version that I can
> download and then print, so what I do now is click each link and save
> the page. Is there a tool/utility that could download the whole
> contents of a site, or of a particular directory?
You may want to check out wget. It's a standard utility, so a reasonably
recent copy should already ship with your distro. :-) It can recursively
fetch a whole site or just one directory tree; see the sketch below.
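As a rough example (the URL is just a placeholder, and exact option names
can vary a little between wget versions, so check `wget --help` or the man
page), something like this should grab only the manual's directory:

    wget --recursive --level=5 --no-parent --convert-links \
         http://www.example.com/manual/

Here --recursive follows the links, --level limits how deep it goes,
--no-parent stops it from wandering up out of the manual's directory, and
--convert-links rewrites the links in the saved pages so they work when you
browse them offline.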
--> Jijo :-)
--
Federico Sevilla III
Network Administrator
THE LEATHER COLLECTION, INC.
#15 Don Mariano Lim Industrial Complex, Alabang-Zapote Road
beside Toyota - Alabang, Las Pinas City 1740 PHILIPPINES
Ofc: +63.2.842.2261
Fax: +63.2.842.2204
Apt: +63.2.523.8251 to 64 (loc 601)
Cel: +63.919.550.4216
-
Philippine Linux Users Group. Web site and archives at http://plug.linux.org.ph
To leave: send "unsubscribe" in the body to [EMAIL PROTECTED]