On Wed, Aug 23, 2000 at 02:12:45PM +0800, Federico Sevilla III wrote:
#_ On Tue, 22 Aug 2000 at 22:28, Ina Patricia Lopez wrote:
#_ >I am trying to get a document (manual) from a website, but it has too
#_ >many links. There is no PDF or single-file version that I can download
#_ >and then print, so what I do is click each link and then save. Is there
#_ >a tool/utility that could download the whole contents of a site, or of
#_ >a particular directory?
#_ 
#_ You may want to check out wget. It's a standard utility, so your
#_ distribution should include a fairly recent copy. :-)
 
To read the wget documentation, run this command:

  info -f wget
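For the original question (grabbing a whole manual under one directory), wget's recursive mode is the usual approach. A sketch, with a placeholder URL you would replace with the real site:

```shell
# Recursively download everything below one directory of a site.
# http://www.example.com/manual/ is a placeholder -- substitute the real URL.
#   -r     recurse into links
#   -np    --no-parent: never ascend above the starting directory
#   -k     --convert-links: rewrite links so pages work when browsed locally
#   -p     --page-requisites: also fetch images/CSS the pages need
#   -l 5   limit recursion depth to 5 levels
wget -r -np -k -p -l 5 http://www.example.com/manual/
```

The `-np` flag is the important one here: without it a recursive fetch can wander up and out of the manual's directory and start mirroring the rest of the site.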


-- 
Juan Miguel Cacho       [EMAIL PROTECTED]
Philippines             [EMAIL PROTECTED] 
...the poor count their blessings, the affluent count their calories.
-
Philippine Linux Users Group. Web site and archives at http://plug.linux.org.ph
To leave: send "unsubscribe" in the body to [EMAIL PROTECTED]
