hi ina,

you could try using "wget" to mirror the whole site, or just the directory the manual lives in.
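something like this should work (http://example.com/manual/ is just a
placeholder -- put the manual's real URL there):

    wget -r -np -k -p http://example.com/manual/

    -r   recurse into the links on each page
    -np  "no parent": don't climb above the manual's directory
    -k   convert links so the saved pages work when browsed locally
    -p   also grab the images/CSS each page needs

if it starts pulling down too much, you can cap the recursion depth
with something like -l 5.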


---lem


Ina Patricia Lopez wrote:
> 
> hi!
>    I am trying to get a document (a manual) from a website, but it
> has too many links. There is no PDF or single big file version that
> I can download and then print, so what I do now is click each link
> and save the page. Is there a tool/utility that could download the
> whole contents of a site, or of a particular directory?
> 
> thanks.
> ina patricia
> 

---
   Why reinvent the wheel???
---
  .--.  Lemuel C. Tomas                       office://+63.2.894.3592/
 ( () ) Q Linux Solutions, Inc.               http://www.q-linux.com/    
  `--\\ A Philippine Open Source Solutions Company
