wget

is what you are looking for.
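A minimal sketch of the wget invocations in question (www.excite.com is just the URL from the original post; any URL works):

```shell
# Dump a page's raw HTML to stdout:
#   -q  suppresses wget's progress output
#   -O -  writes the document to standard output instead of a file
wget -q -O - http://www.excite.com

# Or save the HTML to a file instead:
wget -q -O excite.html http://www.excite.com
```

For what it's worth, lynx can also emit the raw HTML rather than a rendered dump: `lynx -source http://www.excite.com`.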

Niclas

Antony Stace wrote:

> Hi Folks
>
> I am after a program which is really small and grabs a web page, ie
>
> $grab www.excite.com
>
> and returns the HTML to stdout or to a file.  I do not want to use perl
> since loading perl is a large overhead for what I am doing.  I have
> tried to use lynx, ie lynx -dump, but this doesn't return HTML, but rather
> a plain text version of a web page with the HTML stripped out.  Is there
> any other way to do this in lynx?
> Any help appreciated.
>
> Cheers
>
> Tony
>
> -
> To unsubscribe from this list: send the line "unsubscribe linux-newbie" in
> the body of a message to [EMAIL PROTECTED]
> Please read the FAQ at http://www.linux-learn.org/faqs

