On Wed Jul 05 2000 at 13:20, Niclas Hedhman wrote:
> wget
>
> is what you are looking for.
Yes, indeed.
However, beware that wget has its quirks; a careful look at (and some
experimentation with) its command line parameters is needed before you
find a combination that does exactly what you want. But you _will_ find
that it can do it.
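For example, something along these lines (from memory and untested, so
check the man page; the URL is just the one from your example) will
dump the html to stdout or write it to a file:

  $ wget -q -O - http://www.excite.com/
  $ wget -q -O excite.html http://www.excite.com/

(-O - sends the html to stdout, -O <file> saves it to a file, and -q
keeps wget quiet about its progress.)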
> Niclas
>
> Antony Stace wrote:
>
> > Hi Folks
> >
> > I am after a program which is really small and grabs a web page, ie
> >
> > $grab www.excite.com
> >
> > and returns the html to STDIO or to a file. I do not want to use perl
[...]
Some of the mirror programs (like "mirror") are perl-based.
> > I have
> > tried to use lynx, ie lynx -dump, but this doesn't return html, but rather
> > a plain text version of a web page with the html stripped out. Is there
> > any other way to do this in lynx??
lynx *can* do this... use the -source switch (and others, like
-mime_header). Also look at its "traversal" switches. Overall,
lynx is a very cool text-mode html browser - it has a hell of a lot
of features.
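For example (again from memory, so double-check against lynx -help):

  $ lynx -source http://www.excite.com/ > excite.html

should give you the raw html, and adding -mime_header will include the
http headers in the output as well.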
In fact, there are several other utilities out there that can do
html and ftp mirroring (and do neat things like altering the URLs to
be relative rather than absolute).
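(wget will do that last trick too, if I remember rightly - something
like

  $ wget -m -k http://www.excite.com/

where -m turns on mirroring and -k converts the links in the fetched
pages to relative/local ones. But again, check the man page.)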
Cheers
Tony
-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-
Tony Nugent <[EMAIL PROTECTED]> Systems Administrator, RHCE
GrowZone OnLine (a project of) GrowZone Development Network
POBox 475 Toowoomba Queensland Australia 4350 Ph: 07 4637 8322
-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-=*#*=-