On Mon, Apr 03, 2006 at 05:37:34PM -0400, Christopher Conroy wrote:
> I haven't used it myself but it's my understanding that curl offers some
> extra functionality that wget does not. Of course it all depends on what you
> want to do, but if you want to say archive an entire site, then Curl might
> be up your alley.

If I remember correctly, curl supports https, which I believe wget
does not.  However, for what I'm doing, wget seems to be the right thing.
FWIW, I'm trying to get a list of all webservers, everywhere[1].  I don't
care about the content, but obviously I would have to parse the content
to get to more webservers.
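
The parsing step can be sketched with nothing but the Python standard library; the sample HTML, class name, and hostnames below are illustrative, not from any real crawler:

```python
# Sketch: pull the hostnames of other webservers out of a fetched page.
# Uses only the stdlib; the input HTML here is a made-up example.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkExtractor(HTMLParser):
    """Collect hostnames found in <a href="..."> attributes."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                host = urlparse(value).hostname
                if host:
                    self.hosts.add(host)

page = '<a href="http://example.com/x">a</a> <a href="https://example.org/">b</a>'
parser = LinkExtractor()
parser.feed(page)
print(sorted(parser.hosts))  # hostnames this page links to
```

Feeding each fetched page through something like this and pushing the new hosts back onto the fetch queue is the whole loop in miniature.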

Thanks again,

- Rob

[1] think "Leon, the Professional" - Best.Line.Ever.[2]
http://www.imdb.com/title/tt0110413/

Gary Oldman, calmly, violently pissed: "Bring me... everyone"
Stooge: "Everyone?  What... what do you mean everyone?"
Gary Oldman, screaming: "EEEEEEEEEVERRYOOOOOOOOONE!"
[scene change to sirens screaming, police cars rolling, and helicopters
launching]


[2] Also, best party theme ever, but I guess you had to be (one of
        the 250+ people) there  :-)
