On Sat, 26 May 2001, David wrote:

> Does anyone know of a piece of software that crawls a web site and checks
> for bad links?

I use lynx -traverse -crawl http://www.victim.com

It creates a whole bunch of files in the current directory; look in the .dat
files for failed URLs and the like. I am still looking for an Astra Site
Manager equivalent for Unices.
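If you want to pull the failures out of those files without eyeballing them, a
small script will do. A minimal sketch follows; the file name traverse.errors
and its line layout (URL first, lynx's commentary after) are assumptions that
vary between lynx versions, so adjust to whatever your crawl actually wrote:

```python
#!/usr/bin/env python3
"""Sketch: list failed URLs from lynx's -traverse -crawl output.

Assumes lynx was run first, e.g.:
    lynx -traverse -crawl http://www.victim.com
and left its bookkeeping files (traverse.dat, traverse.errors, ...)
in the current directory.  File names and formats differ between
lynx versions -- treat them as assumptions, not gospel.
"""

from pathlib import Path


def failed_urls(errors_file="traverse.errors"):
    """Return URLs recorded in a lynx traverse error file, if any."""
    path = Path(errors_file)
    if not path.exists():
        return []
    urls = []
    for line in path.read_text().splitlines():
        line = line.strip()
        if line:
            # Assume the first whitespace-separated field is the URL;
            # anything after it is lynx's note about where it was found.
            urls.append(line.split()[0])
    return urls


if __name__ == "__main__":
    for url in failed_urls():
        print(url)
```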

Cheers,
Mikal

-- 
Michael Still ([EMAIL PROTECTED])
  http://www.stillhq.com -- a whole bunch of Open Source stuff including PDF 
software...

"Grrrrrrr! I'm a volleyballing machine!"


-- 
SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
More Info: http://lists.slug.org.au/listinfo/slug
