<quote who="Michael Still">

> I use lynx --traverse --crawl http://www.victim.com
> 
> It creates a whole bunch of files, look in the .dat files for failed URLs
> etc.
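
Handy trick, that. If memory serves, the traverse run drops its records
in the current directory as traverse.dat, traverse2.dat, reject.dat and
traverse.errors (plus the numbered lnk*.dat page dumps from -crawl);
those filenames are from memory, so check what actually turns up.
Something along these lines pulls out the failures:

lazarus: ~
$ # crawl the site; lynx writes its traversal records to the current dir
$ lynx -traverse -crawl http://www.example.com/
$ # URLs that couldn't be fetched should be listed here (name is a guess)
$ cat traverse.errors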

lazarus: ~
$ apt-cache search linkcheck
linkchecker - check HTML documents for broken links
linkchecker-ssl - HTTPS module for linkchecker

lazarus: ~
$ apt-cache show linkchecker
Package: linkchecker
Priority: optional
Section: web
Installed-Size: 624
Maintainer: Bastian Kleineidam <[EMAIL PROTECTED]>
Architecture: all
Version: 1.3.1
Depends: python2-base
Suggests: linkchecker-ssl (>= 1.3.1)
Filename: pool/main/l/linkchecker/linkchecker_1.3.1_all.deb
Size: 89114
MD5sum: 37f8b3b5277b6bd693a1eaee2b452971
Description: check HTML documents for broken links
 Features:
  o recursive checking
  o multithreaded
  o output can be colored or normal text, HTML, SQL, CSV or a sitemap
    graph in GML or XML
  o HTTP/1.1, FTP, mailto:, nntp:, news:, Gopher, Telnet and local
    file links are supported
  o restrict link checking with regular expression filters for URLs
  o proxy support
  o give username/password for HTTP and FTP authorization
  o robots.txt exclusion protocol support
  o i18n support
  o command line interface
  o (Fast)CGI web interface (requires HTTP server)
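
The command line interface is about as simple as it gets; assuming the
basic invocation hasn't changed (I haven't checked exactly which options
1.3.1 understands), a quick sweep of a site looks like:

lazarus: ~
$ # point it at the start page; it recurses from there and reports
$ # any broken links it finds
$ linkchecker http://www.example.com/

Recursion depth, output format and the rest are tunable; see the man page.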

There are a number of these packages around; just meatforgefreshdot for them.

- Jeff

-- 
           What do you get when you cross a web server and a hen?           
                                  Apoache.                                  

-- 
SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
More Info: http://lists.slug.org.au/listinfo/slug
