Is there a way to dump/crawl/spider the entire Wiki, so that a "snapshot" could be included in a distribution? (AndyG?)
wget -r http://lifewithbincimap.org/index.php/Main/HelpItDoesntWork
Except that it gets ABSOLUTELY EVERYTHING on lifewithbincimap.org (including all the previous revisions), and it does not update the internal URLs to refer to local files.
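A plain `wget -r` follows every link, including the per-revision and edit/diff action URLs, and leaves the links pointing at the live site. The sketch below shows the standard GNU wget options that address both complaints; the `--reject-regex` pattern is an assumption about this wiki's URL scheme (PmWiki-style `?action=edit` links), and `--reject-regex` itself needs GNU wget 1.14 or later:

```shell
# Mirror one wiki group for offline reading.
#   --convert-links   rewrites internal URLs to point at the local copies
#   --page-requisites also fetches stylesheets/images the pages need
#   --no-parent       stays below the starting URL instead of crawling the whole site
#   --reject-regex    skips action URLs (edit, diff, history) -- assumed URL scheme
wget --recursive --level=5 \
     --convert-links \
     --page-requisites \
     --no-parent \
     --reject-regex '[?&]action=' \
     http://lifewithbincimap.org/index.php/Main/HelpItDoesntWork
```

Bumping `--level` (default 5) widens the crawl if the wiki is deeply linked.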
A database of Web Robots can be found at http://www.robotstxt.org/wc/active/html/index.html. Have fun :-)
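On the server side, a `robots.txt` at the wiki root can keep the well-behaved robots in that database away from the revision and edit URLs in the first place. This is a sketch assuming a PmWiki-style URL scheme on lifewithbincimap.org; note that `*` inside a path is a widely honored extension, not part of the original robots exclusion standard:

```
User-agent: *
Disallow: /index.php?action=
Disallow: /index.php/*?action=
```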
Henry
Andy :-)
--
Andreas Aardal Hanssen   | http://www.andreas.hanssen.name/gpg
Author of Binc IMAP      | "It is better not to do something
http://www.bincimap.org/ | than to do it poorly."
--
Henry Baragar
Principal, Technical Architecture
416-453-5626
Instantiated Software Inc.
http://www.instantiated.ca
