Aren't there ways of downloading whole websites onto your local machine? Something like "follow all links from <url>, but stop when going outside of site <site>". I'm hopelessly ignorant, but this seems like something that must exist.
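
For what it's worth, GNU wget's recursive mode seems to do roughly this; a minimal sketch, assuming the <url> and <site> placeholders above stand for the wiki's start page and its host:

    # follow links recursively from <url>, but never leave <site>;
    # --convert-links rewrites links so the local copy works offline
    wget --recursive --level=inf --no-parent --convert-links \
         --domains=<site> <url>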
S

| -----Original Message-----
| From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
| On Behalf Of Claus Reinke
| Sent: 12 September 2007 14:14
| To: Ian Lynagh
| Cc: [email protected]
| Subject: Re: regarding testsuite
|
| >>> - the usage documentation page is only online. it should be
| >>> copied into the download, for offline users.
| >> But then we have the problem of keeping it in sync.
| >
| > that is exactly the problem your users run into, only aggravated
| > by not having the page in the first place. can't the trac wiki push
| > the plain text version of the wiki page into the repo after each
| > edit?
|
| or, how about a separate repo for all the parts of the developer
| wiki relevant to building, testing, hacking, and a cron job that
| records and pushes from the wiki files nightly?
|
| i still tend to work offline a lot, especially when i want to get
| things done without distractions. and i always run into this
| "everyone is online" issue with ghc: the building guide, the
| test instructions, the commentary, the validation guidelines,
| .. all unavailable if i go offline after 'darcs-all pull'..
|
| those who like or need to be offline when hacking
| could then simply add the wikidocs package to the
| darcs-all targets, and happily work in peace (or in a
| plane, where much of ghc was born?-).
|
| claus

_______________________________________________
Cvs-ghc mailing list
[email protected]
http://www.haskell.org/mailman/listinfo/cvs-ghc
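
The nightly cron job Claus suggests could look roughly like this; a sketch only, assuming a Trac environment at /path/to/trac-env, a darcs repo at /path/to/wikidocs, and hypothetical wiki page names:

    # crontab entry: snapshot the wiki into the wikidocs repo every night at 03:00
    0 3 * * *  /usr/local/bin/push-wikidocs.sh

    # push-wikidocs.sh (sketch)
    cd /path/to/wikidocs
    for page in Building TestingPatches; do                        # hypothetical page names
        trac-admin /path/to/trac-env wiki export "$page" > "$page.txt"
    done
    darcs add *.txt 2>/dev/null || true                            # pick up any new pages
    darcs record --all -m "nightly wiki snapshot"                  # record all changes non-interactively
    darcs push --all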
