One more for the road: I've got about 5GB of domo lists processed, and I would like to generate a master database covering all of the individual list databases.
/opt/www/htdig/db/list1
/opt/www/htdig/db/list2
/opt/www/htdig/db/list3
...

...is an idea of the layout, where each list has its own databases. htmerge seems to hint that it can sort them all into a master that someone could search -all- of the lists from, but the documentation doesn't come right out and say you can define multiple database_dirs to -do- that. Or can it? I've been trying to find an example of this with no luck so far (a rough sketch of what I'm picturing is at the bottom of this message).

-----Original Message-----
From: Geoff Hutchison [mailto:[EMAIL PROTECTED]]
Sent: Saturday, November 03, 2001 4:23 PM
To: Mohler, Jeff
Cc: [EMAIL PROTECTED]
Subject: RE: [htdig] General question on database updates..

At 12:04 PM -0800 11/2/01, Mohler, Jeff wrote:
>I'll work on digesting all this information over the coming days.
>When you say local_urls, are you speaking of feeding rundig/htdig a
>directory path to parse instead of a URL itself?

Not quite. <http://www.htdig.org/attrs.html#local_urls>

Here's an example:

start_url: http://www.htdig.org/
local_urls: http://www.htdig.org/=/var/www/htdig/

Yes, it will read the HTML files over the local filesystem--but you're
supplying a normal URL. You're also supplying a rule that converts some
URLs to local file paths. In general, htdig will try the local filesystem
first and fall back to HTTP if it can't find a particular URL there.

--
-Geoff Hutchison
Williams Students Online
http://wso.williams.edu/
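
Here is the rough sketch referenced above of what I'm picturing, assuming
htmerge's -m merge_configfile option really does fold one set of databases
into another. The config file names, paths, and start_urls below are only
placeholders for illustration, not anything taken from the docs:

# list1.conf -- one config per list; database_dir (and start_url)
# are the only attributes that need to differ between lists
database_dir:  /opt/www/htdig/db/list1
start_url:     http://lists.example.com/list1/

# master.conf -- the combined databases that htsearch would point at
database_dir:  /opt/www/htdig/db/master

# Dig and merge each list on its own, then fold its databases into
# the master. Repeat the last step for list2.conf, list3.conf, etc.
htdig   -c /opt/www/htdig/conf/list1.conf
htmerge -c /opt/www/htdig/conf/list1.conf
htmerge -c /opt/www/htdig/conf/master.conf -m /opt/www/htdig/conf/list1.conf

If that's right, the master keeps a single database_dir, and the per-list
directories are just the inputs that get merged in one at a time -- but I'd
appreciate confirmation that this is how -m is meant to be used.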

