Hello all,
We have a Macintosh that has been running htdig against our main web server for 
a long time (the htdig binary is dated Dec 1999; I'm not quite sure of the 
version number). Recently the few indexes we run have been failing, and the 
error message shown in the browser from which it is run is:

gdbm fatal: file: /usr/local/apache/htdig/db/www/db.docdb.  desc: 5 err: read 
error.

When we check the web server's log, it looks like htdig grabs the robots.txt 
file but never proceeds (and the robots.txt file is not restricting it).

It seems to be complaining about the GNU DBM (gdbm) library, which I believe 
htdig uses to take the info from the crawl and write the results as key/value 
pairs into the database file "db.docdb". I am not quite sure how to approach 
this issue since I am more comfortable with Linux and Windows than with the Mac.
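For anyone unfamiliar with what that db.docdb file holds: as far as I understand it, htdig keeps one record per crawled document, keyed by something like the URL, in a GDBM key/value store. A minimal sketch of that access pattern, using Python's `dbm` module (which uses GDBM when available and a portable fallback otherwise) — the path, key, and value here are purely illustrative, not htdig's actual record format:

```python
import dbm
import os
import tempfile

# Illustrative stand-in for htdig's db.docdb: a key/value database
# mapping a document URL to its stored record.
path = os.path.join(tempfile.mkdtemp(), "example_docdb")

# 'c' = create the database if it does not exist (what the indexer does).
with dbm.open(path, "c") as db:
    db[b"http://example.com/index.html"] = b"document record"

# Reopen read-only, the way a search-time lookup would.
with dbm.open(path, "r") as db:
    print(db[b"http://example.com/index.html"].decode())  # prints "document record"
```

If a read at this layer fails (a truncated or corrupted file, or a file written by an incompatible gdbm version), you get a fatal "read error" much like the one quoted above, even though disk space and permissions look fine.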

Any thoughts would be greatly appreciated.

P.S. There is plenty of disk space on the system to create the files it is 
complaining about, so I do not think that is the issue.


_______________________________________________
htdig-general mailing list <[EMAIL PROTECTED]>
To unsubscribe, send a message to <[EMAIL PROTECTED]> with a 
subject of unsubscribe
FAQ: http://htdig.sourceforge.net/FAQ.html