I think a number of ht://Dig users are running it with large numbers
of documents like yours, so the document count itself shouldn't be the
problem; I don't know what the cause is. Are you now using 3.1.1, or
still using 3.1.0b2? I assume this is on an S.u.S.E. Linux system,
with an Intel processor?
Can you get a stack backtrace from the core dump? It would help to
know where it is crashing. Do you know which document it was processing
when it crashed? Maybe it's a problem with that one. You can also run
htdig -vvv to get more verbose output (see the example commands below).
Note that this will produce LOTS of output, especially with close to a
hundred thousand documents, but the last part of that output, just
before it crashes, may prove useful.
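For reference, here's roughly what I have in mind; the htdig binary
location and core file name below are just guesses, so adjust the
paths for your installation:

    # Get a backtrace from the core dump (most useful if htdig was
    # compiled with debugging symbols, i.e. -g):
    gdb /usr/local/bin/htdig core
    (gdb) bt

    # Re-run the initial indexing with maximum verbosity, capturing
    # the output so you can look at the last lines before the crash:
    htdig -i -vvv > htdig.log 2>&1
    tail -100 htdig.log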
According to Bodo Bauer:
>
> Hi Gilles,
>
> sorry to bother you again, but I still get a seg fault every time I run
> htdig -i. Is there any hard limit on the number of pages you can index?
>
> We have about 88,000 pages on the server; is this a problem?
>
> Ciao,
> BB
>
> --
> Bodo Bauer               S.u.S.E., Inc            fon +1-510-835 7873
> [EMAIL PROTECTED]        458 Santa Clara Avenue   fax +1-510-835 7875
> http://www.suse.com/~bb  Oakland CA, 94610 USA    http://www.suse.com
>
--
Gilles R. Detillieux              E-mail: <[EMAIL PROTECTED]>
Spinal Cord Research Centre       WWW:    http://www.scrc.umanitoba.ca/~grdetil
Dept. Physiology, U. of Manitoba  Phone:  (204)789-3766
Winnipeg, MB  R3E 3J7  (Canada)   Fax:    (204)789-3930
------------------------------------
To unsubscribe from the htdig mailing list, send a message to
[EMAIL PROTECTED] containing the single word "unsubscribe" in
the SUBJECT of the message.