OK, this is the printout of the stack trace while failing to index the
190,000th document:
Indexing C:\sweetpea\wikipedia_xmlfiles\part-18\491886.xml
Indexing C:\sweetpea\wikipedia_xmlfiles\part-18\491887.xml
Indexing C:\sweetpea\wikipedia_xmlfiles\part-18\491891.xml
On Jan 28, 2007, at 9:15 PM, maureen tanuwidjaja wrote:
> OK, this is the printout of the stack trace while failing to index
> the 190,000th document:
> java.io.IOException: There is not enough space on the disk
> Can anyone help?
Ummm get more disk space?!
Erik
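
For anyone hitting the same wall, a minimal sketch of one defensive option,
assuming the Lucene 2.x API of the time: check free space before each file
and stop cleanly instead of dying mid-write. The index path, the 100 MB
threshold, and the field layout are all made up for illustration, and
File.getUsableSpace() needs Java 6.

    import java.io.File;
    import java.io.IOException;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.index.IndexWriter;

    public class IndexWithSpaceCheck {
        public static void main(String[] args) throws IOException {
            String indexPath = "C:\\sweetpea\\index"; // hypothetical index location
            IndexWriter writer = new IndexWriter(indexPath, new StandardAnalyzer(), true);
            try {
                File[] xmlFiles = new File("C:\\sweetpea\\wikipedia_xmlfiles\\part-18").listFiles();
                if (xmlFiles == null) return; // finally still closes the writer
                for (File xml : xmlFiles) {
                    // Stop cleanly before the disk fills up mid-write
                    // (File.getUsableSpace() requires Java 6).
                    if (new File(indexPath).getUsableSpace() < 100L * 1024 * 1024) {
                        System.err.println("Low disk space, stopping before " + xml);
                        break;
                    }
                    System.out.println("Indexing " + xml);
                    Document doc = new Document();
                    doc.add(new Field("path", xml.getPath(),
                            Field.Store.YES, Field.Index.UN_TOKENIZED));
                    writer.addDocument(doc);
                }
            } catch (IOException e) {
                // "There is not enough space on the disk" surfaces here.
                System.err.println("Indexing failed: " + e.getMessage());
            } finally {
                writer.close(); // close() itself flushes and can need space
            }
        }
    }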
I think so ... btw, may I ask your opinion: would it be useful to optimize,
say, every 50,000-60,000 documents? I have a total of 660,000 docs...
Erik Hatcher [EMAIL PROTECTED] wrote:
> On Jan 28, 2007, at 9:15 PM, maureen tanuwidjaja wrote:
>> OK, this is the printout of the stack trace while failing
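
For concreteness, what the question describes would look roughly like the
sketch below, against the Lucene 2.x API; the index path is a placeholder
and docs stands in for however the 660,000 documents are actually produced.

    import java.io.IOException;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.index.IndexWriter;

    public class PeriodicOptimize {
        public static void index(Iterable<Document> docs) throws IOException {
            IndexWriter writer =
                    new IndexWriter("C:\\sweetpea\\index", new StandardAnalyzer(), true);
            int count = 0;
            for (Document doc : docs) {
                writer.addDocument(doc);
                // The proposal from the thread: optimize every 50,000 docs.
                // optimize() rewrites the entire index into one segment, so
                // each call is expensive and needs transient free disk space.
                if (++count % 50000 == 0) {
                    writer.optimize();
                }
            }
            writer.optimize(); // one final pass after all 660,000 docs
            writer.close();
        }
    }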
On Jan 28, 2007, at 11:23 PM, maureen tanuwidjaja wrote:
> I think so ... btw, may I ask your opinion: would it be useful to
> optimize, say, every 50,000-60,000 documents? I have a total of
> 660,000 docs...
Lucene automatically merges segments periodically during large
indexing runs. Look at
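
The message is cut off above. For context, segment merging in Lucene 2.x was
tuned through IndexWriter setters like the ones in this sketch; these are
just the knobs that existed then, not necessarily what the truncated sentence
went on to name.

    import java.io.IOException;
    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;

    public class MergeTuning {
        public static void main(String[] args) throws IOException {
            IndexWriter writer =
                    new IndexWriter("C:\\sweetpea\\index", new StandardAnalyzer(), true);
            // How many documents are buffered in RAM before a new on-disk
            // segment is flushed (default 10 in this era).
            writer.setMaxBufferedDocs(1000);
            // How many same-size segments accumulate before they are merged
            // into one larger segment (default 10).
            writer.setMergeFactor(10);
            // ... addDocument() calls here; merging happens automatically ...
            writer.close();
        }
    }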