Hi all,

I am using Lucene to index a large dataset. As it happens, indexing 10% of this data yields an index of about 400 MB, so in all likelihood the full index may grow to 7 GB.

My deployment will be on a Linux/Tomcat system. Which would be the better solution:

a) create one large index and hope Linux
then doing the same on win2000 PF]

With regards,
Karthik
-----Original Message-----
From: Rupinder Singh Mazara [mailto:[EMAIL PROTECTED]]
Sent: Friday, July 23, 2004 5:55 PM
To: Lucene Users List
Subject: Large index files
Hi all
I am using lucene to index a large dataset, it so
: Rupinder Singh Mazara [mailto:[EMAIL PROTECTED]
Sent: Friday, July 23, 2004 5:55 PM
To: Lucene Users List
Subject: Large index files
Hi all
I am using lucene to index a large dataset, it so happens 10% of this data
yields indexes of
400MB, in all likelihood it is possible the index may go upto 7GB.
My
Subject: Re: Large index files
I'm a little confused by this. I thought Lucene keeps creating new files
as the index gets bigger and any single file doesn't ever get all that
big. Is that not the case?
Thanks,
Joel Shellman
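One way to check this empirically is to look at the files Lucene has actually written for an existing index. A minimal sketch, assuming a POSIX shell with GNU coreutils; `INDEX_DIR` is a placeholder for your own index directory:

```shell
# Placeholder path: point INDEX_DIR at a real Lucene index directory.
INDEX_DIR=${INDEX_DIR:-/path/to/index}

# List index files largest-first, to see whether Lucene has split the
# index across many files or let a single file grow very large.
ls -lS "$INDEX_DIR"

# Total on-disk size of the index.
du -sh "$INDEX_DIR"
```

If one file dominates the listing and is approaching 2 GB, the kernel/filesystem large-file question below becomes relevant.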
John Moylan wrote:
As long as your kernel has Large File Support, then you
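The Large File Support point can be probed directly. A hedged sketch, assuming GNU coreutils and a writable /tmp: write one byte at offset 2^31-1, producing a sparse file of exactly 2 GB, just past the old 2 GB (2^31-1 byte) limit.

```shell
# Seek to offset 2147483647 (2^31-1) and write a single byte, so the
# resulting file is exactly 2147483648 bytes (2 GB). If dd succeeds and
# ls reports that size, this kernel/filesystem pair handles large files.
dd if=/dev/zero of=/tmp/lfs_probe bs=1 count=1 seek=2147483647 2>/dev/null
ls -l /tmp/lfs_probe

# Clean up the probe file.
rm -f /tmp/lfs_probe
```

Because the file is sparse, the probe consumes almost no actual disk space.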