There is a thread currently on the Lucene user list titled 'Opening up one large index takes 940M of memory' that may offer some answers to your problem.
There may also be ways to increase the amount of memory gcj gets to use; I would look in the [EMAIL PROTECTED] mailing list archives for this.
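For reference, here is a minimal sketch of what I mean, assuming you are running the gcj build of PyLucene, which uses the Boehm collector; that collector reads GC_MAXIMUM_HEAP_SIZE (in bytes) from the environment at startup, roughly the equivalent of -Xmx on a regular JVM. The index path and the 2 GB figure below are only placeholders:

    # Sketch only: the limit must be in the environment before libgcj
    # initializes, so the script re-execs itself if it is not set yet.
    import os, sys

    if 'GC_MAXIMUM_HEAP_SIZE' not in os.environ:
        os.environ['GC_MAXIMUM_HEAP_SIZE'] = str(2 * 1024 * 1024 * 1024)
        os.execv(sys.executable, [sys.executable] + sys.argv)

    # Import only after the limit is set; '/path/to/index' is a placeholder.
    from PyLucene import FSDirectory, IndexSearcher

    directory = FSDirectory.getDirectory('/path/to/index', False)
    searcher = IndexSearcher(directory)

Exporting the variable from the shell that starts RemoteSearcherServer.py should have the same effect and avoids the re-exec.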
Andi..
On Tue, 15 Feb 2005, Yura Smolsky wrote:
Hello, Andi.
I have an index of 51 GB. When I try to create an IndexSearcher with an FSDirectory for this index, I receive this exception:
GC Warning: Out of Memory! Returning NIL!
Traceback (most recent call last):
  File "./RemoteSearcherServer.py", line 275, in ?
    server.processCommandLine(sys.argv)
  File "/home/search/lib/myCORBA.py", line 81, in processCommandLine
    self.start()
  File "/home/search/lib/myCORBA.py", line 51, in start
    self.setUp()
  File "./RemoteSearcherServer.py", line 240, in setUp
    self.objServant = RemoteSearcher_i(num, path, luceneSearcher)
  File "./RemoteSearcherServer.py", line 132, in __init__
    luceneSearcher._reloadIndex()
  File "./RemoteSearcherServer.py", line 18, in _reloadIndex
    self.searcher = IndexSearcher(self.directory)
  File "/usr/lib/python2.4/site-packages/PyLucene.py", line 2266, in __init__
    newobj = _PyLucene.new_IndexSearcher(*args)
ValueError: java.lang.OutOfMemoryError
  <<No stacktrace available>>
Why?
Yura Smolsky.
