David Spencer wrote:

Jiří Kuhn wrote:

This doesn't work either!


You're right.
I'm running under JDK1.5 and trying larger values for -Xmx and it still fails.


Running under Borland's OptimizeIt shows that the number of Term and TermInfo instances (both in org.apache.lucene.index) increases every pass through the loop, by several hundred instances each.

Yes... I'm running into a similar situation on JDK 1.4.2 with Lucene 1.3. I used the JMP profiler, and all my memory is taken by Term and TermInfo instances...


I can trace through some Term instances on OptimizeIt's reference graph, but it's unclear to me what's right. One *guess* is that the WeakHashMap in either SegmentReader or FieldCacheImpl is the problem.
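For what it's worth, one way a WeakHashMap can leak is if the cached *value* holds a strong reference back to its key: the key then never becomes weakly reachable, so the entry is never reclaimed. The sketch below is a self-contained, hypothetical illustration of that pitfall (it is not Lucene code, and the class names are made up); a per-reader cache entry that references its reader would behave the same way.

```java
import java.util.Map;
import java.util.WeakHashMap;

// Hypothetical demonstration of the WeakHashMap value-pins-key pitfall.
// A WeakHashMap entry is reclaimed only when its key becomes weakly
// reachable. If the value strongly references the key, the key stays
// strongly reachable through the map itself, so the map grows forever.
public class WeakHashMapPin {
    static final class Key {}

    static final class Value {
        final Key owner;              // strong back-reference to the key
        Value(Key owner) { this.owner = owner; }
    }

    // Insert n entries whose values reference their keys, then report
    // how many entries remain after a GC hint.
    static int pinnedCount(int n) throws InterruptedException {
        Map<Key, Value> cache = new WeakHashMap<>();
        for (int i = 0; i < n; i++) {
            Key k = new Key();
            cache.put(k, new Value(k)); // value pins the key
        }
        System.gc();
        Thread.sleep(100);
        return cache.size();          // stays at n: nothing was reclaimed
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("pinned entries: " + pinnedCount(1000));
    }
}
```

If the value held no reference to the key (or held it through a WeakReference), the entries would become collectable once nothing else referenced the keys.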

Kevin

--

Please reply using PGP.

http://peerfear.org/pubkey.asc
NewsMonster - http://www.newsmonster.org/
Kevin A. Burton, Location - San Francisco, CA, Cell - 415.595.9965
AIM/YIM - sfburtonator, Web - http://peerfear.org/
GPG fingerprint: 5FB2 F3E2 760E 70A8 6174 D393 E84D 8D04 99F1 4412
IRC - freenode.net #infoanarchy | #p2p-hackers | #newsmonster



---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]


