Andy,

(I have just taken over the respective code on my end, so please bear with me.)
Yes, it does appear that we are not using the API calls very well. I think the first issue is that we are creating instances of DatasetGraphTDB (actually an extension thereof), which does not look to be supported through the API. These are both created and released by a CachingTDBMaker. However, at the moment I am not sure that we are properly releasing them at the end of each test. I'll do some more digging and ask if I have further questions.

Thanks.

Cheers,
Willie

-----Original Message-----
From: Andy Seaborne [mailto:[email protected]] On Behalf Of Andy Seaborne
Sent: Friday, July 27, 2012 15:47
To: [email protected]
Subject: Re: TDB exceeding java heap space

Hi there,

> at com.hp.hpl.jena.tdb.sys.CachingTDBMaker.createDatasetGraph(CachingTDBMaker.java:46)
> at org.topbraid.jenax.tdb.TBDatasetGraphMakerTDB.createDatasetGraph(TBDatasetGraphMakerTDB.java:152)

What's the full lifecycle of the test? In your tests, how were you releasing resources after a test had run when running with TDB 0.8.9? (Assuming GC is not enough.)

BTW, these implementation classes are likely to be removed, as all datasets are now created and managed by the transaction subsystem. You seem to be calling directly into implementation classes - it is better to use the API.

As you are asking for a fresh, in-memory TDB store each time, the API calls are:

  TDBFactory.createDatasetGraph
  TDBFactory.release

But remember "release" is forceful - it assumes you have closed all transactions (or finished other actions if not used in a transaction) and will evict the dataset regardless. Named memory locations are now provided.

	Andy

On 27/07/12 20:04, Willie Milnor wrote:
> Hi,
>
> We have a number of automated tests which perform various operations
> on a graph stored in TDB. When we were using TDB 0.8.9, we had
> no memory issue running the tests.
> We are now using 0.9.2 and we are getting OutOfMemory errors while
> running the same tests and allocating the same memory (see the stack
> trace below). Even if we double the heap space size, we still see the
> errors. Is there something different we should be doing with the
> updated version of the TDB library?
>
> java.lang.OutOfMemoryError: Java heap space
>   at java.util.HashMap.<init>(Unknown Source)
>   at java.util.LinkedHashMap.<init>(Unknown Source)
>   at org.openjena.atlas.lib.cache.CacheImpl.<init>(CacheImpl.java:45)
>   at org.openjena.atlas.lib.cache.CacheLRU.<init>(CacheLRU.java:37)
>   at org.openjena.atlas.lib.CacheFactory.createCache(CacheFactory.java:40)
>   at org.openjena.atlas.lib.CacheFactory.createCache(CacheFactory.java:31)
>   at com.hp.hpl.jena.tdb.nodetable.NodeTableCache.<init>(NodeTableCache.java:65)
>   at com.hp.hpl.jena.tdb.nodetable.NodeTableCache.create(NodeTableCache.java:56)
>   at com.hp.hpl.jena.tdb.setup.Builder$NodeTableBuilderStd.buildNodeTable(Builder.java:88)
>   at com.hp.hpl.jena.tdb.setup.DatasetBuilderStd$NodeTableBuilderRecorder.buildNodeTable(DatasetBuilderStd.java:387)
>   at com.hp.hpl.jena.tdb.setup.DatasetBuilderStd.makeNodeTable(DatasetBuilderStd.java:298)
>   at com.hp.hpl.jena.tdb.setup.DatasetBuilderStd._build(DatasetBuilderStd.java:165)
>   at com.hp.hpl.jena.tdb.setup.DatasetBuilderStd.build(DatasetBuilderStd.java:155)
>   at com.hp.hpl.jena.tdb.setup.DatasetBuilderStd.build(DatasetBuilderStd.java:68)
>   at com.hp.hpl.jena.tdb.sys.DatasetGraphSetup.createDatasetGraph(DatasetGraphSetup.java:32)
>   at com.hp.hpl.jena.tdb.sys.CachingTDBMaker.createDatasetGraph(CachingTDBMaker.java:46)
>   at org.topbraid.jenax.tdb.TBDatasetGraphMakerTDB.createDatasetGraph(TBDatasetGraphMakerTDB.java:152)
>   at org.topbraid.jenax.tdb.TBDatasetGraphMakerTDB.createDatasetGraphTDB(TBDatasetGraphMakerTDB.java:31)
>   at org.topbraid.tdb.TestTDBBufferingGraph.getDelegatingGraph(TestTDBBufferingGraph.java:38)
>   at org.topbraid.core.graph.AbstractTestDelegatingGraph.getGraph(AbstractTestDelegatingGraph.java:178)
>   at org.topbraid.core.graph.AbstractTestGraph.testBulkUpdate(AbstractTestGraph.java:281)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>   at java.lang.reflect.Method.invoke(Unknown Source)
>   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
>   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
>   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
>   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
>   at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
>   at org.junit.runners.BlockJUnit4ClassRunner.runNotIgnored(BlockJUnit4ClassRunner.java:79)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:71)
>
> Willie Milnor
> Sr. Semantic Solutions Developer
> TopQuadrant, Inc.
> 330 John Carlyle Street
> Suite 180
> Alexandria, VA 22314
> 703.299.9330
> www.topquadrant.com
>
> Cell: 410.971.7788
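[Editor's note: the per-test lifecycle Andy suggests - create a fresh in-memory dataset through the public API, then forcefully release it - might be sketched as below. This is a sketch only, using the TDBFactory.createDatasetGraph/TDBFactory.release calls named in the thread; the class name and the try/finally structure are illustrative, and the test body is a placeholder.]

```java
// Sketch: per-test TDB lifecycle via the public API (TDB 0.9.x package names),
// instead of going through CachingTDBMaker/DatasetGraphTDB internals.
import com.hp.hpl.jena.sparql.core.DatasetGraph;
import com.hp.hpl.jena.tdb.TDBFactory;

public class TdbTestLifecycle
{
    public static void main(String[] args)
    {
        // A fresh, in-memory TDB dataset for this test run.
        DatasetGraph dsg = TDBFactory.createDatasetGraph();
        try {
            // ... test body operates on dsg ...
        } finally {
            // "release" is forceful: it assumes all transactions (or other
            // actions) on dsg are finished, and evicts the dataset - and its
            // node-table caches - regardless.
            TDBFactory.release(dsg);
        }
    }
}
```

Releasing in a finally block (or a JUnit @After method) ensures each test's dataset and its caches are evicted even when the test fails, which is the behaviour the old CachingTDBMaker path may not be providing.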
