I don’t have the knowledge to unwrap that trace (the real experts here can do that), but I’d like to ask: if you haven’t changed any part of the executing code, did you change the data over which you’re running the queries at the time the problem appeared, and if so, in what way?
---
A. Soroka
The University of Virginia Library
On Jan 19, 2016, at 11:54 AM, Laurent Rucquoy <[email protected]> wrote:
Hello,
We have a production server that has been suffering significant slowness from high CPU usage for about two weeks now.
When we check the thread dumps taken during high CPU usage, we note that calls to Jena are always implicated.
The calling code in our application executes a SPARQL query and iterates over the query solutions.
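For reference, the query-and-iterate pattern described above reads roughly like this in the Jena 2.10.x API. This is a hypothetical sketch, not the actual MeasureRetrieveRdf code; the class name, model contents, and URIs here are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;

import com.hp.hpl.jena.query.Query;
import com.hp.hpl.jena.query.QueryExecution;
import com.hp.hpl.jena.query.QueryExecutionFactory;
import com.hp.hpl.jena.query.QueryFactory;
import com.hp.hpl.jena.query.QuerySolution;
import com.hp.hpl.jena.query.ResultSet;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;

public class QueryLoop {
    /** Runs a SELECT query, collects the ?s bindings, and closes the execution when done. */
    static List<String> subjects(Model model) {
        List<String> out = new ArrayList<String>();
        Query query = QueryFactory.create("SELECT ?s WHERE { ?s ?p ?o }");
        QueryExecution qexec = QueryExecutionFactory.create(query, model);
        try {
            ResultSet results = qexec.execSelect();
            while (results.hasNext()) {           // the hasNext() frames in the dump below
                QuerySolution soln = results.nextSolution();
                out.add(soln.getResource("s").getURI());
            }
        } finally {
            qexec.close(); // releases the underlying iterators (and, on TDB, their blocks)
        }
        return out;
    }

    public static void main(String[] args) {
        // Tiny in-memory model so the sketch is self-contained.
        Model model = ModelFactory.createDefaultModel();
        model.createResource("http://example.org/s")
             .addProperty(model.createProperty("http://example.org/p"), "o");
        System.out.println(subjects(model));
    }
}
```

The try/finally around `QueryExecution.close()` matters: if an iteration is abandoned without closing, the execution's resources stay held until cleanup.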
The Jena version used is 2.10.1 (with jena-tdb 0.10.1).
There have been no recent changes to our application source code that could explain this issue.
Do you have any idea about possible causes?
Thank you in advance for your support.
Sincerely,
Laurent.
Here is a thread dump as an example:
"RMI TCP Connection(17383)-10.249.203.163" daemon prio=6
tid=0x0000000015607800 nid=0x1dd0 waiting for monitor entry
[0x000000001518d000]
java.lang.Thread.State: BLOCKED (on object monitor)
at
com.hp.hpl.jena.tdb.base.block.BlockMgrSync.release(BlockMgrSync.java:76)
- waiting to lock <0x000000072ab58058> (a
com.hp.hpl.jena.tdb.base.block.BlockMgrCache)
at
com.hp.hpl.jena.tdb.transaction.BlockMgrJournal.release(BlockMgrJournal.java:207)
at
com.hp.hpl.jena.tdb.transaction.BlockMgrJournal.release(BlockMgrJournal.java:207)
at
com.hp.hpl.jena.tdb.transaction.BlockMgrJournal.release(BlockMgrJournal.java:207)
at
com.hp.hpl.jena.tdb.transaction.BlockMgrJournal.release(BlockMgrJournal.java:207)
at
com.hp.hpl.jena.tdb.transaction.BlockMgrJournal.release(BlockMgrJournal.java:207)
at
com.hp.hpl.jena.tdb.transaction.BlockMgrJournal.release(BlockMgrJournal.java:207)
at
com.hp.hpl.jena.tdb.base.block.BlockMgrWrapper.release(BlockMgrWrapper.java:77)
at
com.hp.hpl.jena.tdb.base.page.PageBlockMgr.release(PageBlockMgr.java:92)
at
com.hp.hpl.jena.tdb.base.recordbuffer.RecordRangeIterator.close(RecordRangeIterator.java:151)
at
com.hp.hpl.jena.tdb.base.recordbuffer.RecordRangeIterator.hasNext(RecordRangeIterator.java:134)
at org.apache.jena.atlas.iterator.Iter$4.hasNext(Iter.java:293)
at
com.hp.hpl.jena.tdb.sys.DatasetControlMRSW$IteratorCheckNotConcurrent.hasNext(DatasetControlMRSW.java:119)
at org.apache.jena.atlas.iterator.Iter$4.hasNext(Iter.java:293)
at org.apache.jena.atlas.iterator.Iter$3.hasNext(Iter.java:179)
at org.apache.jena.atlas.iterator.Iter.hasNext(Iter.java:906)
at
org.apache.jena.atlas.iterator.RepeatApplyIterator.hasNext(RepeatApplyIterator.java:58)
at
com.hp.hpl.jena.tdb.solver.SolverLib$IterAbortable.hasNext(SolverLib.java:193)
at
org.apache.jena.atlas.iterator.RepeatApplyIterator.hasNext(RepeatApplyIterator.java:46)
at
com.hp.hpl.jena.tdb.solver.SolverLib$IterAbortable.hasNext(SolverLib.java:193)
at org.apache.jena.atlas.iterator.Iter$4.hasNext(Iter.java:293)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIterPlainWrapper.hasNextBinding(QueryIterPlainWrapper.java:54)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:112)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIterConvert.hasNextBinding(QueryIterConvert.java:59)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:112)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorWrapper.hasNextBinding(QueryIteratorWrapper.java:40)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:112)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorWrapper.hasNextBinding(QueryIteratorWrapper.java:40)
at
com.hp.hpl.jena.sparql.engine.iterator.QueryIteratorBase.hasNext(QueryIteratorBase.java:112)
at
com.hp.hpl.jena.sparql.engine.ResultSetStream.hasNext(ResultSetStream.java:75)
at
com.telemis.core.measure.server.rdf.MeasureRetrieveRdf.getMeasuresFromSeriesIds(MeasureRetrieveRdf.java:138)
at
com.telemis.core.measure.server.rdf.MeasureRetrieveRdf.getMeasuresFromSeries(MeasureRetrieveRdf.java:183)
at
telemis.measure.tms.service.MeasureService.getMeasuresFromSeries(MeasureService.java:458)
at
telemis.measure.tms.service.MeasureService.getMeasuresFromExam(MeasureService.java:436)
at
telemis.measure.tms.messagehandlers.GetMeasuresFromExamMessageHandler.perform(GetMeasuresFromExamMessageHandler.java:46)
at
telemis.measure.tms.messagehandlers.GetMeasuresFromExamMessageHandler.perform(GetMeasuresFromExamMessageHandler.java:26)
at
telemis.service.MessageHandlerManager.execute(MessageHandlerManager.java:50)
at telemis.service.MomoRMIImpl.executeInternal(MomoRMIImpl.java:522)
at telemis.service.MomoRMIImpl.execute(MomoRMIImpl.java:367)
at telemis.service.MomoRMIImpl_Skel.dispatch(Unknown Source)
at sun.rmi.server.UnicastServerRef.oldDispatch(Unknown Source)
at sun.rmi.server.UnicastServerRef.dispatch(Unknown Source)
at sun.rmi.transport.Transport$1.run(Unknown Source)
at sun.rmi.transport.Transport$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Unknown Source)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(Unknown Source)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(Unknown
Source)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(Unknown
Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Locked ownable synchronizers:
- <0x0000000737fcdfe8> (a java.util.concurrent.ThreadPoolExecutor$Worker)
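For readers less familiar with thread dumps: the "waiting for monitor entry" / BLOCKED state above means the thread is trying to enter a synchronized block whose monitor another thread currently holds (here, the lock inside BlockMgrSync). A minimal stdlib illustration of that state, unrelated to Jena:

```java
public class BlockedDemo {
    /** Starts two threads contending on one monitor and returns the waiter's observed state. */
    static Thread.State contend() throws InterruptedException {
        final Object lock = new Object();

        // First thread grabs the monitor and holds it while sleeping.
        Thread holder = new Thread(new Runnable() {
            public void run() {
                synchronized (lock) {
                    try { Thread.sleep(10000); } catch (InterruptedException e) { /* released */ }
                }
            }
        });
        holder.start();
        // Wait until the holder is inside sleep(), i.e. actually owns the monitor.
        while (holder.getState() != Thread.State.TIMED_WAITING) Thread.sleep(5);

        // Second thread tries to enter the same monitor and parks on entry.
        Thread waiter = new Thread(new Runnable() {
            public void run() {
                synchronized (lock) { /* reached only after holder releases the monitor */ }
            }
        });
        waiter.start();
        // Spin until the waiter reaches "waiting for monitor entry".
        while (waiter.getState() != Thread.State.BLOCKED) Thread.sleep(5);
        Thread.State observed = waiter.getState(); // BLOCKED, as in the dump

        holder.interrupt(); // let both threads finish
        holder.join();
        waiter.join();
        return observed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(contend()); // prints BLOCKED
    }
}
```

In the dump, many request threads pile up in this state behind the single BlockMgrCache monitor, which shows up as contention rather than useful work.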