On 01/06/15 10:42, Diana Magdi wrote:
Hi,

I have a large OWL ontology file in RDF which includes SWRL rules, and I'm
trying to reason over it in Java with Jena TDB using Pellet. When I run the
following code in Eclipse:

[snip]

I get this exception:


Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

[snip]

What am I doing wrong? Thanks

Pellet is not part of Apache Jena so you will need to ask the Pellet folks for support.

However, as far as I know, Pellet builds all its inference structures in memory, so running over TDB will just result in slower performance; it won't give you increased scalability of reasoning.

The basic problem is that you don't have enough memory for inference over the scale/complexity of your ontology. If you have already given the Java process all the memory you reasonably can [*], then you may not be able to use Pellet in this application without some change to your ontology/rules.

Dave

[*] There is a trade-off here. For TDB you want to leave a decent amount of memory free for the operating system so that it can use it for disk buffers. For large in-memory inference structures you need to allocate that memory to the Java process instead.
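For reference, the JVM's maximum heap is set with the standard -Xmx launcher option. A minimal sketch of the trade-off described above, assuming a machine with 8 GB of RAM; the jar name and main class below are placeholders, not anything from the original post:

```shell
# Give the JVM a larger heap for Pellet's in-memory inference structures,
# while still leaving a few GB unallocated so the OS can use it as
# file-system cache for TDB's on-disk indexes.
# app.jar and com.example.ReasonerMain are hypothetical placeholders.
java -Xmx4g -cp app.jar com.example.ReasonerMain
```

If even a heap sized this way is not enough, that supports Dave's point: the ontology/rules themselves need to be simplified rather than the JVM tuned further.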

