On 09/01/15 22:04, Kamalraj Jairam wrote:
Hello Dave and everyone

I have run into one more issue

I have classes “A” and “B” in my ontology for which I have added equivalent classes
from the schema.org and DBpedia ontologies (this is to provide external context).

Now when I run the reasoner to infer data against my ontology (using OWLMINI or
OWLMICRO), it takes a very long time to produce results.

So I started using Pellet to reason over my ontologies, but Pellet doesn’t reason
unless I put the ontology and the data in the same model.

1) How can I improve the speed of OWLMINI and OWLMICRO when reasoning over DBpedia and
schema.org?

I don't think you can, easily; equivalences are expensive, especially for the rule reasoner. The only options are to cut down the fraction of the ontologies that you include (a possible sketch of that below), or to switch to a reasoner like Pellet.
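(Not part of the original reply, just one way the trimming might look, assuming the standard Jena Model API; the name externalOntology and the class URIs are made up for illustration. The idea is to copy only the statements about the external classes you actually reference into a small schema model instead of loading all of schema.org / DBpedia.)

    // externalOntology = the full schema.org / DBpedia model; the URIs are only examples.
    Model trimmed = ModelFactory.createDefaultModel();
    String[] wanted = { "http://schema.org/Person", "http://dbpedia.org/ontology/Person" };
    for (String uri : wanted) {
        Resource cls = externalOntology.getResource(uri);
        // Keep only the statements whose subject is one of the classes we care about.
        trimmed.add(externalOntology.listStatements(cls, null, (RDFNode) null));
    }

In practice you would probably also pull in the superclass chain of those classes, but the point is the same: the rule reasoners then only have to chew on the handful of axioms you actually use.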

2) Why wouldn’t the following statements work for Pellet?

    Reasoner reasoner = ontModelSpec.getReasoner();
    // Bind the ontology as the schema, then build the inference model over the data.
    Reasoner boundReasoner = reasoner.bindSchema(ontModel);
    InfModel infModel = ModelFactory.createInfModel(boundReasoner, model);

my infidel does not have inferred statements if I use Pellet

[Aside: "infidel" was a great typo :)]

I don't know; you would have to ask the Pellet folks. Perhaps bindSchema isn't fully supported, which would be reasonable since I doubt there's any partial evaluation that Pellet could do at that stage.

Your alternative is to create a union model (e.g. an OntModel over the base model which imports the ontology, or a manually created dynamic union of base and ontology). Then you can call createInfModel over that union and omit the step of generating a boundReasoner; a minimal sketch follows.
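A sketch of the dynamic-union route, reusing the names from your snippet (ontModel = the ontology, model = the data, ontModelSpec = whatever spec you already have) and assuming nothing beyond the stock Jena ModelFactory / Reasoner API (com.hp.hpl.jena.* in Jena 2.x, org.apache.jena.* in Jena 3):

    // Dynamic union of data and ontology; no statements are copied.
    Model union = ModelFactory.createUnion(model, ontModel);

    // Hand the whole union to the reasoner in one go; no bindSchema step.
    Reasoner reasoner = ontModelSpec.getReasoner();
    InfModel infModel = ModelFactory.createInfModel(reasoner, union);

Since createUnion gives a dynamic view, later additions to model remain visible in the union (the InfModel may need a rebind() to pick them up).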

Dave
