On 30/08/12 13:35, Yoga Indra Pamungkas wrote:


The details depend slightly on the sort of inference configuration you
are using. Assuming you are using one of the built-in RDFS/OWL rule
configurations then you presumably have an InfModel (or an OntModel)
which includes the annotation data, ontology and deductions. The brute
force way to store those in a database-backed model is:

   // appropriate locking or transaction creation
   datamodel.add( infModel );
   // appropriate unlocking or transaction commit

However, this will add a copy of the ontology and all of the inferences
to your datamodel.
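As a concrete sketch of that brute-force copy, assuming a TDB-backed dataset (the directory path "DB", the helper name storeClosure, and the current org.apache.jena package names are my assumptions; older Jena releases used com.hp.hpl.jena):

```java
import org.apache.jena.query.Dataset;
import org.apache.jena.query.ReadWrite;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.tdb.TDBFactory;

public class StoreInference {
    // Copies the whole inference closure (instance data + ontology +
    // deductions) into a TDB-backed model inside a write transaction.
    static void storeClosure(Dataset dataset, InfModel infModel) {
        dataset.begin(ReadWrite.WRITE);   // transaction creation
        try {
            Model datamodel = dataset.getDefaultModel();
            datamodel.add(infModel);      // brute-force copy of everything
            dataset.commit();             // transaction commit
        } finally {
            dataset.end();
        }
    }

    public static void main(String[] args) {
        // On-disk TDB store; the "DB" directory is an arbitrary choice.
        Dataset dataset = TDBFactory.createDataset("DB");
        // ... build an InfModel as in your code, then:
        // storeClosure(dataset, infModel);
    }
}
```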


Well, this is my code:

Resource thesis = model.createResource(thesisURI)
        .addProperty(RDF.type, OntologySkripsiKomputasi.aplikasiDesktop);

Model schema = FileManager.get().loadModel(RDF_FILE);
Reasoner reasoner = ReasonerRegistry.getOWLReasoner();
reasoner = reasoner.bindSchema(schema);

InfModel infmodel = ModelFactory.createInfModel(reasoner, model);

Resource yoga = infmodel.getResource(skripsiURI);

System.out.println("thesisURI Inference :");

printStatements(infmodel, yoga, null, null);

From here, I've got the inference. Now how do I store the output of
printStatements, or the inference itself, into the database?

You could do something like:

   // appropriate locking or transaction creation
   datamodel.add( yoga.listProperties() );
   // appropriate unlocking or transaction commit

In the part of my previous message that you didn't quote I showed a more
general version of the above. But if you always have some single root
resource, and all you want is the augmented properties of that root
resource, then the above would suffice.
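A sketch of one such general approach (my reconstruction, not necessarily the exact code from the unquoted message; the helper name dataPlusDeductions is an assumption) is to copy the full closure and then subtract the schema triples, so only the instance data plus its deductions reach the store:

```java
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;

public class CopyDeductions {
    // Returns the inference closure minus the schema triples, i.e. the
    // original instance data plus the statements derived about it.
    static Model dataPlusDeductions(InfModel infModel, Model schema) {
        Model result = ModelFactory.createDefaultModel();
        result.add(infModel);    // full closure: data + schema + deductions
        result.remove(schema);   // strip the ontology triples back out
        return result;
    }
}
```

You would then add the result to the persistent model inside a transaction, e.g. datamodel.add(CopyDeductions.dataPlusDeductions(infmodel, schema)).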

From the point of view of doing the inference there's no advantage to
storing the ontology in the database. You are better off using an
in-memory copy, which can just as easily be initialized from a file.

However, depending on the rest of your application it is often useful to
*also* put the ontology in the data store so that it can be referenced
in queries.
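If you do want the ontology queryable alongside the data, one common pattern (the graph URI and helper name here are my assumptions) is to load it into its own named graph, so SPARQL queries can reference it without it being mixed into the instance data:

```java
import org.apache.jena.query.Dataset;
import org.apache.jena.query.ReadWrite;
import org.apache.jena.rdf.model.Model;

public class StoreSchema {
    // Puts the schema in a named graph so queries can reach it, e.g. via
    // GRAPH <urn:x-example:schema> { ... }, while the default graph
    // keeps only the instance data.
    static void storeSchema(Dataset dataset, Model schema) {
        dataset.begin(ReadWrite.WRITE);
        try {
            dataset.getNamedModel("urn:x-example:schema").add(schema);
            dataset.commit();
        } finally {
            dataset.end();
        }
    }
}
```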

Based on my code above, I'm confused about the difference between
inference with an OntModel and using ModelFactory.createInfModel.

OntModels provide extra convenience API calls for manipulating ontologies. An OntModel is also an InfModel (the OntModel interface extends the InfModel interface).
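For example (a minimal sketch; OWL_MEM_MICRO_RULE_INF is just one of the built-in reasoning specs, and the URIs are placeholders), an OntModel built over your base data both runs a rule reasoner and gives you ontology-level convenience calls:

```java
import org.apache.jena.ontology.Individual;
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.util.iterator.ExtendedIterator;

public class OntModelDemo {
    static void demo(Model schema, Model data) {
        // An OntModel built with a rule-inference spec wraps the base
        // data in an InfModel internally...
        OntModel om = ModelFactory.createOntologyModel(
                OntModelSpec.OWL_MEM_MICRO_RULE_INF, data);
        om.addSubModel(schema);  // make the ontology visible to the reasoner

        // ...so it *is* an InfModel (OntModel extends InfModel)...
        InfModel asInf = om;

        // ...but it also offers convenience API a plain InfModel lacks:
        ExtendedIterator<Individual> it = om.listIndividuals();
        while (it.hasNext()) {
            System.out.println(it.next().getURI());
        }
    }
}
```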

So what's best from your point of view, if I just want to infer over the
statements? Should I run the inference with an OntModel and save the
result to the DB?

If all you want is an inference closure for some simple instance data and you are not otherwise using the OntModel API then sticking with your current approach is fine.

Dave
