>

The details depend slightly on the sort of inference configuration you
are using. Assuming you are using one of the built-in RDFS/OWL rule
configurations, then you presumably have an InfModel (or an OntModel)
which includes the annotation data, the ontology, and the deductions.
The brute-force way to store those in a database-backed model is:

    // appropriate locking or transaction creation
    datamodel.add( infModel );
    // appropriate unlocking or transaction commit

However, this will add a copy of the ontology and all of the inferences
to your datamodel.
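Spelled out, the brute-force copy looks roughly like the following. This is a minimal sketch assuming current Apache Jena package names and a TDB-backed dataset; the directory path "DB" and the use of the RDFS reasoner are illustrative assumptions, not part of the original answer.

```java
import org.apache.jena.query.Dataset;
import org.apache.jena.query.ReadWrite;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.ReasonerRegistry;
import org.apache.jena.tdb.TDBFactory;

public class StoreInferences {
    public static void main(String[] args) {
        // In-memory base model: load your annotation data and ontology here
        Model data = ModelFactory.createDefaultModel();
        InfModel infModel = ModelFactory.createInfModel(
                ReasonerRegistry.getRDFSReasoner(), data);

        // Database-backed dataset (the directory name is an assumption)
        Dataset dataset = TDBFactory.createDataset("DB");
        dataset.begin(ReadWrite.WRITE);      // transaction creation
        try {
            Model datamodel = dataset.getDefaultModel();
            datamodel.add(infModel);         // copies data + ontology + deductions
            dataset.commit();                // transaction commit
        } finally {
            dataset.end();
        }
    }
}
```

Note that `datamodel.add(infModel)` materializes the entailments: the reasoner is exercised once, and the database afterwards holds plain triples with no live inference attached.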


>>> Well, this is my code:

    Resource johnSmith = m.createResource(thesisURI)
            .addProperty(RDF.type, OntologySkripsiKomputasi.aplikasiDesktop);

    Model schema = FileManager.get().loadModel(RDF_FILE);
    Reasoner reasoner = ReasonerRegistry.getOWLReasoner();
    reasoner = reasoner.bindSchema(schema);

    InfModel infmodel = ModelFactory.createInfModel(reasoner, m);

    Resource yoga = infmodel.getResource(skripsiURI);

    System.out.println("thesisURI Inference :");
    printStatements(infmodel, yoga, null, null);

From here I've got the inference. Now, how do I store the statements that
printStatements prints, i.e. the inferences, into the database?


>
From the point of view of doing the inference there's no advantage to
storing the ontology in the database. You are better off using an
in-memory copy, which can just as easily be initialized from a file.

However, depending on the rest of your application it is often useful to 
*also* put the ontology in the data store so that it can be referenced 
in queries.
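Both halves of that advice can be sketched together. This is a minimal illustration assuming Apache Jena with TDB; the ontology file name, the dataset directory, and the named-graph URI are hypothetical placeholders.

```java
import org.apache.jena.query.Dataset;
import org.apache.jena.query.ReadWrite;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.riot.RDFDataMgr;
import org.apache.jena.tdb.TDBFactory;

public class OntologyPlacement {
    public static void main(String[] args) {
        // Keep the ontology in memory for the reasoner; the file name is illustrative
        Model schema = RDFDataMgr.loadModel("ontology.owl");

        // ...but *also* copy it into the data store, so SPARQL queries
        // against the database can reference ontology terms directly
        Dataset dataset = TDBFactory.createDataset("DB"); // directory is illustrative
        dataset.begin(ReadWrite.WRITE);
        try {
            // A named graph keeps the ontology separate from the instance data
            dataset.getNamedModel("urn:x-example:ontology").add(schema);
            dataset.commit();
        } finally {
            dataset.end();
        }
    }
}
```

Keeping the stored copy in its own named graph makes it cheap to replace when the ontology changes, without touching the instance data.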

>>> Based on my code above, I'm confused about the difference between
>>> inference with an OntModel and using ModelFactory.createInfModel.
So which is best, from your point of view, if I just want to infer from
statements? Should I run inference over the ontology using an OntModel and save the result to the DB?
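For reference, the two routes are largely equivalent: an OntModel built with an inference-enabled OntModelSpec wraps the same rule-reasoner machinery that ModelFactory.createInfModel exposes directly, while adding the ontology-oriented convenience API. A sketch of both, with empty placeholder models standing in for the schema and data from the question:

```java
import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.Reasoner;
import org.apache.jena.reasoner.ReasonerRegistry;

public class TwoRoutesToInference {
    public static void main(String[] args) {
        Model schema = ModelFactory.createDefaultModel(); // your ontology
        Model data = ModelFactory.createDefaultModel();   // your instance data

        // Route 1: bind the schema to a reasoner, then wrap the data
        Reasoner reasoner = ReasonerRegistry.getOWLReasoner().bindSchema(schema);
        InfModel infModel = ModelFactory.createInfModel(reasoner, data);

        // Route 2: an OntModel whose OntModelSpec selects a built-in OWL rule reasoner
        OntModel ontModel = ModelFactory.createOntologyModel(
                OntModelSpec.OWL_MEM_RULE_INF, data);
        ontModel.addSubModel(schema);

        // Both expose the entailed statements through listStatements()
    }
}
```

Route 1 is leaner when you only need the inferred triples; Route 2 is convenient when you also want the OntClass/OntProperty accessors.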

Thx for your help, Dave.
