Hi Dave,

When you create an OntModel you can pass in a base model.
So in this case just pass in your "model" (though the dataset will still need to be open).

The OntModelSpec you pass when creating the OntModel determines whether any inference is enabled.
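As a minimal sketch (the spec here is just one illustrative choice, not the only option):

Dataset ds = TDBFactory.createDataset( location );
Model base = ds.getDefaultModel();

// Wrap the TDB-backed model in an OntModel. OWL_MEM_RDFS_INF enables
// RDFS-level inference; plain OWL_MEM would give the Ont API with no
// reasoner attached.
OntModel ont = ModelFactory.createOntologyModel(
        OntModelSpec.OWL_MEM_RDFS_INF, base );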

Be warned that inference over a TDB-backed model will be even slower than inference over a memory-backed model. For this reason a common pattern is to compute the inferences you want in memory, then add those as a graph in TDB and use a non-inference model to access the union. The precise details vary widely according to what you are trying to do.
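Continuing the sketch above, that pattern might look roughly like this (assuming RDFS-level rules are enough and the data fits in memory; the graph name is an arbitrary choice):

// Run the reasoner over an in-memory copy of the raw data.
Model mem = ModelFactory.createDefaultModel().add( base );
InfModel inf = ModelFactory.createRDFSModel( mem );

// Persist only the derived triples, in a named graph alongside the raw data.
Model inferred = ds.getNamedModel( "urn:example:inferred" );
inferred.add( inf.getDeductionsModel() );

// Query raw + inferred data together, with no reasoner attached.
Model union = ModelFactory.createUnion( base, inferred );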

Dave


On 30/04/12 15:19, DAVID PATTERSON wrote:

For now, I'm successfully building a plain model, reading a set of .ttl
files into it, creating a TDB database and using it with Fuseki.

I'd like to try using an OntModel or InfModel to start getting some
additional entailments in the data.

My current code looks like this:

Dataset ds = TDBFactory.createDataset( newLocation ) ; // a string
Model model = ds.getDefaultModel() ;

RDFReader rdfRdr = model.getReader( "TTL" );
for ( String fn : files )
{
    InputStream dataStream = new FileInputStream( fn );
    rdfRdr.read( model, dataStream, "" );
    dataStream.close();
}
// Close the dataset.
ds.close();

What should I do to get a more powerful model?

Thanks.

Dave Patterson
