Hi there,

If you query the dataset directly, do you see any triples? You will see different results if you query via the inference model. The inference is not done in the database but in the inference engine associated with the OntModel. The database does not contain the deductions.

You can store the inferred results to a database by writing the whole OntModel to a graph in the database.
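For example, something along these lines (a minimal sketch; the output
directory "./MyDatabases/InferredTDB" and the OntModel variable "om" are
illustrative):

    Dataset out = TDBFactory.createDataset("./MyDatabases/InferredTDB");
    Model stored = out.getDefaultModel();
    // Model.add(Model) copies every statement visible in the OntModel,
    // i.e. the base triples plus the deductions, into the TDB graph.
    stored.add(om);
    out.close();

Note the stored deductions are static: they are not updated if the base
data changes, so you would need to rewrite the model after updates.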

        Andy

On 10/10/13 13:21, Dibyanshu Jaiswal wrote:
Here is a sample code of the above stated problem.

import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.query.Query;
import com.hp.hpl.jena.query.QueryExecution;
import com.hp.hpl.jena.query.QueryExecutionFactory;
import com.hp.hpl.jena.query.QueryFactory;
import com.hp.hpl.jena.query.ResultSet;
import com.hp.hpl.jena.query.ResultSetFormatter;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.tdb.TDBFactory;

public class ReadTDB {

    public static void main(String[] args) {

        // open the TDB dataset created beforehand
        String directory = "./MyDatabases/OntologyTDB";
        Dataset dataset = TDBFactory.createDataset(directory);

        // read the default model from the TDB store
        Model tdb = dataset.getDefaultModel();

        // wrap the TDB-backed model in an OntModel (OWL_MEM: no inference)
        OntModel m = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM, tdb);

        String sparqlQueryString =
            "SELECT ?s WHERE { ?s <http://www.w3.org/2000/01/rdf-schema#subClassOf> "
            + "<http://www.owl-ontologies.com/unnamed.owl#ABCD> }";

        Query query = QueryFactory.create(sparqlQueryString);
        QueryExecution qexec = QueryExecutionFactory.create(query, m); // LINE OF CONSIDERATION
        ResultSet results = qexec.execSelect();

        ResultSetFormatter.out(results);

        qexec.close();
        tdb.close();
        dataset.close();
    }
}




As per the above code snippet, at the line marked as "LINE OF
CONSIDERATION": when I pass the OntModel m as the parameter, the results
are in accordance with the inference mechanisms (such as transitive
relations), but if I change the parameter to the dataset, i.e.
QueryExecution qexec = QueryExecutionFactory.create(query, dataset) ;
and then execute the query, the results are not the same.
From my observation, a query made directly to the Dataset/TDB does not
provide the inference mechanisms provided by the OntModel, even when the
TDB is created as follows:

public static OntModel createTDBFromOWL() {

    Dataset dataset = TDBFactory.createDataset("./MyDatabases/OntologyTDB");
    Model m = dataset.getDefaultModel();
    OntModel om = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM_RULE_INF, m);
    FileManager.get().readModel(om, "./OWLs/MyOWLfile.owl");
    return om;
}







Is there some way to create a Dataset object which is inference-enabled,
similar to the creation of an OntModel like:
   OntModel om = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM_RULE_INF, m);
so that the dataset supports inferencing mechanisms?


On Thu, Oct 10, 2013 at 3:51 PM, Andy Seaborne <[email protected]> wrote:

On 10/10/13 10:12, Dibyanshu Jaiswal wrote:

Hi!

I am new to semantic web technologies, and have started with RDF/OWL for
making web applications.
Currently I have a requirement for accessing an ontology (an OntModel with
OntModelSpec.OWL_MEM_RULE_INF) from an OWL file. I am also able to store it
in a local TDB, all done with Jena 2.11.0. Thanks for the nice API and
tutorial provided for the same.
I need to run SPARQL queries on the model to get some fruitful results.
Once the TDB is created, queries run against it directly do not give the
results I expect.

My SPARQL query, when made directly against the TDB Dataset, does not
return results in accordance with the inference rules. Whereas if the same
query is run on the OntModel itself (loaded from the TDB, with
OntModelSpec.OWL_MEM_RULE_INF), the results are as expected.


Generally, showing the details of what you are doing makes it easier for
people to provide answers.  The details matter :-)



How do I solve the problem of making queries directly to the Dataset and
not to an OntModel with inference rules enabled?


Inference is a characteristic of the model (in RDF, inference happens
within models/graphs, not between graphs).

You need to create an OntModel backed by a graph from the TDB store.
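
For example (a minimal sketch; the directory name is illustrative, and
"query" is assumed to be a Query you have already built):

    Dataset dataset = TDBFactory.createDataset("./MyTDB");
    OntModel om = ModelFactory.createOntologyModel(
            OntModelSpec.OWL_MEM_RULE_INF, dataset.getDefaultModel());
    // Query the OntModel, not the dataset, so the rule engine
    // can supply the deductions.
    QueryExecution qexec = QueryExecutionFactory.create(query, om);

The deductions are computed in memory each time the OntModel is created;
only the base triples live in TDB.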

         Andy


  Please help!!
Thanks in Advance!!





