This is probably easier

https://s3-ap-southeast-2.amazonaws.com/kamal-documents/KamalDisk_1/homes/kamal/CloudStation/DB.ttl


On 14 Jan 2015, at 9:00 pm, Kamalraj Jairam <[email protected]> wrote:

Hi Rob,

See if you can access this link

https://drive.google.com/open?id=0B_OTAjv2l9fOekNPaDlMUVBFbjQ&authuser=1

Thanks
Kamalraj


On 14 Jan 2015, at 8:45 pm, Rob Vesse <[email protected]> wrote:

Attachments are filtered from the list; please copy and paste the data into
an email, or provide a link to the data on a file-sharing service of your
choice.

Rob

From:  Kamalraj Jairam <[email protected]>
Reply-To:  <[email protected]>
Date:  Wednesday, 14 January 2015 06:03
To:  <[email protected]>
Subject:  Inferencing

Hello All,

I have attached a small ontology for which creating an InfModel takes ages;
even the Pellet reasoner takes 10 minutes to do anything.

Any help would be appreciated.

I have attached the unit test as well:


@Test
public void reasoningTest() {

  // Ontology (schema) model and a separate model for the instance data
  OntModel ontModel = ModelFactory.createOntologyModel();
  Model data = ModelFactory.createDefaultModel();

  // Load the attached ontology into the schema model
  Model model = RDFDataMgr.loadModel("/Users/kamalrajjairam/Downloads/DB.ttl", Lang.TURTLE);
  ontModel.add(model);

  Reasoner reasoner = PelletReasonerFactory.THE_SPEC.getReasoner();

  // A single instance of http://test.com#A in the data model
  OntClass classA = ontModel.createClass("http://test.com#A");
  Resource instanceA = data.createResource("http://test.com#A1");
  data.add(data.createStatement(instanceA, RDF.type, classA));

  // Bind the schema to Pellet and build the inference model over the data
  reasoner = reasoner.bindSchema(ontModel);
  InfModel infModel = ModelFactory.createInfModel(reasoner, data);
  infModel.add(ontModel);

  // Writing the inference model out iterates all statements, which forces
  // the reasoner to compute the entailments
  ByteArrayOutputStream namedSchemaStream = new ByteArrayOutputStream();
  infModel.write(namedSchemaStream, "TURTLE");
  infModel.getNsPrefixMap();
}
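
For comparison, here is a minimal sketch of the same setup (assuming Pellet's
Jena bindings and the same DB.ttl path) that lets PelletReasonerFactory.THE_SPEC
build a Pellet-backed OntModel directly over the data, rather than binding the
schema to the reasoner and then re-adding it to the InfModel:

@Test
public void reasoningTestDirect() {

  // Load the schema plus instance data straight into a base model
  Model base = RDFDataMgr.loadModel("/Users/kamalrajjairam/Downloads/DB.ttl", Lang.TURTLE);

  // THE_SPEC is an OntModelSpec wired to Pellet, so the returned OntModel
  // is already an inference model over the base model
  OntModel pelletModel = ModelFactory.createOntologyModel(PelletReasonerFactory.THE_SPEC, base);

  // Add a single typed instance; createResource(uri, type) asserts the rdf:type
  Resource instanceA = pelletModel.createResource("http://test.com#A1",
      pelletModel.createClass("http://test.com#A"));

  // Writing the model out still forces all entailments to be computed
  ByteArrayOutputStream out = new ByteArrayOutputStream();
  pelletModel.write(out, "TURTLE");
}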
Thanks
Kamalraj



