Hi Alison,
I see that you are actually using OWLIM. Have you looked at this page?
http://owlim.ontotext.com/display/OWLIMv43/OWLIM-SE+Usage+Scenarios#OWLIM-SEUsageScenarios-UsingOWLIMSEwithJena
Regards,
Jerven
On 02/03/2012 05:37 PM, Alison Callahan wrote:
Hi again Peter and all,
I did eventually get the JenaSesame code working (I had forgotten to add
VM arguments to the JUnit test to increase the heap size), and was able
to list the statements in the remote Sesame repository that was
connected to a Jena model using
JenaSesame.createModel(repositoryConnection).
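(For the record, the missing VM argument was a heap-size flag along these
lines, set in the JUnit run configuration; 4g is just an example value,
adjust it to the memory you have available:)

```
-Xmx4g
```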
However, when I added the Jena model connected to the OWLIM Sesame
repository as a submodel of the model passed into the
SPINInferences.run() method and ran SPIN inferencing, no inferences
were produced.
Here is the relevant code:
****************************************
SPINModuleRegistry.get().init();

Model inputModel = ModelFactory.createDefaultModel();
OntModel spinModel = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM);

// add system triples
spinModel.add(SystemTriples.getVocabularyModel());
// add SPIN rules
spinModel.read("http://localhost/tmp/rules.spin.rdf");
// register locally defined functions
SPINModuleRegistry.get().registerAll(spinModel, null);

// connect to the OWLIM repository
repositoryManager = new RemoteRepositoryManager("http://localhost:8080/openrdf-sesame");
try {
    // initialise the repository manager
    repositoryManager.initialize();
    // get the repository
    repository = repositoryManager.getRepository("test");
    // open a connection to this repository
    repositoryConnection = repository.getConnection();
    // create the DatasetGraph instance
    // dataset = new SesameDataset(repositoryConnection);
    // wrap the connection in a Jena model
    owlimModel = JenaSesame.createModel(repositoryConnection);
} catch (RepositoryException e) {
    e.printStackTrace();
} catch (RepositoryConfigException e) {
    e.printStackTrace();
}

// add the OWLIM-backed model
spinModel.addSubModel(owlimModel);
// add the input model
spinModel.addSubModel(inputModel);

// create an output model to hold the inferred statements
Model outputModel = ModelFactory.createDefaultModel();
// add the output model to the SPIN model
spinModel.addSubModel(outputModel);

// run SPIN inferences
SPINInferences.run(spinModel, outputModel, null, null, false, null);

// get the number of inferred statements
// System.out.println("Inferred triples: " + outputModel.size());
// iterate over the inferred statements
StmtIterator itr = outputModel.listStatements();
System.out.println("Iterating over statements");
while (itr.hasNext()) {
    Statement s = itr.next();
    System.out.println(s.toString());
}
****************************************
So, I need another solution ... are there any working examples of using
the SPIN API to execute SPIN rules over a Jena model that is connected
to a remote Sesame repository? Presumably TopBraid Composer does
something similar in order to manage inferencing over large sets of
triples, but is there a way to reproduce this behaviour using the SPIN API?
Any suggestions are much appreciated!
Alison
On Thu, Feb 2, 2012 at 4:04 PM, Peter Ansell <[email protected]> wrote:
Hi Alison,
The GraphRepository that JenaSesame creates in the background is
definitely a streaming wrapper, using iterators to access the results,
so the wrapper itself is unlikely to be causing the issue, although it
could be.
I don't think any of the current JenaSesame tests use listNameSpaces,
so we will need to do some testing to see why that is happening. The
nature of that method is such that Jena itself needs to store all of
the results in memory, to make sure the result set is distinct, before
emitting any results. Have you tested the performance of some of the
other Model methods? Does listSubjects (also distinct-based, so it may
have the same issue) or listStatements (which should stream results
cleanly) give you results?
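The memory difference between the distinct-based and streaming methods
can be sketched in plain Java; these are toy iterators standing in for
statements arriving from a remote repository, not the actual JenaSesame
classes:

```java
import java.util.Iterator;
import java.util.LinkedHashSet;
import java.util.Set;

public class DistinctVsStreaming {

    // Lazily generates n values drawn from only 100 distinct namespaces,
    // standing in for results arriving from a remote repository.
    static Iterator<String> namespaces(final int n) {
        return new Iterator<String>() {
            int i = 0;
            public boolean hasNext() { return i < n; }
            public String next() { return "ns" + (i++ % 100); }
        };
    }

    // Streaming (like listStatements): constant memory, each element
    // can be emitted as soon as it arrives.
    static int countStreaming(Iterator<String> it) {
        int count = 0;
        while (it.hasNext()) { it.next(); count++; }
        return count;
    }

    // Distinct-based (like listNameSpaces or listSubjects): every value
    // seen so far must be remembered before uniqueness can be guaranteed,
    // so memory grows with the data rather than staying constant.
    static Set<String> collectDistinct(Iterator<String> it) {
        Set<String> seen = new LinkedHashSet<String>();
        while (it.hasNext()) {
            seen.add(it.next());
        }
        return seen;
    }

    public static void main(String[] args) {
        System.out.println(countStreaming(namespaces(1000000)));         // 1000000
        System.out.println(collectDistinct(namespaces(1000000)).size()); // 100
    }
}
```

The distinct-based collector has to hold its whole seen-set (and, over a
large repository, potentially the whole result set) in memory before
anything is returned, which matches the OutOfMemory behaviour seen with
listNameSpaces.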
Note that JenaSesame is set up to use jena-2.7.0-incubating, so it will
not work with SPIN-API just yet, as the current focus is on getting a
jena-2.6.4-based release out. I have done some experiments with getting
SPIN-API to work with jena-2.7.0-incubating [1] to get around this, but
they haven't gone far yet.
Cheers,
Peter
[1] https://github.com/ansell/spin/tree/feature/jena27upgrade
On 3 February 2012 04:05, Alison Callahan <[email protected]> wrote:
> Hi Peter,
>
> Thanks for your help. I have forked your JenaSesame repository and am
> trying it out with my code.
>
> I have instantiated a Jena model from a remote OWLIM repository using
> JenaSesame.createModel(repositoryConnection), and tried to do a simple
> test using the model.listNameSpaces() method, with the result that
> after 3200 seconds I got a Java OutOfMemoryError. I had given Eclipse
> 4GB to work with.
>
> All that to say, can you tell me more about how the
> JenaSesame.createModel(RepositoryConnection connection) method works?
> Does it populate a model with all of the statements in an OWLIM Sesame
> repository, or does it access them as needed?
>
> Thanks again,
>
> Alison
>
>
> On Tue, Jan 31, 2012 at 6:19 PM, Peter Ansell
> <[email protected]> wrote:
>>
>> Hi Alison,
>>
>> I am not sure if there is an existing solution, but you could try the
>> JenaSesame library from either my fork at
>> https://github.com/ansell/JenaSesame or Andy Seaborne's at
>> https://github.com/afs/JenaSesame . I haven't quite completed and
>> tested the sesame-jena implementation, but when I do I will send a
>> pull request back to Andy's fork. You shouldn't need to use
>> sesame-jena for this, from what I can tell.
>>
>> The jena-sesame module (as opposed to the sesame-jena module), which
>> should be very similar in implementation between my fork and Andy's,
>> enables the creation and use of either a Jena DatasetGraph or a Jena
>> Model as a wrapper around an existing Sesame RepositoryConnection. It
>> is still very alpha software, but all of the tests that exist pass.
>> Feel free to fork it and send changes back if you find bugs.
>>
>> Hope that helps,
>>
>> Peter
>>
>> On 1 February 2012 08:17, Alison Callahan
>> <[email protected]> wrote:
>> > Hello all,
>> >
>> > I am using the SPIN API 1.2.0 to execute SPIN rules using the
>> > SPINInferences.run(com.hp.hpl.jena.rdf.model.Model queryModel,
>> > com.hp.hpl.jena.rdf.model.Model newTriples,
>> > SPINExplanations explanations, List<SPINStatistics> statistics,
>> > boolean singlePass, ProgressMonitor monitor) method. However,
>> > rather than having to load all of my SPIN rules/linked
>> > data/ontologies into a Jena model (quite memory intensive), I would
>> > instead like to connect to an OWLIM Sesame repository, as TopBraid
>> > Composer does.
>> >
>> > I am able to connect to an OWLIM Sesame repository from Jena using
>> > the DatasetGraph interface (see
>> > http://owlim.ontotext.com/display/OWLIMv43/OWLIM-SE+Installation#OWLIM-SEInstallation-InstantiateOWLIMSEadapterusingtheprovidedAssembler),
>> > but it would defeat the purpose of connecting to an OWLIM Sesame
>> > repository to then load the DatasetGraph into a Jena model to pass
>> > into SPINInferences.run().
>> >
>> > Can anyone provide suggestions as to how to pass a Jena
DatasetGraph, or
>> > otherwise connect to a Sesame repository such that I can use
it with
>> > SPINInferences.run()?
>> >
>> > Thanks,
>> >
>> > Alison
>> >
>> > --
>> > You received this message because you are subscribed to the Google
>> > Group "TopBraid Suite Users", the topics of which include Enterprise
>> > Vocabulary Network (EVN), TopBraid Composer, TopBraid Live,
>> > TopBraid Ensemble, SPARQLMotion and SPIN.
>> > To post to this group, send email to [email protected]
>> > To unsubscribe from this group, send email to
>> > [email protected]
>> > For more options, visit this group at
>> > http://groups.google.com/group/topbraid-users?hl=en
>>
>
>
--
-------------------------------------------------------------------
Jerven Bolleman [email protected]
Swiss Institute of Bioinformatics Tel: +41 (0)22 379 58 85
CMU, rue Michel Servet 1 Fax: +41 (0)22 379 58 58
1211 Geneve 4,
Switzerland www.isb-sib.ch - www.uniprot.org
Follow us at https://twitter.com/#!/uniprot
-------------------------------------------------------------------