Hello Holger,
Thanks a lot for your efforts, I really appreciate them.
I've been trying to avoid the SPIN RDF syntax within my ontologies, but of
course imported third-party ontologies such as SPL also need to be
modified, which I'd rather avoid.
But your quick fix of removing the blank nodes is actually fine for my use
case, so I'm happy I can proceed like this. I can see that the filter
applied by TBC is a little more subtle (i.e. it doesn't filter out blank
nodes that are connected to a "non-builtin" node), which I can easily
reproduce in my Java application.
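For reference, the filter I have in mind looks roughly like this. This is a
self-contained sketch: the string-based triple representation and the list of
"builtin" namespaces are my own assumptions for illustration, not TBC's
actual code.

```java
import java.util.ArrayList;
import java.util.List;

public class InferenceFilter {

    // Simplified triple representation for the sketch (not Jena's Statement)
    record Triple(String subject, String predicate, String object) {}

    // Assumed set of "builtin" namespaces; TBC's actual list may differ
    private static final List<String> BUILTIN_NAMESPACES = List.of(
            "http://spinrdf.org/sp#",
            "http://spinrdf.org/spin#",
            "http://spinrdf.org/spl#",
            "http://www.w3.org/2002/07/owl#");

    static boolean isBlank(String node) {
        // Blank nodes are written with the "_:" label convention here
        return node.startsWith("_:");
    }

    static boolean isBuiltin(String node) {
        for (String ns : BUILTIN_NAMESPACES) {
            if (node.startsWith(ns)) {
                return true;
            }
        }
        return false;
    }

    /**
     * Keeps statements with URI subjects; keeps blank-node subjects only if
     * one of their statements points at a non-blank, non-builtin resource.
     */
    static List<Triple> filter(List<Triple> inferred) {
        List<Triple> kept = new ArrayList<>();
        for (Triple t : inferred) {
            if (!isBlank(t.subject())) {
                kept.add(t);
                continue;
            }
            boolean touchesNonBuiltin = inferred.stream()
                    .filter(x -> x.subject().equals(t.subject()))
                    .anyMatch(x -> !isBlank(x.object()) && !isBuiltin(x.object()));
            if (touchesNonBuiltin) {
                kept.add(t);
            }
        }
        return kept;
    }

    public static void main(String[] args) {
        List<Triple> inferred = List.of(
                new Triple("http://example.org/Alice", "rdf:type", "http://example.org/Person"),
                new Triple("_:b1", "rdf:type", "http://spinrdf.org/sp#Construct"),
                new Triple("_:b2", "http://example.org/knows", "http://example.org/Bob"));
        // _:b1 only touches builtin vocabulary and is dropped;
        // the other two statements survive the filter
        for (Triple t : filter(inferred)) {
            System.out.println(t.subject() + " " + t.predicate() + " " + t.object());
        }
    }
}
```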
So for me this problem is solved, many thanks.
Kind regards,
Wim
On Sunday, 4 January 2015 04:02:33 UTC+1, Holger Knublauch wrote:
>
> Hi Wim,
>
> (with apologies for the delay, I hope you had a great holiday season too).
>
> I still think the best solution is to avoid the SPIN RDF syntax in cases
> like this, where you sequence rules and constraint checking and the
> inferences can modify the very constraints that are then executed. The
> class SPTextUtil can be used to convert between the two formats. For the
> next generation of SPIN that we are currently proposing to the RDF Data
> Shapes working group, this problem has been resolved by dropping the SPIN
> RDF syntax altogether - if you only have textual query strings, no such
> messy interactions can happen.
>
> But here is another suggestion. If your inferences are only about URI
> resources and not blank nodes, you can post-process the inferences so that
> no blank nodes get modified. Do the following after you run the inferences:
>
> // Remove inferences on blank nodes
> for (Statement s : newTriples.listStatements().toList()) {
>     if (s.getSubject().isAnon()) {
>         newTriples.remove(s);
>     }
> }
>
> TopBraid honors the spin:imports triples (while the SPIN API does not by
> default). You can mimic that behavior by building separate union graphs as
> shown in OWLRLExample.java:
>
> // Collect rules (and template calls) defined in OWL RL
> Map<Resource, List<CommandWrapper>> cls2Query =
>     SPINQueryFinder.getClass2QueryMap(unionModel, queryModel, SPIN.rule, true, false);
> Map<Resource, List<CommandWrapper>> cls2Constructor =
>     SPINQueryFinder.getClass2QueryMap(queryModel, queryModel, SPIN.constructor, true, false);
> SPINRuleComparator comparator = new DefaultSPINRuleComparator(queryModel);
>
> but that's tricky and even then there may still be interference between
> the inferences and the SPIN RDF syntax.
>
> HTH
> Holger
>
>
> On 12/25/2014 0:57, Wim wrote:
>
> Hello Holger,
>
> Sorry for keeping this problem alive, but after many hours of trying I
> still don't see a good solution.
>
> I think splitting the rules from the schema would be overly complex:
> first of all, I'd need to split 20+ ontologies. And secondly, the Java
> code would need to loop over the iterations, so that *at each iteration*
> I can *first* import/run/add the OWL-RL inferences and *then*
> import/run/add the custom inferences.
>
> I now think the key is that running the inferences with TBC apparently
> only generates triples related to the "custom" ontologies. So TBC doesn't
> "break" ontologies such as SPL, even if they import the OWL-RL rules. If I
> can mimic this behaviour, my problem is solved much more cleanly.
>
> So could you please advise on how I can achieve this? For example, by
> simply filtering the inferred triples (and if so, based on what criteria)?
>
> It also seems that spin:imports is ignored by the SPIN API, but is taken
> into account by TBC. Is this the intended behaviour?
>
> Thanks a lot, again...
> (and you're welcome to answer after the holidays, of course)
>
> Wim
>
>
> On Wednesday, 24 December 2014 10:00:56 UTC+1, Holger Knublauch wrote:
>
> Hi Wim,
>
> converting the SPIN RDF to sp:text should certainly help in any case.
>
> Alternatively, you may also want to look into separating the SPIN
> definitions from the schema into different graphs (files), because this
> way you can control which graphs are visible for inferencing vs.
> constraint checking. The OWL RL rules don't need to run over the SPIN
> triples, so you could make them invisible at this stage (which will also
> give you fewer uninteresting inferences).
>
> The idea with your own root class may only work if the SPIN triples are in
> a separate graph.
>
> Good luck
> Holger
>
>
>
> On 12/24/2014 17:56, Wim wrote:
>
> Hello Holger,
>
> Thanks a lot for your efforts.
>
> However my constraints must be checked against the inferred triples also
> (just like the Kennedys example), so removing the inferred triples before
> checking the constraints isn't an option.
>
> I have full control over my ontologies, so I can modify them if needed. Do
> you think the following would work: create a "root" superclass of all my
> custom classes, and attach the owlrl rules to this class (instead of
> owl:Thing, as owlrl-all does)? I guess any SPIN rules, constraints or
> SPL functions would be untouched this way, since they're not instances of
> (a subclass of) this root class.
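>
> In Turtle, I mean something like this (ex:RootClass and the rule body are
> purely illustrative names of my own, just to show a rule attached to a
> custom root class via sp:text instead of owl:Thing):
>
> ```turtle
> @prefix ex:   <http://example.org/ns#> .
> @prefix owl:  <http://www.w3.org/2002/07/owl#> .
> @prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
> @prefix sp:   <http://spinrdf.org/sp#> .
> @prefix spin: <http://spinrdf.org/spin#> .
>
> # Hypothetical root class of all my custom classes; rules attached here
> # only fire for instances of ex:RootClass and its subclasses.
> ex:RootClass a owl:Class .
>
> ex:RootClass spin:rule [
>     a sp:Construct ;
>     sp:text """
>         CONSTRUCT { ?this a ?super . }
>         WHERE { ?this a ?type . ?type rdfs:subClassOf ?super . }
>     """
> ] .
> ```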
>
> In TBC I can also check "generate sp:text of queries and no SPIN RDF
> triples", but I guess this won't be needed anymore if I apply the above
> solution.
>
> Do you think it's a good idea to modify my ontologies this way, or do you
> see other problems (or solutions)?
>
> Thanks a lot for advising,
>
> Wim
>
>
> On Wednesday, 24 December 2014 05:07:33 UTC+1, Holger Knublauch wrote:
>
> Hi Wim,
>
> Wow, that was a tricky one... The problem is that when you run the OWL-RL
> rules in that context graph, they will add inferences over the SPIN RDF
> triples themselves, and thus modify the SPARQL behind the constraints!
> Before running the constraints you need to add
>
> // Detach the inferences before running all constraints
> ontModel.removeSubModel(newTriples);
>
> to make sure that the OWL-RL inferences do not interfere with the SPIN
> vocabulary itself.
>
> I am moving to a text-based SPARQL representation, bypassing the SPIN RDF
> triple notation, and this problem is one instance where it is better to
> stay with the textual syntax only.
>
> Thanks for your patience on that matter,
> Holger
>
>
> On 12/23/2014 23:42, Wim wrote:
>
> Hello Holger,
>
> Just an addendum to my previous post: in your example code, you're adding
> "http://topbraid.org/spin/owlrl" instead of
> "http://topbraid.org/spin/owlrl-all".
> If I try your code with owlrl-all, the problem (bug?) also appears.
>
> Thanks,
>
> Wim
>
>
>
> On Monday, 22 December 2014 01:15:46 UTC+1, Holger Knublauch wrote:
>
>
>
> On 12/19/2014 23:06, Wim wrote:
>
> Hi Mark,
>
> thanks for your time.
>
> In fact I've tracked down the problem using the examples from the SPIN
> distribution: the problem seems to occur because my ontologies import
> both owlrl-all and spl.
> You can verify this easily: download the KennedysSPIN ontology and add:
>
> <owl:imports rdf:resource="http://spinrdf.org/spl"/>
> <owl:imports rdf:resource="http://topbraid.org/spin/owlrl-all"/>
>
> If you now run KennedysInferencingAndConstraintsExample.java (modified so
> it reads the local KennedysSPIN file), it will fail.
>
>
> Hi Wim,
>
> I have not been able to reproduce this problem yet. The error looks like
> it failed to parse some SPARQL query. The online version of owlrl uses the
> text-based SPIN syntax with sp:text, and in order to parse those, it needs
> certain prefixes to be visible. However, even if I make the modifications
> you suggest, it still seems to work OK. Instead of modifying the
> kennedysSPIN file, I have added the owl:imports triples manually (which
> should have the same effect):
>
> public class KennedysInferencingAndConstraintsExample {
>
>     public static void main(String[] args) {
>
>         // Initialize system functions and templates
>         SPINModuleRegistry.get().init();
>
>         // Load main file
>         Model baseModel = ModelFactory.createDefaultModel();
>         baseModel.read("http://topbraid.org/examples/kennedysSPIN");
>         baseModel.add(
>                 ResourceFactory.createResource("http://topbraid.org/examples/kennedysSPIN"),
>                 OWL.imports,
>                 ResourceFactory.createResource("http://topbraid.org/spin/owlrl"));
>         baseModel.add(
>                 ResourceFactory.createResource("http://topbraid.org/examples/kennedysSPIN"),
>                 OWL.imports,
>                 ResourceFactory.createResource(SPL.BASE_URI));
>
>         // Create OntModel with imports
>         ...
>
> Despite this change, it still works fine. Could you try to narrow down,
> step by step, what is different in your scenario? I can see you use an
> ont-policy file, which is not used by the default example code.
>
> Thanks
> Holger
>
>
>
> Is this a bug? Would there be a simple workaround? Any idea why this only
> fails using the SPIN API and not in TBC?
>
> Thanks a lot for your help,
>
> Wim
>
>
>
> On Thursday, 18 December 2014 21:35:50 UTC+1, Mark Graham wrote:
>
> Hi Wim,
>
> Thanks for choosing our TopBraid product. Our development team has
> reviewed this issue and here are the recommended steps to address your
> question.
>
> Please use the code from
> http://www.topquadrant.com/repository/spin/org/topbraid/spin/1.4.0/ to do
> the following steps:
> - Run the inferences (org.topbraid.spin.inference.SPINInferences) on your
> model.
> - Add the triples generated from the inferences to a model.
> - Run the constraints (org.topbraid.spin.constraints.SPINConstraints) on
> the model containing the SPIN inferences.
>
>
>
> Thanks,
> Mark
>
>
> Mark Graham
> TopQuadrant Support
>
>
>
> On Thu, Dec 18, 2014 at 8:20 AM, Wim <[email protected]> wrote:
>
> Hello,
>
> I'm trying to do the following:
> 1) read an ontology (and all its imports), generate all inferences
> 2) show all constraint violations.
>
> With TBC SE, I can do this without a problem: I just open the ontology,
> generate the inferences, and check the constraint problems.
>
> However if I do the same with the SPIN API, only the first step succeeds.
> SPINConstraints.check throws a QueryParseException. See full stack trace:
> http://pastebin.com/r85LbAVg
>
> This is my code, based on your examples:
> http://pastebin.com/pvhA0Q1n
>
> Strangely, if I first use TBC to export everything (all ontologies +
> inference graph) into a single file, and apply the CheckConstraints.java
> example, it works as expected.
> However if I use the RunInferences.java example to generate the inferences
> and store everything in one file (as is done in my code mentioned above,
> after the exception is thrown), the CheckConstraints.java example fails
> again. There's a little more diagnostic output now; apparently it fails
> for some "spl" function: http://pastebin.com/WeXfhPa6
> If I remove this spl function, it fails on another spl function, and so
> on.
>
> I use an ont-policy.rdf file to control which
> spl.spin.rdf/spin.rdf/sp.rdf/... files are used by the script. I've tried
> using the RDF files that can be resolved online, those that came with TBC,
> and those that are part of the spinapi jar file. All show the same error.
>
> To run the inferences and constraint checks I will continue to use TBC,
> but I'd also like to do this in an automated way via the SPIN API.
>
> Any idea what's going wrong?
>
> Thanks a lot for your help,
>
> Wim
--
-- You received this message because you are subscribed to the Google
Group "TopBraid Suite Users", the topics of which include Enterprise Vocabulary
Network (EVN), TopBraid Composer, TopBraid Live, TopBraid Insight,
SPARQLMotion, SPARQL Web Pages and SPIN.
To post to this group, send email to
[email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/topbraid-users?hl=en
---
You received this message because you are subscribed to the Google Groups
"TopBraid Suite Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email
to [email protected].
For more options, visit https://groups.google.com/d/optout.