> Anyway, these rules take quite a long time (about 90 seconds) to execute on
> my OWL_MEM model, which isn't all that big at about 3000 triples including
> all the imported models. Does that sound plausible? If so, I'm thinking I
> may need to add these inferences in a more targeted way.
In general, as you certainly know, the performance of SPARQL queries depends strongly on the implementation of the engine and on whether (or not) it re-orders clauses automatically. It is best to assume that ARQ will execute the clauses (triple matches and filters) from top to bottom, but looking at the SPARQL Debugger in TBC will help you identify the real execution order. In particular, FILTER clauses may be re-ordered, with a significant performance penalty. There is a new option on the SPARQL preferences tab in TBC that forces ARQ to leave the FILTER clauses in place, which is IMHO recommended (see the small example in the P.S. below).

In SPIN in particular, other factors are important:

- If a class has many subclasses with instances, there may be many individual SPARQL calls for each rule defined on the superclasses.
- Each iteration adds a clause that binds ?this to all instances of a class.

If these cause performance issues, you can bypass the binding of ?this by either:

- making the rules global (drop ?this and attach them to owl:Thing or rdfs:Resource instead); such global rules are executed only once, or
- using the new SPIN 1.1 property spin:thisUnbound to drop the ?this rdf:type <class> clause, which may in theory slow other things down (see the sketch in the P.S. below).

After you run inferences in TBC, the Error Log will contain a list of the slowest queries, together with benchmarks.

I hope this helps... let me know if you have ideas on how to improve performance tuning.

Regards,
Holger
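P.S. Two quick sketches to illustrate the points above, both using a made-up ex: namespace and properties rather than anything from your model.

First, FILTER placement: assuming top-to-bottom execution, the FILTER below is evaluated as soon as ?age is bound and before the OPTIONAL join, so the intermediate result stays small.

    # Hypothetical ex: namespace and properties, for illustration only.
    PREFIX ex: <http://example.org/ns#>
    SELECT ?person ?email
    WHERE {
        ?person a ex:Person .
        ?person ex:age ?age .
        # Keep the filter right after the pattern that binds ?age, so it
        # prunes solutions before the optional join below.
        FILTER (?age >= 18)
        OPTIONAL { ?person ex:email ?email }
    }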
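Second, the ?this bypass, written in SPIN's textual (sp:text) form. The first rule is the conventional per-instance pattern; the second shows how spin:thisUnbound can be declared on the query so that the ?this rdf:type clause is not injected. Treat this as a sketch of the idea, not as copy-and-paste syntax for your model.

    @prefix sp:   <http://spinrdf.org/sp#> .
    @prefix spin: <http://spinrdf.org/spin#> .
    @prefix ex:   <http://example.org/ns#> .

    # Conventional rule on ex:Person: the engine binds ?this to each
    # instance of ex:Person (and of every subclass) on each iteration.
    ex:Person
        spin:rule [
            a sp:Construct ;
            sp:text """
                CONSTRUCT { ?this ex:fullName ?name . }
                WHERE {
                    ?this ex:firstName ?first .
                    ?this ex:lastName ?last .
                    BIND (CONCAT(?first, " ", ?last) AS ?name)
                }""" ;
        ] .

    # Variant with spin:thisUnbound (SPIN 1.1): the rule stays attached to
    # ex:Person, but no "?this rdf:type ex:Person" clause is injected, so
    # the query runs once and selects its own instances explicitly.
    ex:Person
        spin:rule [
            a sp:Construct ;
            spin:thisUnbound true ;
            sp:text """
                CONSTRUCT { ?person ex:fullName ?name . }
                WHERE {
                    ?person a ex:Person .
                    ?person ex:firstName ?first .
                    ?person ex:lastName ?last .
                    BIND (CONCAT(?first, " ", ?last) AS ?name)
                }""" ;
        ] .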
