Hmm… I am not sure how my rules are modeled. I just use the built-in 
OWL_MEM_MICRO_RULE_INF OntModelSpec.

Anyway, my question is still this: how do I get all of those inferences 
computed *before* I start querying the Model? It’s great that I can store 
them later, but I still need to *compute* them before I can think about 
persisting anything. Running a single query doesn’t seem to compute them all, 
just the ones relevant to that specific query… I think?
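
Concretely, would something like this do it? A rough Java sketch of the Jena 
calls I have in mind (ignoring my Clojure wrapper; "data.ttl" is just a 
placeholder for the actual data):

import org.apache.jena.ontology.OntModel;
import org.apache.jena.ontology.OntModelSpec;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.riot.RDFDataMgr;

public class Materialize {
    public static void main(String[] args) {
        // Load the raw data into a plain in-memory model.
        Model base = ModelFactory.createDefaultModel();
        RDFDataMgr.read(base, "data.ttl");

        // Wrap it with the same built-in spec I use from Clojure.
        OntModel inf = ModelFactory.createOntologyModel(
                OntModelSpec.OWL_MEM_MICRO_RULE_INF, base);
        inf.prepare(); // runs the forward part, but seemingly not everything

        // Copy every statement (asserted + inferred) into a plain model,
        // which should make the reasoner answer the fully unbound pattern.
        Model materialized = ModelFactory.createDefaultModel();
        materialized.add(inf);

        System.out.println("triples after inference: " + materialized.size());
    }
}

Would that materialized.add(inf) actually force *all* of the rules to fire, or 
still only the ones needed for that one pattern?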

> On 2 Jul 2021, at 14:06, Lorenz Buehmann 
> <buehm...@informatik.uni-leipzig.de> wrote:
> 
> But can't you do this inference just once and then store those inferences 
> somewhere? Next time you can simply load the inferred model instead of the 
> raw dataset. It is not specific to TDB: you can load dataset A, compute the 
> inferred model once in a slow process, materialize it as dataset B, and later 
> on always work on dataset B - this is standard forward chaining with the data 
> written back to disk or to a database. Can you try this procedure? Maybe it 
> works for you.
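> 
> Something along these lines, an untested sketch with the TDB2 API (the 
> directory names are placeholders):
> 
> import org.apache.jena.ontology.OntModelSpec;
> import org.apache.jena.query.Dataset;
> import org.apache.jena.query.ReadWrite;
> import org.apache.jena.rdf.model.Model;
> import org.apache.jena.rdf.model.ModelFactory;
> import org.apache.jena.tdb2.TDB2Factory;
> 
> public class MaterializeToTDB {
>     public static void main(String[] args) {
>         Dataset rawDs = TDB2Factory.connectDataset("tdb-raw");       // dataset A
>         Dataset infDs = TDB2Factory.connectDataset("tdb-inferred");  // dataset B
> 
>         rawDs.begin(ReadWrite.READ);
>         infDs.begin(ReadWrite.WRITE);
>         try {
>             // Slow step, done once: wrap the raw data in the rule reasoner.
>             Model inf = ModelFactory.createOntologyModel(
>                     OntModelSpec.OWL_MEM_MICRO_RULE_INF, rawDs.getDefaultModel());
>             // Copy asserted + inferred triples into dataset B.
>             infDs.getDefaultModel().add(inf);
>             infDs.commit();
>         } finally {
>             infDs.end();
>             rawDs.end();
>         }
>         // From then on, query "tdb-inferred" directly, with no reasoner attached.
>     }
> }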
> 
> Indeed, this won't work if your rules are currently modeled as 
> backward-chaining rules, as those are always computed at query time.
> 
> 
> On 02.07.21 13:37, Simon Gray wrote:
>> Thank you, Lorenz, although this seems to be a reply to my side comment about 
>> TDB rather than to the question I had, right?
>> 
>> The main issue right now is that I would like to use inferencing to get e.g. 
>> inverse relations, but doing this is very slow the first time a query is 
>> run, likely due to some preprocessing step that needs to run first. I would 
>> like to run the preprocessing step in advance rather than running it 
>> implicitly.
>> 
>>> On 2 Jul 2021, at 13:30, Lorenz Buehmann 
>>> <buehm...@informatik.uni-leipzig.de> wrote:
>>> 
>>> You can just add the inferred model to the dataset, i.e. add all triples to 
>>> your TDB. Then you can disable the reasoner afterwards, or just omit the 
>>> rules that you no longer need.
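>>> 
>>> Roughly like this (just a sketch; assumes an OntModel `inf` built with 
>>> OWL_MEM_MICRO_RULE_INF on top of your data, and a TDB-backed Dataset `dataset`):
>>> 
>>>     dataset.begin(ReadWrite.WRITE);
>>>     dataset.getDefaultModel().add(inf); // persists asserted + inferred triples
>>>     dataset.commit();
>>>     dataset.end();
>>> 
>>> Afterwards you can query dataset.getDefaultModel() directly, without the 
>>> reasoner.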
>>> 
>>> On 02.07.21 13:13, Simon Gray wrote:
>>>> Hi there,
>>>> 
>>>> I’m using Apache Jena from Clojure to create a new home for the Danish 
>>>> WordNet. I use the Arachne Aristotle library plus some additional Java 
>>>> interop code of my own.
>>>> 
>>>> I would like to use OWL inferencing to query e.g. transitive or inverse 
>>>> relations. This does seem to work fine, although I’ve only tried using the 
>>>> supplied in-memory model for now (and it looks like I will have to create 
>>>> my own instance of a ModelMaker to integrate with TDB 1 or 2).
>>>> 
>>>> However, the first query always seems to run really, really slow. Is there 
>>>> any way to precompute inferred relations so that I don’t have to wait? 
>>>> I’ve tried calling `rebind` and `prepare`, but they don’t seem to do 
>>>> anything.
>>>> 
>>>> Kind regards,
>>>> 
>>>> Simon Gray
>>>> Research Officer
>>>> Centre for Language Technology, University of Copenhagen
>>>> 
>> 
