Hi,

owlURL is the URL of the bdrc.owl file used to create the
ontology model.

public static final String owlURL = "https://raw.githubusercontent.com/BuddhistDigitalResourceCenter/owl-schema/master/bdrc.owl";
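For reference, here is a minimal sketch (assuming Apache Jena on the classpath; the model and variable names are illustrative, not from Marc's actual test) of building an InfModel against a schema and calling prepare() to pay the upfront inference cost separately, as Andy describes below. The schema.read(owlURL) line is commented out so the sketch runs without a network fetch:

```java
import org.apache.jena.rdf.model.InfModel;
import org.apache.jena.rdf.model.Model;
import org.apache.jena.rdf.model.ModelFactory;
import org.apache.jena.reasoner.Reasoner;
import org.apache.jena.reasoner.ReasonerRegistry;

public class PrepareSketch {
    public static void main(String[] args) {
        // Schema model: in the real test this would load bdrc.owl from owlURL.
        Model schema = ModelFactory.createDefaultModel();
        // schema.read(owlURL);

        // Data model: the (small, in-memory) dataset being reasoned over.
        Model data = ModelFactory.createDefaultModel();

        // Bind the schema to an OWL reasoner and wrap the data in an InfModel.
        Reasoner reasoner = ReasonerRegistry.getOWLMicroReasoner().bindSchema(schema);
        InfModel inf = ModelFactory.createInfModel(reasoner, data);

        // prepare() forces the initial inference computation now,
        // so later queries do not pay that cost.
        long t0 = System.currentTimeMillis();
        inf.prepare();
        System.out.println("prepare took " + (System.currentTimeMillis() - t0) + "ms");
    }
}
```

Timing prepare() versus the first query should make it clear whether the 18s/60s observed in the thread is the inference computation itself or something else (e.g. fetching the ontology over the network).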
 
Marc

Le mercredi 14 mars 2018 à 20:25 +0000, Andy Seaborne a écrit :
> 
> On 14/03/18 19:12, Élie Roux wrote:
> > > In the case of inference then yes there is also an upfront cost
> > > of 
> > > computing the inferences.  Once computed these are typically
> > > cached 
> > > (though this depends on the rule set) and any changes to the
> > > data 
> > > might invalidate that cache.  You can call prepare() on the
> > > InfModel 
> > > to incur the initial computation cost separately, otherwise the 
> > > initial computation cost is incurred by whatever operation first 
> > > accesses the InfModel.  And as your email shows subsequent calls
> > > don't 
> > > incur that cost and are much faster.
> > 
> > I don't disagree, but I think there's a problem of scale here: even
> > with
> > a cold JVM and a not-too-efficient reasoner, it seems totally
> > unreasonable that a reasoner would take 60 full seconds (that's
> > what
> > Marc's test is taking on my machine) to run inference on a very
> > small
> > dataset already loaded in memory... 60s for such a small operation
> > really seems to indicate a bug to me. But maybe it doesn't...
> 
> 
> What's owlURL?
> 
> The second-time cost does not incur the forward inference rules:
> 9ms.
> 
> (Actually, if you see 60s and Marc sees 18s, something else is going
> on 
> as well)
> 
> > 
> > Thank you,
