On 10/12/10 23:14, Benson Margulies wrote:
> I want to make an InfModel that puts an inference engine on top of a
> TDB dataset so that inference happens over the set. If this isn't a
> nutty idea altogether,
Not nutty, but performance may not be great: the inference engine makes many queries against the base model, and going to disk for each of them is slow ...

> is there any way to do this without using the
> assembler?
Yes.

Model baseModel = ... yourTDBModel ...;
OntModel infModel = ModelFactory.createOntologyModel(
        OntModelSpec.OWL_MEM_MICRO_RULE_INF,    // deductions in memory
        baseModel                               // axioms in TDB
);
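For completeness, here's a minimal end-to-end sketch of the same idea, assuming the com.hp.hpl.jena package layout current at the time of this thread; the TDB directory path is hypothetical:

```java
import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.tdb.TDBFactory;

public class TdbInferenceSketch {
    public static void main(String[] args) {
        // Open (or create) the on-disk TDB store; the path is a placeholder
        Dataset dataset = TDBFactory.createDataset("/path/to/tdb-dir");
        Model baseModel = dataset.getDefaultModel();   // axioms in TDB

        // Layer the micro OWL rule reasoner on top of the TDB-backed model;
        // the base triples stay on disk, the deductions are held in memory
        OntModel infModel = ModelFactory.createOntologyModel(
                OntModelSpec.OWL_MEM_MICRO_RULE_INF,
                baseModel);

        // Query infModel as usual: inferred statements appear
        // alongside the asserted ones
        System.out.println("Statements visible: " + infModel.size());
    }
}
```

Querying infModel rather than baseModel is what makes the entailments visible; the TDB store itself is not modified.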


Ian

--
____________________________________________________________
Ian Dickinson                   Epimorphics Ltd, Bristol, UK
mailto:[email protected]        http://www.epimorphics.com
cell: +44-7786-850536              landline: +44-1275-399069
------------------------------------------------------------
Epimorphics Ltd.  is a limited company registered in England
(no. 7016688). Registered address: Court Lodge, 105 High St,
              Portishead, Bristol BS20 6PT, UK