Hi Sophie,

On 17/02/11 21:41, Sophie Yang wrote:
> I am trying to bulk load a large OWL file into TDB. The file imports a number of
> sub-ontologies using the <owl:imports> element. When I try the tdbloader
> command-line tool, it appears that the loader does not recognize owl:imports and
> does not read the imported files. That is understandable, since TDB is an RDF store.
> Another option I can think of is to build an OntModel first and then write it into
> TDB. However, my ontology is large (400M triples), although I am currently using a
> subset of 4M triples for testing. I don't think the ontology can fit into an
> in-memory model. Do you have any suggestions on how to make this work?

Use the TDB model as the base model of your OntModel (untested code, but should work):

import com.hp.hpl.jena.ontology.*;
import com.hp.hpl.jena.rdf.model.*;
import com.hp.hpl.jena.tdb.TDBFactory;

Model tdbBase = TDBFactory.createModel( "your/tdb/directory" );
OntModel ontTdb = ModelFactory.createOntologyModel( OntModelSpec.OWL_MEM, tdbBase );
ontTdb.read( "http://your.ontology/file" );
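
When read() returns, the imports closure should have been pulled into the TDB-backed
model, since the OntModel's document manager follows owl:imports by default. An
equally untested sketch of sanity-checking the result and flushing it to disk (class
and package names are from the Jena 2.x / TDB releases, so adjust to your version):

import com.hp.hpl.jena.ontology.OntClass;
import com.hp.hpl.jena.tdb.TDB;
import com.hp.hpl.jena.util.iterator.ExtendedIterator;

// count the classes now stored in TDB, as a rough check that the imports loaded
ExtendedIterator<OntClass> classes = ontTdb.listClasses();
int n = 0;
while (classes.hasNext()) { classes.next(); n++; }
System.out.println( n + " classes loaded" );

TDB.sync( tdbBase );   // make sure pending changes reach the on-disk files
tdbBase.close();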

Hth,
Ian


--
____________________________________________________________
Ian Dickinson                   Epimorphics Ltd, Bristol, UK
mailto:[email protected]        http://www.epimorphics.com
cell: +44-7786-850536              landline: +44-1275-399069
------------------------------------------------------------
Epimorphics Ltd.  is a limited company registered in England
(no. 7016688). Registered address: Court Lodge, 105 High St,
              Portishead, Bristol BS20 6PT, UK
