On 08/02/15 13:55, Jack Park wrote:
Hi Dave,

Thanks for your response.
I should have stated more clearly that the code I showed is *all* the
code that is running. That snippet:

Dataset dataset = TDBFactory.createDataset(dbPath) ;

is what is running when the system gets an OutOfMemory error. Even with a
4 GB heap, it still runs out of memory. All the code which does "begin",
"Model = ...", and so forth has been commented out.

The behavior, according to the log, is that somewhere in the createDataset
code it is reading every class in the ontology stored in the TDB database
it is, I presume, opening.

What log? If that is literally the only line of code, then there's nothing to write a log, and certainly nothing that will go round trying to read classes.

Dave

That's the current puzzle. It's almost as if there is some system property
I need to set somewhere which tells it that this is not an in-memory event.

Thoughts?

Jack

On Sun, Feb 8, 2015 at 2:06 AM, Dave Reynolds <[email protected]>
wrote:

On 07/02/15 22:18, Jack Park wrote:

I used Jena to load, on behalf of Fuseki, a collection of OWL files.
There might be 4 GB of data, all told, in there.

Now, rather than use Fuseki to access that data, I am writing code which
will use a Dataset opened against that database to create an OntModel.

I use this code, taken from a variety of sources:

Dataset dataset = TDBFactory.createDataset(dbPath) ;

where dbPath points to the directory where Jena made the database.
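For context, the OntModel-creation step (not shown in the thread) presumably looks something like the sketch below. This is only an illustration of the pattern being described, not the actual code; the `OWL_MEM` spec, the use of the default model, and the path are all assumptions. It also wraps the read in a TDB transaction, which the snippet in the thread omits.

```java
import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.query.ReadWrite;
import com.hp.hpl.jena.rdf.model.ModelFactory;
import com.hp.hpl.jena.tdb.TDBFactory;

public class OpenTdbOntModel {
    public static void main(String[] args) {
        // Assumption: dbPath is the directory Jena/Fuseki created the TDB database in.
        String dbPath = args[0];
        Dataset dataset = TDBFactory.createDataset(dbPath);
        dataset.begin(ReadWrite.READ);
        try {
            // Wrap the TDB-backed default model in an OntModel.
            // With a plain in-memory spec like OWL_MEM, no inference is run,
            // but imports processing may still fetch remote ontologies.
            OntModel ont = ModelFactory.createOntologyModel(
                    OntModelSpec.OWL_MEM, dataset.getDefaultModel());
            System.out.println("statements visible: " + ont.size());
        } finally {
            dataset.end();
        }
    }
}
```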

When I boot Fuseki against that data, it boots quickly and without any
issues.

When I run that code against the same data, it first produces a 260 MB
logfile showing all the ont classes it is reading. Then it runs out of
heap space and crashes.


Simply accessing data in a TDB Dataset won't load it all into memory, so the
problem will be in how you are creating the OntModels.

Since you don't show "that code", it is hard to know what the problem is.

It *might* be that you have dynamic imports processing switched on, and so your
OntModels are going out to the original sources and reloading them.

It is possible to do imports processing but have the imports found as
database models [1], but in your case, since you have all the ontologies in
there anyway, I would just switch off all imports processing.
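Switching off imports processing is done through the OntDocumentManager attached to the OntModelSpec. A minimal sketch, assuming an OWL_MEM spec (the helper name and the choice to use a private document manager rather than the global one are illustrative):

```java
import com.hp.hpl.jena.ontology.OntDocumentManager;
import com.hp.hpl.jena.ontology.OntModel;
import com.hp.hpl.jena.ontology.OntModelSpec;
import com.hp.hpl.jena.rdf.model.Model;
import com.hp.hpl.jena.rdf.model.ModelFactory;

public class NoImports {
    public static OntModel wrapWithoutImports(Model base) {
        // Copy the spec so we don't mutate the shared OWL_MEM instance.
        OntModelSpec spec = new OntModelSpec(OntModelSpec.OWL_MEM);
        // Use a private document manager rather than the global singleton,
        // and tell it not to follow owl:imports at all.
        OntDocumentManager dm = new OntDocumentManager();
        dm.setProcessImports(false);
        spec.setDocumentManager(dm);
        return ModelFactory.createOntologyModel(spec, base);
    }
}
```

With imports processing off, the OntModel sees only the statements already in the base model, which should be what you want when every ontology has already been loaded into the TDB database.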

Or it might be nothing to do with imports processing, but a bug in how you
are creating the OntModels. There's not enough information to tell.

Dave

[1] There used to be a somewhat old example of how to do this in the
documentation but I can't find it in the current web site.



