GitHub user afs added a comment to the discussion: How to load big dataset to 
new database

I don't have access to a hardware setup similar to yours; I was using a 
local NVMe-connected SSD. The slowdown does seem to be related to I/O load.

`tdb2.xloader` is less I/O intensive, but it is a bulk loader: it loads all 
the data in one pass into a fresh database and cannot add to an existing 
database incrementally. This loader has been used to load over 10B triples 
on a laptop (Dell XPS, local NVMe SSD, only two channels). It does make the 
laptop a bit hot!
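For reference, a typical invocation looks something like this — the paths and 
file name here are placeholders, not from this discussion; check 
`tdb2.xloader --help` for the options available in your Jena version:

```shell
# Bulk-load gzipped N-Triples into a new, empty TDB2 database directory.
# --tmpdir points the loader's intermediate sort files at fast local storage.
tdb2.xloader --loc /data/DB2 --tmpdir /data/tmp dataset.nt.gz
```

Keeping `--tmpdir` on the same fast NVMe device as `--loc` matters, since the 
loader's external sort phases are themselves I/O-heavy.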

So the other choice is a database like oxigraph, which uses RocksDB (I 
checked with the project, and it does support incremental loads). RocksDB is 
more write-oriented.
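A sketch of what that could look like, assuming the standalone oxigraph CLI 
is installed — the store path and file name are placeholders, and the exact 
flags may differ by version, so check `oxigraph load --help` first:

```shell
# Load a gzipped N-Triples file into an oxigraph store directory.
oxigraph load --location /data/oxigraph-store --file dataset.nt.gz

# Running another load against the same --location should add the new
# data to the existing store rather than requiring a rebuild.
oxigraph load --location /data/oxigraph-store --file more-data.nt.gz
```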

Sorry for the non-answer, but without hardware to recreate the problem I can 
only speculate.

GitHub link: 
https://github.com/apache/jena/discussions/3701#discussioncomment-15666553

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]

