Hi Markus!
Marco already pointed that one out, and I will look into it. Does anyone have an idea of the performance when a patch changes about 40 million triples?
I happen to have a dataset (Fennica Linked Data) with around 40 million triples. Creating an HDT + index file from it takes around 5 minutes, IIRC. The idea is that you recreate the file from scratch every time. It corresponds to one graph in an RDF dataset, and you can set up several graphs, each backed by its own HDT file.
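For reference, regenerating the HDT file from an N-Triples dump might look like this with the rdf2hdt tool from hdt-cpp. This is a sketch under the assumption that the `-i` flag (generate the side-car index) and the file names match your setup:

```shell
# Rebuild the HDT file from scratch after a data change (hypothetical paths).
# -i also generates the .hdt.index file Fuseki needs for fast triple-pattern lookups.
rdf2hdt -i fennica-dump.nt fennica.hdt
```

Swapping the new .hdt (and its index) into place and restarting or reloading the Fuseki service then replaces the graph's contents atomically from the store's point of view.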
What would the PUT operation look like? Since I need to replace data, and DELETE/INSERT did not work in a first test, I thought that creating a "temporary" graph and moving it once the upload is complete might be an option...
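The "upload to a temporary graph, then move it" idea corresponds to a SPARQL 1.1 Update MOVE operation, which can be sent with the s-update tool that ships with Fuseki. A sketch, assuming a hypothetical service at localhost:3030 with dataset `ds` and made-up graph IRIs:

```shell
# Atomically replace the target graph with the contents of the staging graph.
# MOVE drops the source graph afterwards; graph IRIs here are placeholders.
s-update --service http://localhost:3030/ds/update \
  'MOVE GRAPH <http://example.org/staging> TO GRAPH <http://example.org/data>'
```

Note that MOVE on a TDB-backed dataset is implemented as a copy-and-delete, so for 40M triples the PUT approach below may be the cheaper option.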
I think you said you used the upload button in the Fuseki UI. I believe that corresponds to a PUT operation, i.e. an atomic replacement of a single graph. You can also do this directly via the HTTP Graph Store API, for example with the s-put tool that comes with Fuseki.
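Concretely, a graph replacement via the Graph Store API might look like the following. This is a sketch assuming a service at localhost:3030 with a dataset named `ds` and an illustrative graph IRI; adjust to your endpoint:

```shell
# Replace the named graph wholesale with the contents of data.ttl
# using Fuseki's s-put helper (service data endpoint, graph IRI, file):
s-put http://localhost:3030/ds/data http://example.org/data data.ttl

# Equivalent raw HTTP request against the Graph Store Protocol endpoint:
curl -X PUT \
  -H 'Content-Type: text/turtle' \
  --data-binary @data.ttl \
  'http://localhost:3030/ds/data?graph=http://example.org/data'
```

Because PUT replaces the whole graph in one request, there is no need for a separate temporary graph and move step.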
-Osma

--
Osma Suominen
D.Sc. (Tech), Information Systems Specialist
National Library of Finland
P.O. Box 26 (Kaikukatu 4)
00014 HELSINGIN YLIOPISTO
Tel. +358 50 3199529
[email protected]
http://www.nationallibrary.fi
