Re: 10g data loading in Fuseki

2017-01-18 Thread Andy Seaborne
Reihan, how are you uploading the file? Via the UI or via the API? The UI has a limitation whereby it needs to buffer the whole incoming file, whereas a direct upload (e.g. the s-put script, or curl, wget) does not. 10G would need a lot of heap in the server, but the error would then be java.lang.OutOfMemoryError
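For reference, a direct upload outside the UI might look like the sketch below. The host, port, and dataset name (mydataset) are assumptions; adjust them to your own setup.

    # Send an N-Triples file to the default graph via the Graph Store Protocol
    curl -X POST --data-binary @data.nt \
         -H 'Content-Type: application/n-triples' \
         'http://localhost:3030/mydataset/data?default'

    # Or use the SOH s-put script that ships with Fuseki
    s-put http://localhost:3030/mydataset/data default data.nt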

Re: 10g data loading in Fuseki

2017-01-18 Thread A. Soroka
It's surely true that a 10GB file is not appropriate for direct upload. You could, if you absolutely must, split your N-Triples file into many pieces and make many SPARQL Updates with them. But Osma's suggestion is much better, especially if you are starting from an empty dataset.
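A rough sketch of that split-and-update approach, if you really have to go that way (chunk size, file names, and the dataset name mydataset are just examples):

    # Split the N-Triples file into chunks of 100k lines (one triple per line)
    split -l 100000 data.nt chunk_

    # Wrap each chunk in INSERT DATA and POST it to the update endpoint
    for f in chunk_*; do
      { echo 'INSERT DATA {'; cat "$f"; echo '}'; } | \
        curl -X POST -H 'Content-Type: application/sparql-update' \
             --data-binary @- 'http://localhost:3030/mydataset/update'
    done

Note that blank node labels are scoped per request, so blank nodes shared across chunks would end up as distinct nodes; another reason the bulk loader is the better route.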

Re: 10g data loading in Fuseki

2017-01-18 Thread Osma Suominen
Hi Reihan, you cannot upload files this big via Fuseki. Try tdbloader or tdbloader2 instead for batch-loading your triples into TDB outside Fuseki. -Osma
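For example (the database directory and file name are placeholders), with Fuseki stopped:

    # Bulk load into a TDB database directory
    tdbloader --loc=/data/tdb data.nt

    # tdbloader2 (Unix shell script) can be faster, but only for an empty database
    tdbloader2 --loc /data/tdb data.nt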

Fwd: 10g data loading in Fuseki

2017-01-18 Thread Reihaneh Amini
Dear Sir or Madam, I have a frustrating problem with Fuseki. I have a 10 GB .nt file that I want to upload into the Fuseki server as a TDB store, not in-memory. After starting the server, when I try to upload it in one go I get a SessionTimesOut error. How can I address this?

10g data loading in Fuseki

2017-01-18 Thread Reihaneh Amini
Dear Sir or Madam, I have a frustrating problem with Fuseki. I have a 10 GB .nt file that I want to upload into the Fuseki server as a TDB store, not in-memory. After starting the server, when I try to upload it in one go I get a SessionTimesOut error. How can I address this?
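For context, a minimal sketch of serving a persistent TDB database rather than an in-memory dataset (the directory DB and the dataset name /ds are examples):

    # Persistent TDB-backed dataset served at /ds, stored under ./DB
    fuseki-server --loc=DB /ds

    # By contrast, --mem creates a non-persistent in-memory dataset
    fuseki-server --mem /ds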