On 05/01/16 09:50, Rob Vesse wrote:
What version of Jena is this?

Trying to parse large updates into memory always risks hitting memory
issues, though the fact that you get a StackOverflowError seems a little
odd.  Can you share the update via a Gist/Pastebin/etc?

Depending on how you want to evaluate updates, ARQ does support processing
updates in a pure streaming fashion; this is how Fuseki can accept
arbitrarily large updates.  To do this you need to use one of the variants
of UpdateAction.parseExecute() - please try that and see if it resolves
the issue.

Note that this only works if you are updating data exposed via the
DatasetGraph/Graph/Model interfaces, so it will not be of any use if you
are trying to update a remote store via HTTP.
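
For example, a minimal sketch of that route, assuming a local in-memory
Dataset (Jena2 packages com.hp.hpl.jena.* as in your stack trace; on Jena3
it is org.apache.jena.*; the example data is made up):

import com.hp.hpl.jena.query.Dataset;
import com.hp.hpl.jena.query.DatasetFactory;
import com.hp.hpl.jena.update.UpdateAction;

public class StreamingUpdateSketch {
    public static void main(String[] args) {
        // Hypothetical local dataset; substitute whatever
        // Dataset/DatasetGraph/Model you are actually updating.
        Dataset dataset = DatasetFactory.createMem();

        String update =
            "INSERT DATA { <http://example/s> <http://example/p> <http://example/o> }";

        // Parse and execute against the dataset directly, instead of
        // building the whole UpdateRequest with UpdateFactory.create() first.
        UpdateAction.parseExecute(update, dataset);
    }
}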

Rob

Presumably this stacktrace is not from inside Fuseki.


Try parsing with "Syntax.syntaxARQ".
Or globally set "Syntax.defaultUpdateSyntax" to that value.
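
For example, a sketch of both options (Jena2 package names as in your
stack trace; on Jena3 it is org.apache.jena.*):

import com.hp.hpl.jena.query.Syntax;
import com.hp.hpl.jena.update.UpdateFactory;
import com.hp.hpl.jena.update.UpdateRequest;

public class ArqSyntaxSketch {
    public static UpdateRequest parseBigUpdate(String updateString) {
        // Option 1: ask for the ARQ syntax on this one parse.
        UpdateRequest request = UpdateFactory.create(updateString, Syntax.syntaxARQ);

        // Option 2: switch the default update syntax globally, then parse as usual.
        // Syntax.defaultUpdateSyntax = Syntax.syntaxARQ;
        // UpdateRequest request = UpdateFactory.create(updateString);

        return request;
    }
}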


The "ARQ" language is a superset of SPARQL and it also includes grammar improvements for some SPARQL forms including DATA.

When using strict SPARQL, the parser also follows the grammar strictly as it is written in the spec. The spec grammar is simple to parse (it's an LL(1) grammar) but it is recursive on TriplesTemplate.

When using the ARQ form, the recursive points are rewritten as iteration - same Abstract Syntax Tree output, different way to get there, and it uses a local lookahead of 2. [*]

I tested (on Jena3; you have some kind of Jena2) by parsing 25K triples in INSERT DATA: it works with ARQ where it does not work with SPARQL. IIRC that grammar rewrite is quite old, so it may be in your version too.

Fuseki accepts "ARQ", so a StackOverflowError shouldn't happen there for this kind of update.

        Andy

[*]

void TriplesTemplate(TripleCollector acc) : { }
{    // same as ConstructTriples
#if SPARQL_11
    // Version for the spec.
    TriplesSameSubject(acc)
    (<DOT> (TriplesTemplate(acc))?)?
#endif
#ifdef ARQ
    // Rewrite for no recursion - grammar is not LL(1)
    TriplesSameSubject(acc)
    (LOOKAHEAD(2) (<DOT>) TriplesSameSubject(acc))*
    (<DOT>)?
#endif
}



On 04/01/2016 19:12, "Zen 98052" <[email protected]> wrote:

Hi,

I have a big INSERT DATA query, which has about 20K triples.

I passed the query string to UpdateFactory.create(), and it threw an exception.


at com.hp.hpl.jena.sparql.lang.ParserSPARQL11Update._parse(ParserSPARQL11Update.java:80)
at com.hp.hpl.jena.sparql.lang.ParserSPARQL11Update.parse$(ParserSPARQL11Update.java:41)
at com.hp.hpl.jena.sparql.lang.UpdateParser.parse(UpdateParser.java:39)
at com.hp.hpl.jena.update.UpdateFactory.make(UpdateFactory.java:88)
at com.hp.hpl.jena.update.UpdateFactory.create(UpdateFactory.java:79)
at com.hp.hpl.jena.update.UpdateFactory.create(UpdateFactory.java:57)
at com.hp.hpl.jena.update.UpdateFactory.create(UpdateFactory.java:47)
...
Caused by: java.lang.StackOverflowError
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11TokenManager.jjMoveNfa_0(SPARQLParser11TokenManager.java:2216)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11TokenManager.jjMoveStringLiteralDfa2_0(SPARQLParser11TokenManager.java:421)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11TokenManager.jjMoveStringLiteralDfa1_0(SPARQLParser11TokenManager.java:341)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11TokenManager.jjMoveStringLiteralDfa0_0(SPARQLParser11TokenManager.java:151)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11TokenManager.getNextToken(SPARQLParser11TokenManager.java:3753)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11.jj_ntk(SPARQLParser11.java:5026)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11.Verb(SPARQLParser11.java:2535)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11.PropertyListNotEmpty(SPARQLParser11.java:2503)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11.TriplesSameSubject(SPARQLParser11.java:2469)
at com.hp.hpl.jena.sparql.lang.sparql_11.SPARQLParser11.TriplesTemplate(SPARQLParser11.java:1619)

Is there a workaround for this, besides breaking down the query (I tried
with 5K triples and it works fine)?


Thanks,

Z