Turns out, JSON-XS-3.01 supports incremental
parsing<http://search.cpan.org/~mlehmann/JSON-XS-3.01/XS.pm#INCREMENTAL_PARSING>.
I don't know if it's a big deal, but it looks worth re-implementing.
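For reference, the incremental interface buffers input until a complete JSON value is available. This can be sketched with core JSON::PP, which mirrors the JSON::XS incremental API, so no XS module is needed; the chunk boundaries below are arbitrary:

```perl
use strict;
use warnings;
use JSON::PP;    # core module; mirrors the JSON::XS incremental API

my $parser = JSON::PP->new;

# Feed the document in arbitrary chunks; the parser buffers input
# until a complete JSON value has been seen.
$parser->incr_parse('{"numbers": [1, 2,');
$parser->incr_parse(' 3], "done": true}');

# With a complete value buffered, incr_parse() with no arguments
# returns the decoded Perl data structure.
my $data = $parser->incr_parse;
print scalar @{ $data->{numbers} }, "\n";    # prints 3
```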




On Sun, Jan 19, 2014 at 10:17 AM, Ruslan Shvedov
<[email protected]> wrote:

> On Sun, Jan 19, 2014 at 8:34 AM, Jeffrey Kegler <
> [email protected]> wrote:
>
>>  Yes, I think it's a great exercise for Marpa, and it's one I've run
>> often.  I'd be glad to see someone else running it.
>>
>
>
>> And, yes, running against JSON::PP is something I often do.  JSON::XS is
>> beautifully written C code dedicated to parsing that one grammar -- Marpa
>> is never going to beat that.
>>
> Well, most probably yes, but I included it to have a supertask -- it really
> energizes. :) Also, I have high hopes for THIF in this department.
>
>
>> For that matter, JSON::PP is also beautifully written, highly optimized,
>> highly efficient code, but it's pure Perl, and something that in theory
>> Marpa could beat.  I've often benchmarked against it.
>>
> Yes, this is the most likely prey.
>
>> The JSON parser you're using looks like it was taken from my experimental
>> directory, and I've no idea what I thought of it.  It might have been left
>> the way it is because it was as good as I could get it.  Or I might have
>> tried a totally counter-productive innovation and just given up, leaving
>> what I considered a disastrous experiment as it was.  One thing for
>> certain, though.  In my experiments I always try to optimize for long
>> inputs.
>>
>> Anyway, I'll be very glad to hear about the results of your experiments,
>> if you choose to do it. -- jeffrey
>>
> Sure, the results so far (GNU time, run until the results are stable):
>
> ./t/01-json_JSON-XS.t
> 0:00.04 elapsed, 0.06 user, 0.01 system, 165% CPU, 454656 max-mem
> footprint in KB
>
> ./t/02-json_JSON-PP.t
> 0:00.07 elapsed, 0.06 user, 0.03 system, 117% CPU, 507136 max-mem
> footprint in KB
>
> ./t/03-sl_json_no_trace_grammar_reused.t
> 0:00.21 elapsed, 0.17 user, 0.07 system, 113% CPU, 860928 max-mem
> footprint in KB
>
> ./t/04-sl_json_asf.t
> 0:00.26 elapsed, 0.28 user, 0.01 system, 111% CPU, 859904 max-mem
> footprint in KB
>
> The files are on
> GitHub<https://github.com/rns/MarpaX-Languages-JSON-Benchmark>.
> Humble beginnings, really, but that's a start. :)
>
> All of them run tests from Marpa's 
> sl_json.t<https://github.com/jeffreykegler/Marpa--R2/blob/master/cpan/t/sl_json.t>
>  except
> the tracing test. sl_json_no_trace_grammar_reused.t also reuses the
> grammar. 04-sl_json_asf.t uses ASF traverser semantics.
>
> So, unless I messed something up in a hurry, in this benchmark,
>
> 1. JSON-XS is not much faster than JSON::PP.
> 2. Marpa's best is only 3 times slower than JSON-PP.
>
> Not bad for a start, not bad at all. And great progress (some dozen times,
> IIRC) vs. the beginnings.
>
> It'd be interesting to see how Marpa will fare against the JSON-PP|XS test
> suite and against increments of a big file. If the Perl overhead grows
> linearly, then so should the parser's run time, am I correct? If it doesn't,
> we'll see. :)
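A minimal probe of that scaling question (a sketch using core JSON::PP and Time::HiRes rather than the actual benchmark files; the input sizes are arbitrary) could look like:

```perl
use strict;
use warnings;
use JSON::PP;
use Time::HiRes qw(gettimeofday tv_interval);

# Decode arrays of doubling size and report wall-clock time.
# If the overhead is linear, time should roughly double with input size.
for my $n ( 10_000, 20_000, 40_000 ) {
    my $json    = '[' . join( ',', 1 .. $n ) . ']';
    my $t0      = [gettimeofday];
    my $decoded = JSON::PP->new->decode($json);
    printf "%6d elements: %.4fs\n", scalar @$decoded, tv_interval($t0);
}
```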
>
> I'm planning to try Marpa-based parsers in MarpaX::Languages::JSON::Benchmark
> and release the current best as MarpaX::Languages::JSON::AST.
>
>
>>  On 01/18/2014 09:56 AM, Ruslan Shvedov wrote:
>>
>>  On Sat, Jan 18, 2014 at 6:42 PM, Jeffrey Kegler <
>> [email protected]> wrote:
>>
>>>  This is from MarpaX::Demo::JSONParser, right?  That gives you a choice
>>> of two grammars, and if you chose mine, it was written for benchmarking
>>> with long input strings, so I wasn't optimizing for repeated parses of
>>> short strings.   There's also Peter Stuifzand's grammar, and Peter's
>>> intentions were probably more practical.
>>>
>> This is a 'fork' of cpan/t/sl_json.t so the grammar must be yours.
>>
>>  MarpaX::Demo::JSONParser is now Ron Savage's module, so if you're
>>> interested in changes to it, you should talk to Ron.
>>>
>> Sure, but my thinking was more about MarpaX::Languages::JSON::Benchmark --
>> a benchmark for Marpa valuator options (ASF::traverse(), AST with the newly
>> implemented action => *[lhs, value(s)]* and traverse(), bless_package,
>> semantic_package, and a THIF-based parser, based on an existing grammar,
>> that I was thinking of writing) vs. other parsing techniques. JSON looked
>> like a good target, with JSON::XS and JSON::PP as baselines.
>>
>>  First I planned to make the Marpa-based parsers pass the JSON::XS|PP test
>> suite and use some big file (like
>> this<https://github.com/zeMirco/sf-city-lots-json>)
>> to test how execution time/memory grows with size.
>>
>>  Looks like a good exercise for Marpa?
>>
>>
>>>
>>> --jeffrey
>>>
>>> On 01/18/2014 03:00 AM, Ruslan Shvedov wrote:
>>>
>>> Is it intended that the JSON grammar is not reused? The script would run 2
>>> times faster if it were.
>>>
>>>
>>> On Thu, Jan 16, 2014 at 8:53 PM, Jeffrey Kegler <
>>> [email protected]> wrote:
>>>
>>>>  $naif_recce->value() is being called to parse the JSON grammar, as
>>>> opposed to the JSON itself.  Note that with short strings, as in your
>>>> examples, start-up costs play a greater role. -- jeffrey
>>>>
>>>>   On 01/16/2014 08:26 AM, rns wrote:
>>>>
>>>> Devel::NYTProf says that both scripts spend much time in
>>>> Marpa::R2::Recognizer::value — is it expected with ASF's?
>>>>
>>>>  sl_json_asf
>>>>
>>>>   Calls  P  F  Exclusive Time  Inclusive Time  Subroutine
>>>>      13  1  1          203ms           272ms  Marpa::R2::Recognizer::value
>>>>
>>>> sl_json_no_trace
>>>>
>>>>   Calls  P  F  Exclusive Time  Inclusive Time  Subroutine
>>>>      26  1  1          231ms           312ms  Marpa::R2::Recognizer::value
>>>>    1152  1  1          105ms           152ms  Marpa::R2::Internal::Grammar::add_user_rule
>>>>
>>>>  On Thursday, January 16, 2014 5:55:55 PM UTC+2, rns wrote:
>>>>>
>>>>>  I once thought that traversing ASF's could be a faster way of doing
>>>>> semantics, because the ASF's *value()* is not called (actions are not
>>>>> applied) and the grammar can be purely syntactic, so I decided to try it
>>>>> with *sl_json.t* from the Marpa test suite (latest dev version, cygwin,
>>>>> 5.14.2).
>>>>>
>>>>>  Changes made to original sl_json.t from the test suite
>>>>>
>>>>> removed tracing test and code
>>>>>
>>>>>   Writing semantics via ASF goes like this: first I used the default
>>>>> action to produce results as a Perl data structure, dumped it with YAML
>>>>> to see what was there, and then added handlers via if/else as needed. AST
>>>>> semantics is easier because many handlers can be left to the default.
>>>>>
>>>>>  Changes made to the grammar after converting semantics to ASF
>>>>>
>>>>> removed all action adverbs
>>>>>
>>>>>  removed events and changed read()/resume() loop to just read()
>>>>> removed round brackets around literals
>>>>> removed bless_package
>>>>>
>>>>>
>>>>>  The net result is that
>>>>> sl_json_asf.t<https://gist.github.com/rns/8456567> is a bit slower than
>>>>> sl_json_no_trace.t<https://gist.github.com/rns/8456626> when
>>>>> measured with *time*, and equal within 2-3% when quickly and
>>>>> dirtily Benchmark-ed [1]. So it now looks like the Perl overhead over
>>>>> libmarpa is pretty much constant across all ways of doing semantics in
>>>>> Marpa::R2.
>>>>>
>>>>>  Is this as expected, or did I miss something?
>>>>>
>>>>>  I'm thinking about cheaper ASF traversing, like the low-level
>>>>> synopsis code in Marpa::R2::Glade. Does that make sense?
>>>>>
>>>>>  The good news, though, is that ASF semantics is not much slower, if a
>>>>> bit harder to write.
>>>>>
>>>>>  [1]
>>>>> use Benchmark qw{ cmpthese };
>>>>> cmpthese(500, {
>>>>>     'json_asf'      => q{ eval `cat sl_json_asf.t` },
>>>>>     'json_no_trace' => q{ eval `cat sl_json_no_trace.t` },
>>>>> });
>>>>>
>>>>>
>>>>>
>>>>>     --
>>>> You received this message because you are subscribed to the Google
>>>> Groups "marpa parser" group.
>>>> To unsubscribe from this group and stop receiving emails from it, send
>>>> an email to [email protected].
>>>> For more options, visit https://groups.google.com/groups/opt_out.
>>>>
>>>
>>>
>>
>>
>
>
