JSON is not that hard to parse incrementally. The i-json parser is
implemented in C++ with a fallback JS implementation. The C++
implementation is fewer than 1,000 lines of code and the JS implementation
fewer than 400. The C++ implementation is 1.65 times slower than
JSON.parse but, unlike JSON.parse, it parses incrementally…
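For illustration, a minimal sketch of what incremental use looks like,
assuming an i-json-style createParser/update/result API (names taken from
the i-json README; the `stream` source is a placeholder):

    var ijson = require('i-json');

    var parser = ijson.createParser();
    stream.on('data', function (chunk) {
      parser.update(chunk);          // feed input as it arrives
    });
    stream.on('end', function () {
      var value = parser.result();   // complete value once input ends
      console.log(value);
    });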
On Tue, Aug 4, 2015 at 9:53 AM, Mark Miller wrote:
> +1 for line delimited JSON. It would be good to switch all users of
> json-seq over to it and to deprecate json-seq. Perhaps an RFC would help.
>
> On Mon, Aug 3, 2015 at 11:53 PM, Bruno Jouhier wrote:
>
>> RFC 7464 has a different format (0x1E at beginning of every record) and
>> a different media type…
+1 for line delimited JSON. It would be good to switch all users of
json-seq over to it and to deprecate json-seq. Perhaps an RFC would help.
On Mon, Aug 3, 2015 at 11:53 PM, Bruno Jouhier wrote:
> RFC 7464 has a different format (0x1E at beginning of every record) and a
> different media type…
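For context, RFC 7464 frames every record as RS (0x1E) + JSON text + LF,
under the media type application/json-seq. A minimal reader sketch (the
`onValue` callback is a placeholder; call feed once more at end-of-stream
to flush any pending record):

    var RS = '\x1e';
    var pending = '';

    function feed(chunk, onValue) {
      pending += chunk;
      var parts = pending.split(RS);
      pending = parts.pop();         // text after the last RS may be incomplete
      parts.forEach(function (text) {
        if (text.trim()) onValue(JSON.parse(text));
      });
    }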
> …useful to process a normal JSON object in a streaming fashion. That
> seems like a harder problem, indeed necessitating a SAX-like API.
>
> -----Original Message-----
> From: es-discuss [mailto:es-discuss-boun...@mozilla.org] On Behalf Of
> Brendan Eich
> Sent: Sunday, August…
Hi Domenic,
We have a spec for that: RFC 7464
Regards, Carsten
> > Brendan Eich
> > Sent: Sunday, August 2, 2015 21:26
> > To: Bruno Jouhier
> > Cc: es-discuss
> > Subject: Re: Please help with writing spec for async JSON APIs
> >
> > Exactly! Incremental and async, i.e., streaming.
> >
> > XML quickly needed such APIs…
…useful to process a normal JSON object in a streaming fashion. That seems
like a harder problem, indeed necessitating a SAX-like API.
> -----Original Message-----
> From: es-discuss [mailto:es-discuss-boun...@mozilla.org] On Behalf Of
> Brendan Eich
> Sent: Sunday, August 2, 2015 21:26
> To: Bruno Jouhier
> Cc: es-discuss
> Subject: Re: Please help with writing spec for async JSON APIs
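For a sense of what such a SAX-like surface might look like (every name
here is illustrative, not from any actual proposal):

    var parser = createStreamingParser();   // hypothetical factory
    parser.on('startObject', function () { /* ... */ });
    parser.on('key', function (name) { /* one event per key */ });
    parser.on('value', function (v) { /* primitive values */ });
    parser.on('endObject', function () { /* ... */ });
    parser.write(chunk);   // feed input incrementally; events fire per token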
On Aug 3, 2015, at 12:30 PM, Bruno Jouhier wrote:
> Reviver is a bit of a killer feature for async parsing because it imposes a
> callback on every key. It makes it difficult to efficiently offload parsing
> to a worker thread. Without it, feed entries could be parsed and materialized
> safely (provided GC allows it) in a separate thread and then emitted…
Reviver is a bit of a killer feature for async parsing because it imposes a
callback on every key. It makes it difficult to efficiently offload parsing
to a worker thread. Without it, feed entries could be parsed and
materialized safely (provided GC allows it) in a separate thread and then
emitted…
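The per-key cost is easy to see with today's API; the reviver below runs
once for every key, innermost values first, with the empty-string key last
for the root:

    JSON.parse('{"a": 1, "b": {"c": 2}}', function (key, value) {
      console.log(key);   // logs "a", "c", "b", then "" for the root
      return value;
    });

Because the callback can observe and replace every intermediate value, the
parse cannot simply be shipped to another thread and the finished object
posted back.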
On Mon, Aug 3, 2015 at 10:29 AM, Allen Wirfs-Brock
wrote:
[snip]
>
> I have to guess at your semantics, but what you are trying to express above
> seems like something that can already be accomplished using the `reviver`
> argument to JSON.parse.
>
Yes and no. `reviver` achieves part of the goal but…
On Aug 3, 2015, at 9:02 AM, James M Snell wrote:
> On Mon, Aug 3, 2015 at 8:34 AM, Allen Wirfs-Brock
> wrote:
> [snip]
>>
>> 4) JSON.parse/stringify are pure computational operations. There is no
>> perf benefit to making them asynchronous unless some of their computation
>> can be performed concurrently.
On Mon, Aug 3, 2015 at 8:34 AM, Allen Wirfs-Brock wrote:
[snip]
>
> 4) JSON.parse/stringify are pure computational operations. There is no
> perf benefit to making them asynchronous unless some of their computation
> can be performed concurrently.
>
If we're speaking strictly about making the JSON…
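A sketch of the usual workaround, moving the parse to a worker (the file
name, `consume`, and `hugeJsonString` are placeholders); note the result
still has to be structured-cloned back to the main thread, which is where
the "no perf benefit without concurrency" caveat bites:

    // worker.js: onmessage = function (e) { postMessage(JSON.parse(e.data)); };
    var worker = new Worker('worker.js');
    worker.onmessage = function (e) { consume(e.data); };  // cloned result
    worker.postMessage(hugeJsonString);                    // parse off-thread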
On 8/3/15 11:56 AM, Allen Wirfs-Brock wrote:
> sure, but that's a user interactiveness benefit, not a "perf benefit".
OK, fair. I just wanted it to be clear that there is a benefit to
incremental/asynchronous behavior here apart from raw throughput.
-Boris
On Aug 3, 2015, at 8:45 AM, Boris Zbarsky wrote:
> On 8/3/15 11:34 AM, Allen Wirfs-Brock wrote:
>> 4) JSON.parse/stringify are pure computational operations. There is no
>> perf benefit to making them asynchronous unless some of their
>> computation can be performed concurrently.
>
> Or even just incrementally, right?
On 8/3/15 11:34 AM, Allen Wirfs-Brock wrote:
> 4) JSON.parse/stringify are pure computational operations. There is no
> perf benefit to making them asynchronous unless some of their
> computation can be performed concurrently.
Or even just incrementally, right?
In practice, 500 chunks of 5ms of processing…
So, to summarize some things that have been said or are implicit in this
thread and related discussions:
1) New JSON APIs could be added to JS. We don’t have to be limited to
JSON.parse/stringify
2) We don’t have to be restricted to the JSON.stringify/parse mapping of JS
objects from/to JSON…
The SAX approach is not ideal for JSON because we don't want the overhead
of a callback on every key (especially if parsing and callbacks are handled
by different threads).
To be efficient we need a hybrid approach, with an evented API (SAX-like)
for top-level keys and direct mapping to JS objects for the entries…
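A hypothetical shape for that hybrid (every name invented for
illustration): events at the top level, whole materialized objects for
each entry:

    var feed = createFeedParser();                  // hypothetical factory
    feed.on('header', function (h) { start(h); });  // small top-level values
    feed.on('entry', function (obj) {               // one materialized entry,
      process(obj);                                 // no per-key callbacks
    });
    feed.on('trailer', function (t) { finish(t); });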
Personally I just use small JSON records delimited by newlines in my
'streaming' applications. Best of both worlds IMO.
On Monday, 3 August 2015, Brendan Eich wrote:
> Exactly! Incremental and async, i.e., streaming.
>
> XML quickly needed such APIs
> (https://en.wikipedia.org/wiki/Simple_API_for_XML,
> https://en.wikipedia.org/wiki/StAX). JSON's in the same boat.
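Line-delimited JSON needs no new parser API at all, which is much of its
appeal; a complete reader is a few lines (the stream source and
`handleRecord` are placeholders; assumes a text stream, e.g. after
stream.setEncoding('utf8')):

    var pending = '';
    stream.on('data', function (chunk) {
      var lines = (pending + chunk).split('\n');
      pending = lines.pop();          // the last piece may be incomplete
      lines.forEach(function (line) {
        if (line) handleRecord(JSON.parse(line));
      });
    });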
Exactly! Incremental and async, i.e., streaming.
XML quickly needed such APIs
(https://en.wikipedia.org/wiki/Simple_API_for_XML,
https://en.wikipedia.org/wiki/StAX). JSON's in the same boat.
/be
Bruno Jouhier wrote:
> A common use case is large JSON feeds: header + lots of entries + trailer
A common use case is large JSON feeds: header + lots of entries + trailer
When processing such feeds, you should not bring the whole JSON in memory
all at once. Instead you should process the feed incrementally.
So, IMO, an alternate API should not be just asynchronous, it should also
be incremental.
Synchronous JSON parsing can block a Node.js application. See the
following test case - the Chromium native parser can handle up to 44MB per
second on my hardware: http://jsperf.com/json-parse-vs-1mb-json (BTW - I'm
quite impressed by the V8 garbage collector). It's enough…
If we're speaking normatively, some of us don't see the point of using
unstructured object serialization for web communication at all (it's not
simply a binary-versus-text thing; I once implemented JSON's object model
in a binary format, and it had the same speed as the native JSON parser).
That said…
I agree that we should probably look for a more general solution. What we
have at the moment is woefully inadequate, though (i.e. WebWorkers on the
client and separate processes in node.js).
What we need is some way of doing multi-threading baked into the language,
but that could take a long time to…
I confess I don't see the point of this proposal at all, at least with respect
to being specifically about JSON.
JSON parsing/stringification is pure computation; it's not like I/O where you
need something special inside the language runtime's implementation in order to
exploit the asynchrony.
You probably don’t want to support reviver/replacer in the async methods as
they would be very challenging to make performant.
JSON parsing is such a slow process that it motivated me to re-invent
Google Protobufs (in a nice, JS-friendly way, see
https://github.com/joeedh/STRUCT/wiki/Intro-and-Examples ). I never use
JSON in production code for this reason. An async api isn't a bad idea.
Joe
On Fri, Jul 31, 2015 at 8:3…
Hi Moshen,
The semantics of your proposal are straightforward, so I don't think you
need to provide spec text at this point. Instead, what would be helpful is
a quantitative analysis showing why these additional methods are needed.
Is there any way you can demonstrate the benefit with numbers?