JSON parsing in D has come a long way, especially when you
look at it from the efficiency angle, as measured by a popular
benchmark that has been forked by well-known D contributors
like Martin Nowak and Sönke Ludwig.

The test is pretty simple: parse a JSON object containing an
array of 1_000_000 3D coordinates in the range [0..1) and
average them.
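For illustration, here is a minimal sketch of that task using Phobos' std.json. The field names ("coordinates", "x", "y", "z") follow the layout commonly used in that benchmark's data file; the inline two-element sample merely stands in for the real 1_000_000-element input.

```d
import std.json, std.stdio;

void main()
{
    // Tiny inline sample standing in for the generated benchmark file.
    auto text = `{"coordinates":[
        {"x":0.1,"y":0.2,"z":0.3},
        {"x":0.5,"y":0.4,"z":0.9}]}`;

    // parseJSON builds a JSONValue tree; index it like a map/array.
    auto coords = parseJSON(text)["coordinates"].array;

    double x = 0, y = 0, z = 0;
    foreach (c; coords)
    {
        x += c["x"].floating;
        y += c["y"].floating;
        z += c["z"].floating;
    }

    auto n = coords.length;
    writefln("%s %s %s", x / n, y / n, z / n);
}
```

Building the full JSONValue tree for a million objects is exactly what the DOM-style parsers below spend their time (and memory) on.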

The performance of std.json in parsing those was still
horrible in the DMD 2.066 days*:

DMD     : 41.44s,  934.9Mb
GDC     : 29.64s,  929.7Mb
Python  : 12.30s, 1410.2Mb
Ruby    : 13.80s, 2101.2Mb

Then with 2.067, std.json got a major 3x speed improvement
and rivaled the popular dynamic languages Ruby and Python:

DMD     : 13.02s, 1324.2Mb

In the meantime, several other D JSON libraries appeared,
with varying focus on performance or API:

Medea         : 56.75s, 1753.6Mb  (GDC)
libdjson      : 24.47s, 1060.7Mb  (GDC)
stdx.data.json:  2.76s,  207.1Mb  (LDC)

Yep, that's right. stdx.data.json's pull parser finally beats
the dynamic languages with native efficiency. (I used the
default options here, which provide you with an Exception and
line number on errors.)

A few days ago I decided to get some practical use out of my
pet project 'fast' by implementing a JSON parser myself, one
that could rival even RapidJSON, the fastest JSON parser at
that point. The result can be seen in the current benchmark
results:


fast:      0.34s, 226.7Mb (GDC)
RapidJSON: 0.79s, 687.1Mb (GCC)

(* Timings from my computer, Haswell CPU, Linux amd64.)
