Hi,

I have recently had to deal with large amounts of JSON data in D. While doing that I've found that std.json is remarkably slow in comparison to other languages' standard JSON implementations. I've created a small and simple benchmark: it parses a local copy of the GitHub API response from "https://api.github.com/repos/D-Programming-Language/dmd/pulls" 100 times and writes the titles to stdout.
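For reference, the shape of the benchmark is roughly the following (a minimal Python sketch; the function and parameter names are illustrative, not the actual test.py from the repository):

```python
import json

def bench(raw, iterations=100):
    # Re-parse the same JSON text on every iteration, mirroring the
    # D/Haskell/Python implementations in the benchmark repository.
    titles = []
    for _ in range(iterations):
        pulls = json.loads(raw)                     # full parse each time
        titles = [pull["title"] for pull in pulls]  # extract PR titles
    return titles
```

The driver simply reads the saved API response from disk, calls this on its contents, and prints the resulting titles, so nearly all of the measured time is spent in the parser itself.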

My results are as follows:

./d-test > /dev/null      3.54s user 0.02s system 99% cpu 3.560 total
./hs-test > /dev/null     0.02s user 0.00s system 93% cpu 0.023 total
python test.py > /dev/null 0.77s user 0.02s system 99% cpu 0.792 total

The concrete implementations (sorry for my terrible haskell implementation) can be found here:

   https://github.com/dsp/D-Json-Tests/

This compares D's std.json against Haskell's Data.Aeson and Python's standard-library json module. I am a bit concerned about the current state of our JSON parser, given that a lot of applications these days use JSON. I personally consider a high-speed JSON implementation a critical part of a standard library.

Would it make sense to start thinking about using ujson4c as an external library, or to come up with a better implementation? I know Orvid has something and might add some analysis as to why std.json is slow. Any ideas or pointers on how to get started with that?
