On 10/13/17 4:27 PM, Andrew Edwards wrote:
On Friday, 13 October 2017 at 19:17:54 UTC, Steven Schveighoffer wrote:
On 10/13/17 2:47 PM, Andrew Edwards wrote:
A bit of advice, please. I'm trying to parse a gzipped JSON file retrieved from the internet. The following naive implementation accomplishes the task:

    import requests : getContent;       // dlang-requests
    import iopipe.textpipe, iopipe.zip; // runEncoded, byLineRange, unzip
    import std.json : parseJSON;

    auto url = "http://api.syosetu.com/novelapi/api/?out=json&lim=500&gzip=5";
    getContent(url)
        .data
        .unzip
        .runEncoded!((input) {
            ubyte[] content;
            foreach (line; input.byLineRange!true) {
                content ~= cast(ubyte[])line;
            }
            auto json = (cast(string)content).parseJSON;
            // ... use json ...
        });

input is an iopipe of char, wchar, or dchar. There is no need to cast it around.

In this particular case, all three types (char[], wchar[], and dchar[]) are being returned at different points in the loop. I don't know of any other way to generate a unified buffer than casting it to ubyte[].

This has to be a misunderstanding. The point of runEncoded is to figure out the correct type (based on the BOM), and run your lambda function with the correct type for the whole thing.
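
To illustrate the dispatch (a sketch only; the static assert is just there to show that the element type is fixed once per run):

    .runEncoded!((input) {
        import std.traits : Unqual;
        // The lambda is instantiated once, with the element type the
        // BOM implies; the type never changes mid-stream.
        alias Elem = Unqual!(typeof(input.window[0]));
        static assert(is(Elem == char) || is(Elem == wchar) || is(Elem == dchar));
    });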

Actually, I'm not sure this is even needed, as the data could be coming through without a BOM. Without a BOM, it assumes UTF-8.

Note also that getContent returns a complete body, but unzip may not decompress it all in one go. Either way, there definitely isn't a reason to create your own buffer here.

This should work (something like this really should be in iopipe):

    while (input.extend(0) != 0) {} // get data until EOF
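
Dropped into the earlier lambda, that loop makes the whole body something like this (a sketch; parseJSON as in the original snippet, reading from input.window directly with no intermediate buffer):

    .runEncoded!((input) {
        while (input.extend(0) != 0) {} // get data until EOF
        auto json = input.window.parseJSON; // the window now holds the entire text
    });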

This!!! This is what I was looking for. Thank you. I incorrectly assumed that if I didn't process the content of input.window, it would be overwritten on each .extend(), so my implementation was:

ubyte[] json;
while (input.extend(0) != 0) {
    json ~= cast(ubyte[]) input.window;
}

This didn't work because it invalidated the Unicode data, so I ended up splitting by line instead.

Sure enough, this is trivial once one knows how to use it correctly, but I think it would be better to put this in the library as extendAll().
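
Something like this, perhaps (a hypothetical sketch; extendAll is only the proposed name, not an existing iopipe symbol):

    /// Proposed helper: keep extending until EOF so the whole
    /// input ends up in the window.
    void extendAll(Pipe)(ref Pipe pipe)
    {
        while (pipe.extend(0) != 0) {}
    }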

ensureElems(size_t.max) should be equivalent, though I see you responded cryptically with something about JSON there :)
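
That is, something like the following (a sketch; ensureElems keeps extending until the window holds at least the requested number of elements or the input is exhausted):

    input.ensureElems(size_t.max); // size_t.max can never be satisfied, so this reads to EOF
    auto json = input.window.parseJSON;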

I will try to reproduce your error and see if I can figure out why it happens.

-Steve
