According to the std.utf documentation,

decode will only work with strings and random access ranges of code units with length and slicing, whereas decodeFront will work with any input range of code units.
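
For reference, here is the sort of usage the documentation seems to describe for the string case, sketched out by me (the literal and variable names are just an arbitrary example, and I may be misreading the docs):

import std.utf : decode, decodeFront;

void main() {
    // "\u00E9" (é) encodes to two UTF-8 code units
    string s = "\u00E9xyz";

    // decode: wants random access, length and slicing; advances an index
    size_t i = 0;
    dchar a = s.decode(i);
    assert(a == '\u00E9' && i == 2);

    // decodeFront: pops the first code point's code units off the range
    string t = "\u00E9xyz";
    size_t consumed;
    dchar b = t.decodeFront(consumed);
    assert(b == '\u00E9' && consumed == 2 && t == "xyz");
}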

However, I can't seem to get the input range usage to compile: the following code

import std.range;
import std.utf;

void somefn(InputRange!(ubyte) r) {
    r.decodeFront!(No.useReplacementDchar, dchar)();
}

gives a compilation error:

onlineapp.d(5): Error: template std.utf.decodeFront cannot deduce function from argument types !(cast(Flag)false, dchar)(InputRange!ubyte), candidates are:
/dlang/dmd/linux/bin64/../../src/phobos/std/utf.d(1176): std.utf.decodeFront(Flag useReplacementDchar = No.useReplacementDchar, S)(ref S str, out size_t numCodeUnits) if (!isSomeString!S && isInputRange!S && isSomeChar!(ElementType!S))
/dlang/dmd/linux/bin64/../../src/phobos/std/utf.d(1214): std.utf.decodeFront(Flag useReplacementDchar = No.useReplacementDchar, S)(ref S str, out size_t numCodeUnits) if (isSomeString!S)
/dlang/dmd/linux/bin64/../../src/phobos/std/utf.d(1243): std.utf.decodeFront(Flag useReplacementDchar = No.useReplacementDchar, S)(ref S str) if (isInputRange!S && isSomeChar!(ElementType!S))

I'm not sure what's wrong here. Also, the error message seems to indicate that decodeFront isn't expecting to work on a generic InputRange interface, but rather on a subset that comes from an actual string. Have I read that wrong? I get the same error if I leave out the useReplacementDchar flag, i.e. `r.decodeFront!(dchar)()`.
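
Here is how I'm reading those constraints, as a quick check (this is just my own sketch, not anything from the docs):

import std.range : ElementType, InputRange, isInputRange;
import std.traits : isSomeChar;

// InputRange!ubyte is an input range, but its element type is ubyte,
// which isSomeChar rejects -- so the last candidate's constraint
// presumably doesn't match either.
static assert(isInputRange!(InputRange!ubyte));
static assert(is(ElementType!(InputRange!ubyte) == ubyte));
static assert(!isSomeChar!ubyte);

So maybe it's the ubyte element type that's the problem rather than the kind of range, but I'm not sure.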
