[Issue 18532] Hex literals produce invalid strings
https://issues.dlang.org/show_bug.cgi?id=18532

Iain Buclaw changed:
  Priority: P1 → P4
--
Basile-z changed:
  CC: removed b2.t...@gmx.com
--
--- Comment #5 from anonymous4 ---
https://forum.dlang.org/post/nh2o9i$hr0$1...@digitalmars.com - I suppose the discussion was there.
--
--- Comment #4 from anonymous4 ---
I'd say the spec just specifies encodings for strings, meaning a string can't be in some other encoding like EBCDIC or cp1252. There was a debate over whether invalid UTF violates the type system, and an idea that invalid UTF could produce an exception, a replacement character, or be ignored.
--
--- Comment #3 from FeepingCreature ---
It has to, because it returns string, and string is defined to be UTF-8. If it wants to return something that is not UTF-8, it should return ubyte[], and you should have to cast it to string explicitly.
--
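A minimal D sketch of the point above: the lone byte 0xC3 is a UTF-8 lead byte with no continuation byte, so a `string` built from it via `hexString` holds invalid UTF-8 and fails `std.utf.validate`.

```d
import std.conv : hexString;
import std.utf : validate, UTFException;

void main()
{
    // 0xC3 alone is not a complete UTF-8 sequence, yet hexString
    // happily returns it typed as `string`.
    string s = hexString!"c3";

    try
    {
        validate(s); // std.utf.validate throws on invalid UTF-8
        assert(false, "unexpectedly valid UTF-8");
    }
    catch (UTFException e)
    {
        // expected: the literal produced an invalid string
    }
}
```

This is the type-system concern in the comment: the value is typed `string` but does not satisfy the invariant that `string` is valid UTF-8.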
Basile B. changed:
  CC: added b2.t...@gmx.com

--- Comment #2 from Basile B. ---
It doesn't have to. hexString isn't even designed to represent string literals; it can just as well be a memory dump that is cast to ubyte[].
--
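A sketch of that reading, treating the `hexString` result as raw bytes rather than text:

```d
import std.conv : hexString;

void main()
{
    // Interpret the hex literal as a memory dump: cast the returned
    // string to an immutable ubyte slice instead of using it as text.
    immutable(ubyte)[] dump = cast(immutable(ubyte)[]) hexString!"deadbeef";

    assert(dump.length == 4);
    assert(dump[0] == 0xDE && dump[3] == 0xEF);
}
```

Under this view the cast is the user's explicit acknowledgment that the data is binary, which is exactly the step comment #3 argues should be required in the other direction (ubyte[] to string).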
--- Comment #1 from FeepingCreature ---
Update: std.conv.hexString does not validate its return value either.
--