std.string.assumeUTF() silently casting mutable to immutable?
I may have found a bug in `assumeUTF()`, but being new to D, I'm not sure.

The description:

> Assume the given array of integers `arr` is a well-formed UTF string and return it typed as a UTF string. `ubyte` becomes `char`, `ushort` becomes `wchar` and `uint` becomes `dchar`. Type qualifiers are preserved.

The declaration:

```d
auto assumeUTF(T)(T[] arr)
if (staticIndexOf!(immutable T, immutable ubyte, immutable ushort, immutable uint) != -1)
```

Shouldn't that constraint's `immutable T` be simply `T`? As it stands, I can do this with no complaints from the compiler...

```d
string test(ubyte[] arr)
{
    import std.string;
    return arr.assumeUTF;
}
```

...and accidentally end up with a "string" pointing at mutable data. Am I missing something?
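To see why the constraint behaves this way, here is a minimal sketch that replays the constraint's test in isolation (the `accepts` helper is mine, introduced just for illustration). Because `immutable T` *adds* `immutable` rather than requiring it, a mutable element type passes the check:

```d
import std.meta : staticIndexOf;

// Hypothetical helper replicating the constraint exactly as declared:
enum accepts(T) = staticIndexOf!(immutable T,
        immutable ubyte, immutable ushort, immutable uint) != -1;

// Slapping `immutable` onto a mutable T makes it match the list,
// so mutable element types are accepted too:
static assert(accepts!ubyte);             // ubyte[] gets through
static assert(accepts!(immutable ubyte)); // immutable(ubyte)[] too

// With plain `T` in the constraint, only the immutable variants match:
static assert(staticIndexOf!(ubyte,
        immutable ubyte, immutable ushort, immutable uint) == -1);

void main() {}
```

So the constraint as written admits `ubyte[]`, `ushort[]`, and `uint[]` alongside their qualified variants, which is consistent with the compiler accepting the `test` function above.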
Re: std.uni CodepointSet toString
On Friday, 9 February 2024 at 08:04:28 UTC, Danilo wrote:

> Incredible! Seems like D is experiencing featuritis. Priorities may be wrong. Instead of bug fixing and stabilization, people concentrate on getting new stuff like `:blub` into the language.

If you look at the work actually being done in GitHub PRs, the vast majority of it is bug fixes and stabilization:

https://github.com/dlang/dmd/pulls?q=is%3Apr+is%3Amerged
https://github.com/dlang/phobos/pulls?q=is%3Apr+is%3Amerged

However, new features are more exciting to talk about, so they get more attention on the forums.
Re: LDC Stacktrace with symbols instead of addresses
On Monday, 12 February 2024 at 16:14:27 UTC, Per Nordlöw wrote:

> Doing the same thing with LDC via
>
> ```sh
> ldc2 -g --d-debug -run app
> ```
>
> gives
>
> ```
> ld: error: undefined symbol: _D3etc5linux11memoryerror26registerMemoryErrorHandlerFNbZb
> referenced by app.d:3
> /tmp/objtmp-ldc-dec7a7/app.o:(D main)
> collect2: error: ld returned 1 exit status
> Error: /usr/bin/cc failed with status: 1
> ```

LDC does not support etc.linux.memoryerror, due to issues with it. See: https://github.com/ldc-developers/ldc/issues/1915

-Johan
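One way to keep a single source file building with both compilers is to guard the handler registration behind a `version` block. This is a sketch under the assumption that the `DigitalMars` predefined version identifier is the right discriminator here (LDC sets `LDC` instead); whether `registerMemoryErrorHandler` links on a given druntime build still depends on the setup:

```d
// Hypothetical sketch: only register the druntime-specific SIGSEGV
// handler where it is known to exist, so ldc2 builds don't fail to link.
void installMemoryErrorHandlerIfAvailable()
{
    version (DigitalMars)
    {
        import etc.linux.memoryerror : registerMemoryErrorHandler;
        registerMemoryErrorHandler();
    }
    // Under LDC this is a no-op; a null dereference is a plain SIGSEGV.
}

void main()
{
    installMemoryErrorHandlerIfAvailable();
    // ... rest of the program ...
}
```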
Re: length's type.
On Mon, Feb 12, 2024 at 07:34:36PM +, bachmeier via Digitalmars-d-learn wrote:
> On Monday, 12 February 2024 at 18:22:46 UTC, H. S. Teoh wrote:
> > Honestly, I think this issue is blown completely out of proportion.
>
> Only for people that don't have to deal with the problems it causes.

I've run into size_t vs int issues many times. About half the time it exposed fallacious assumptions on my part about value types. The other half of the time a simple cast or std.conv.to invocation solved the problem.

My guess is that the most common uses of .length in typical D code are (1) passing it to code that expects a length for various reasons, and (2) loop conditions that avoid overrunning a buffer or overshooting some range. (1) is a non-problem, and 90% of (2) is solved by using constructs like foreach() and/or ranges instead of overly clever arithmetic involving length, which is almost always wrong or unnecessary. If you need to do subtraction with lengths, that's a big red flag that you're approaching your problem from the wrong POV. About the only time you need to do arithmetic with lengths is in low-level code like allocators or array copying, for which you really should be using higher-level constructs instead.

> > D decided on an unsigned type. You just learn that and adapt your
> > code accordingly, end of story. Issues like these can always be
> > argued both ways, and the amount of energy spent in these debates
> > far outweigh the trivial workarounds in code, of which there are
> > many (use std.conv.to for bounds checks, just outright cast it if
> > you know what you're doing (or just foolhardy), use CheckedInt,
> > etc.).
>
> A terrible language is one that makes you expend your energy thinking
> about workarounds rather than solving your problems. The default
> should be code that works. The workarounds should be for cases where
> you want to do something extremely unusual like subtracting from an
> unsigned type and having it wrap around.
Yes, if I had my way, implicit conversions to/from unsigned types would be a compile error. As would comparisons between signed/unsigned values.

But regardless, IMNSHO any programmer worth his wages ought to learn what an unsigned type is and how it works. A person should not be writing code if he can't even be bothered to learn how the machine he's programming actually works. To quote Knuth:

> People who are more than casually interested in computers should have at least some idea of what the underlying hardware is like. Otherwise the programs they write will be pretty weird. -- D. Knuth

One of the reasons Walter settled on size_t being unsigned is that this reflects how the hardware actually works. Computer arithmetic is NOT high-school arithmetic; you do not have infinite width nor infinite precision, and you're working with binary, not decimal. This has consequences, and having the language pretend the distinction doesn't exist does not solve any problems. If an architectural astronaut works at such a high level of abstraction that he doesn't even understand basic things about the hardware, like how uint or ulong work and how to use them correctly, maybe he should be promoted to a managerial role instead of writing code.

T

--
You are only young once, but you can stay immature indefinitely. -- azephrahel
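For concreteness, here is a minimal sketch of the two failure modes this subthread keeps circling: unsigned subtraction wrapping around, and mixed signed/unsigned comparison converting the signed operand:

```d
void main()
{
    int[] arr = [];

    // .length is size_t (unsigned); 0 - 1 wraps to the maximum value:
    assert(arr.length - 1 == size_t.max);

    // So the "obvious" bound is wrong for an empty array:
    // for (size_t i = 0; i < arr.length - 1; ++i) { ... } // ~2^64 iterations

    // Mixed comparison: -1 is converted to size_t.max, so it compares
    // as *greater* than any real length:
    int i = -1;
    assert(!(i < arr.length));

    // foreach sidesteps this whole class of bugs:
    foreach (x; arr)
        assert(false); // never reached on an empty array
}
```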
Re: length's type.
On Monday, 12 February 2024 at 18:22:46 UTC, H. S. Teoh wrote:

> Honestly, I think this issue is blown completely out of proportion.

Only for people that don't have to deal with the problems it causes.

> D decided on an unsigned type. You just learn that and adapt your code accordingly, end of story. Issues like these can always be argued both ways, and the amount of energy spent in these debates far outweigh the trivial workarounds in code, of which there are many (use std.conv.to for bounds checks, just outright cast it if you know what you're doing (or just foolhardy), use CheckedInt, etc.).

A terrible language is one that makes you expend your energy thinking about workarounds rather than solving your problems. The default should be code that works. The workarounds should be for cases where you want to do something extremely unusual, like subtracting from an unsigned type and having it wrap around.
Re: length's type.
On Monday, 12 February 2024 at 17:26:25 UTC, Nick Treleaven wrote:

> On Friday, 9 February 2024 at 15:19:32 UTC, bachmeier wrote:
> > It's been discussed many, many times. The behavior is not going to change - there won't even be a compiler warning. (You'll have to check with the leadership for their reasons.)
>
> Was (part of) the reason because it would disrupt existing code? If that was the blocker then editions are the solution.

I don't want to write a speculative answer on Walter's reasoning, but I know that (a) this has come up many times, and (b) I've never seen him express an opinion that anything in the language related to unsigned types is problematic. I can't imagine that he has any intention of changing it, given the number of times it's been raised, but I can't claim any special knowledge of his views.
Re: length's type.
On Mon, Feb 12, 2024 at 05:26:25PM +, Nick Treleaven via Digitalmars-d-learn wrote:
> On Friday, 9 February 2024 at 15:19:32 UTC, bachmeier wrote:
> > It's been discussed many, many times. The behavior is not going to
> > change - there won't even be a compiler warning. (You'll have to
> > check with the leadership for their reasons.)
>
> Was (part of) the reason because it would disrupt existing code? If
> that was the blocker then editions are the solution.

Honestly, I think this issue is blown completely out of proportion. The length of stuff in any language needs to be some type. D decided on an unsigned type. You just learn that and adapt your code accordingly, end of story. Issues like these can always be argued both ways, and the amount of energy spent in these debates far outweighs the trivial workarounds in code, of which there are many (use std.conv.to for bounds checks, just outright cast it if you know what you're doing (or are just foolhardy), use CheckedInt, etc.).

And the cost of any change to the type now also far, far outweighs any meager benefits it may have brought. It's just not worth it, IMNSHO.

T

--
Verbing weirds language. -- Calvin (& Hobbes)
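The workarounds mentioned here can be sketched briefly. `std.conv.to` does a range check at conversion time and throws instead of silently wrapping, while a raw cast skips the check (Phobos also ships `std.checkedint` for arithmetic that traps overflow, not shown here):

```d
import std.conv : to, ConvOverflowException;
import std.exception : assertThrown;

void main()
{
    size_t len = 5;

    // std.conv.to bounds-checks the conversion:
    int n = len.to!int;
    assert(n == 5);

    // A value that doesn't fit throws rather than wrapping:
    size_t huge = size_t.max;
    assertThrown!ConvOverflowException(huge.to!int);

    // A raw cast skips the check entirely (only when you know the range):
    assert(cast(int) len == 5);
}
```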
Re: length's type.
On Friday, 9 February 2024 at 15:19:32 UTC, bachmeier wrote:

> It's been discussed many, many times. The behavior is not going to change - there won't even be a compiler warning. (You'll have to check with the leadership for their reasons.)

Was (part of) the reason because it would disrupt existing code? If that was the blocker then editions are the solution.
Re: LDC Stacktrace with symbols instead of addresses
I agree, debug builds should show a proper stack trace by default.

You should submit a PR for dmd and call whatever that function is behind a `debug` block when it hooks the C main function.

As for LDC, it's weird that it doesn't work; they should share the same runtime, no?
Re: LDC Stacktrace with symbols instead of addresses
On Sunday, 11 February 2024 at 06:43:19 UTC, Per Nordlöw wrote:

> How do I make LDC stacktraces like
>
> ```
> test-library(+0x1fb232)[0x562230d82232]
> ```

So it turns out that ldc2 doesn't show symbols in stack traces by default. IMHO, in debug mode D should adhere to what other languages do, meaning a sane behavior like what

```d
int main(string[] args)
{
    import etc.linux.memoryerror : registerMemoryErrorHandler;
    registerMemoryErrorHandler();
    int* x = null;
    *x = 42;
    return 0;
}
```

does when compiled and run via

```sh
dmd -g -debug -run app
```

which gives

```
etc.linux.memoryerror.NullPointerError@src/etc/linux/memoryerror.d(322)
??:? void etc.linux.memoryerror.sigsegvUserspaceProcess(void*) [0x55fd3461e4f6]
??:? void etc.linux.memoryerror.sigsegvDataHandler() [0x55fd3461e42e]
./app.d:4 _Dmain [0x55fd345e53e6]
```

Doing the same thing with LDC via

```sh
ldc2 -g --d-debug -run app
```

gives

```
ld: error: undefined symbol: _D3etc5linux11memoryerror26registerMemoryErrorHandlerFNbZb
referenced by app.d:3
/tmp/objtmp-ldc-dec7a7/app.o:(D main)
collect2: error: ld returned 1 exit status
Error: /usr/bin/cc failed with status: 1
```