On Wednesday, 20 December 2023 at 18:16:00 UTC, Renato wrote:
I wanted to write a small program using betterC that needs
128-bit integers.
This simple program works without -betterC:
```d
module dasc;

import std.int128;
import core.stdc.stdio;

extern (C):
int main()
{
    auto n = Int128(128, 128);
    // data.hi and data.lo are both ulong, so use %llu for portability
    printf("n = hi:%llu lo:%llu\n", n.data.hi, n.data.lo);
    return 42;
}
```
But with -betterC:
```
dmd -L-ld_classic -betterC -run dasc.d
Undefined symbols for architecture x86_64:
"__D3std6int1286Int1286__ctorMFNaNbNcNiNfllZSQBpQBoQBk",
referenced from:
_main in dasc.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to
see invocation)
Error: linker exited with status 1
Compilation exited abnormally with code 1 at Wed Dec 20 19:11:56
```
I can't find any documentation saying whether `std.int128` is
supposed to work with -betterC. Is it simply not supported?
Would there be any reason not to use `core.int128` instead? I
used it to write a TigerBeetle client (TigerBeetle is a database
written in Zig) in D with betterC:
https://github.com/batiati/tigerbeetle-clients-benchmarks/blob/f86216834bd04e1e06bede2a2e31b64df0dc98f1/d/modules/tb_client.d#L12
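For reference, here is a minimal sketch of the same program built on `core.int128`, which lives in druntime rather than Phobos, so nothing should be left unresolved under -betterC. It assumes druntime's `Cent` struct (two `ulong` halves, `lo` declared before `hi` on little-endian targets) and the free function `add` from `core.int128`; check your druntime version for the exact API:

```d
module dasc;

import core.int128;     // druntime: Cent struct + free functions (add, mul, ...)
import core.stdc.stdio;

extern (C):
int main()
{
    // On little-endian targets Cent is { ulong lo; ulong hi; },
    // so a struct literal fills lo first.
    Cent n = Cent(128, 128);   // lo = 128, hi = 128
    Cent m = add(n, n);        // 128-bit addition via druntime

    printf("n = hi:%llu lo:%llu\n", n.hi, n.lo);
    printf("m = hi:%llu lo:%llu\n", m.hi, m.lo);
    return 0;
}
```

This should compile with `dmd -betterC -run dasc.d`, since no Phobos symbols are referenced.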