On Monday, 4 January 2016 at 14:16:54 UTC, Marc Schütz wrote:
On Monday, 4 January 2016 at 13:49:03 UTC, Martin Tschierschke wrote:
When I was writing a small speed test (D versus Ruby) calculating
the first n prime numbers, I realized that for small n,
Ruby can be faster than compiling and running the D version.
But for n = 1,000,000, D outperforms Ruby by roughly 10x.
Looking at the size of my prime executable, it was around 800 kB
with DMD, and even with optimization ("gdc -Os") it was over 1 MB.
Why does such a short program result in such a big binary?
That's probably because these compilers statically link the
runtime (and standard?) libraries by default. Compiling your
program with `ldc2 -O3`, I get a binary of 28K, and stripping
gets it down to 17K.
Ok, I will try ldc2, too.
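
For context, here is a minimal sketch of the kind of benchmark meant above, assuming plain trial division; it is not the original test code, and the names (`primes.d`, `isPrime`) are just for illustration:

```d
// primes.d - minimal sketch (not the original test code): find the
// first n primes by trial division and report the last one found.
import std.stdio : writeln;

/// Returns true if x is prime (simple trial division).
bool isPrime(ulong x)
{
    if (x < 2) return false;
    for (ulong d = 2; d * d <= x; ++d)
        if (x % d == 0) return false;
    return true;
}

void main()
{
    enum ulong n = 1_000_000;   // how many primes to find
    ulong count = 0, candidate = 1, last = 0;
    while (count < n)
    {
        ++candidate;
        if (isPrime(candidate))
        {
            ++count;
            last = candidate;
        }
    }
    writeln("prime #", n, " is ", last);
}
```

Compiling something like this with `dmd primes.d` versus `ldc2 -O3 primes.d`, and then running `strip` on the executables, should show the size difference discussed above.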