On Monday, 20 February 2017 at 17:36:32 UTC, H. S. Teoh wrote:
> On Mon, Feb 20, 2017 at 03:00:05PM +0000, timmyjose via
> Digitalmars-d-learn wrote: [...]
>> Just one question about the compilers though - I read on the
>> Wiki that there are three main compiler distros - dmd, ldc,
>> and gdc. I code primarily on a mac, and I have installed both
>> dmd and ldc. A lot of the flags appear to be similar, and for
>> my small programs, compilation and execution speed appeared to
>> be almost identical. However, the book suggested using dmd for
>> dev and probably ldc/gdc for releases. Is this really followed
>> that much in practice, or should I prefer dmd?
> Personally, I use dmd git HEAD for 90% of my D projects,
> because (1) I'm such a sucker for the latest and greatest
> features, bugfixes, language changes, etc., and (2) I
> occasionally contribute to the compiler toolchain (mainly
> Phobos, sometimes druntime or dmd itself) and it's much easier
> to debug something I use on a regular basis and not have to
> switch to a different version or waste time chasing down a
> compiler bug that's already been fixed in git HEAD.

Thank you, that's great to hear! I have installed both dmd and
ldc on my mac box and am experimenting with both as well :-)

> However, when I need performant code, I pull up my trusty,
> rusty old gdc (which, unfortunately, tends to be about a
> version or two behind the main dmd release -- I believe Iain is
> working on improving this, though). In spite of Walter being a
> renowned compiler genius, he simply has too many things on his
> plate and working on the optimizer hasn't been a high priority,
> so gdc's optimizer easily beats dmd's (sometimes by a long
> stretch). Don't get me wrong; for your average desktop
> application, dmd output is more than good enough. It only
> really matters when you're dealing with CPU-intensive,
> performance-critical things like maintaining framerate in a
> complex game engine, or real-time software where somebody dies
> if the program fails to respond within a 10ms margin, or when
> you're trying to solve a PSPACE-complete exponential problem
> where a 20ms difference in inner loop performance could mean
> the difference between getting a result next week vs. next year
> (or next millennium).

That makes a whole lot of sense.
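To make the dev-vs-release split concrete, here is a small sketch. The flag names are the commonly documented ones for each compiler (dmd's -O/-release/-inline, ldc2's -O3/-release, gdc's -O3/-frelease); app.d and build.sh are placeholders, so adjust to your project:

```shell
# Write a tiny helper script that picks the build mode explicitly:
# a fast unoptimized dmd build for iteration, and optimized builds
# via ldc2 (LLVM backend) or gdc (GCC backend) for releases.
cat > build.sh <<'EOF'
#!/bin/sh
case "${1:-dev}" in
  dev)     dmd app.d ;;                      # fast compiles, good for iteration
  release) ldc2 -O3 -release app.d ;;        # optimized codegen via LLVM
  gdc)     gdc -O3 -frelease -o app app.d ;; # optimized codegen via GCC
esac
EOF
chmod +x build.sh
grep release build.sh
```

Splitting the modes out like this mirrors the workflow the book suggests: dmd while developing, ldc/gdc when you ship.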

> But if you're a stickler for high-performance code, gdc's
> optimizer far outstrips dmd's in just about every way that I
> can think of -- more aggressive inlining, better loop
> optimization, better codegen in general. And reputedly ldc has
> comparable performance gains over dmd as well, so that's
> another option. The only downside is that gdc releases are
> tied to the gcc release cycle, so it tends to be about a
> version or two behind mainline dmd, and ldc is about a version
> behind AFAIK. But as far as the basics of D are concerned,
> that shouldn't make a big difference, unless you're unlucky
> enough to be bitten by a compiler bug that has no workaround
> and that's only fixed in the latest dmd release. Thankfully,
> though, compiler bugs of that sort have been quite rare (and
> getting rarer with recent releases).
>> One more thing I noticed when I looked into the executable
>> file (using "nm -gU" on my mac) is that I found two
>> interesting symbols - _main and _Dmain. In Rust, for
>> instance, the main function got turned into _main, so I
>> couldn't use a main in the C code that I was trying to interop
>> with from my Rust code. In this case, does the same
>> restriction apply (I am still way too far from dabbling in
>> interop in D as yet! :-))? I mean, suppose I write some sample
>> code in C, and I have a local main function to test the code
>> out locally, will I have to comment that out when invoking
>> that library from D, or can I keep that as is?
> _Dmain is the entry point of your D program, and is only
> emitted if you have a main() function in your D code. In that
> case, you'll want the druntime version of _main (which does a
> bunch of setup necessary before _Dmain is called).

Ah, I see. Now I understand why those two symbols are there!

> But if you're calling D from C code, i.e., the C code defines
> main(), then you wouldn't also write a main() in D code
> (obviously -- I hope), though you *would* need to call a
> druntime hook to initialize some D runtime things needed before
> you call any D code. (Sorry, can't remember the exact calls off
> the top of my head, but it's a simple matter of calling an init
> routine or two at startup, before invoking any D code, then
> calling the cleanup routine at the end before the program
> exits. Pretty standard stuff.)
>
> T

Got it! So you're saying that if I want to call D code from C,
I need to take care of some initialisation of the D runtime
before I can call the D library's code. That makes sense indeed.