On Wednesday, 7 December 2016 at 11:48:32 UTC, bachmeier wrote:
> I write D code all the time for my research. I want to write
> correct code quickly. My time is too valuable to spend weeks
> writing code to cut the running time by a few minutes. That
> might be fun for some people, but it doesn't pay the bills.
> It's close enough to optimized C performance out of the box.
> But ultimately I need a tool that provides fast code, has
> libraries to do what I want, and allows me to write a correct
> program with a limited budget.
>
> This is, of course, not universal, but zero overhead is not
> important for most of the numerical code that is written.
I understand and honestly agree with these points. They are also
the reason why I may try D for my own code (D is much better than
C++ when it comes to templates, metaprogramming syntax, embedded
unit tests, and so on).
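As a small illustration of what I mean (a minimal sketch, not from the original post): a generic function with a `unittest` block right next to it, something that in C++ would need a separate test framework.

```d
// Sketch: a generic sum using D's template syntax, with an
// embedded unit test (run with `dmd -unittest`).
T sum(T)(T[] values)
{
    T total = 0;
    foreach (v; values)
        total += v;
    return total;
}

unittest
{
    // The compiler instantiates sum!int and sum!double here.
    assert(sum([1, 2, 3]) == 6);
    assert(sum([1.5, 2.5]) == 4.0);
}
```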
However, I think that to popularize D and attract people to it, it
is very important to have a mechanism/feature that lets you get
close to the "zero overhead" situation.
If you have two competing libraries (even in different languages),
people will adopt the fastest one. As an example, look at the BLAS
libraries: people do not read the code to see how nice it is, they
just look at benchmarks and take the fastest implementation for
their architecture. IMHO that is why D must leave the door open,
for those who want it (library developers, for instance), to code
down to the metal: the goal is to be visible in benchmarks and to
attract users. At least that is my point of view.
-- Vincent