On Tuesday, 26 January 2016 at 22:33:32 UTC, tsbockman wrote:
> 1) The prospect of getting something into the standard library
> is a huge motivator for (at least some) potential contributors.
I am not sure that is the right motivation. It sounds like a
recipe for bloat. Good libraries evolve from being used in real
applications. Many applications.
> characteristics for basic infrastructure. People shouldn't have
> to rewrite their entire stack every 6 months just because
> someone thought of a better API for some low-level component.
Then don't use libraries from unreliable teams.
> Making it through D's formal review process typically raises
> code quality quite a lot, and the knowledge that backwards
> compatibility is a high priority makes outsiders much more
> likely to invest in actually using a library module.
Code quality is one thing, but if a module has not been used in
many applications, how can you know whether the abstraction is
the right one?
There is nothing wrong with having a set of recommended
libraries, e.g. a DSP library with FFT. But having things like
FFT in the standard library is just crap. Even Apple does not do
that; they have a separate library called Accelerate for such
things. There is no way you can have the same interface for FFT
across platforms: the structure of the data is different, the
accuracy is different, all for max performance.
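To make the data-structure point concrete: Accelerate's vDSP FFT routines work on split-complex data (separate real and imaginary arrays), while libraries such as FFTW use interleaved complex buffers. A "portable" FFT interface would have to paper over this with conversions on every call. A minimal sketch of those two layouts (plain Python lists, no real FFT library involved):

```python
# The same 4-point complex signal in two common FFT memory layouts.
# Interleaved (FFTW-style):    [re0, im0, re1, im1, ...]
# Split-complex (vDSP-style):  separate real[] and imag[] arrays.

def interleaved_to_split(buf):
    """Split an interleaved complex buffer into (real, imag) arrays."""
    return buf[0::2], buf[1::2]

def split_to_interleaved(re, im):
    """Merge split-complex arrays back into one interleaved buffer."""
    out = []
    for r, i in zip(re, im):
        out.extend([r, i])
    return out

interleaved = [1.0, 0.0, 0.0, 1.0, -1.0, 0.0, 0.0, -1.0]
re, im = interleaved_to_split(interleaved)
assert re == [1.0, 0.0, -1.0, 0.0]
assert im == [0.0, 1.0, 0.0, -1.0]
assert split_to_interleaved(re, im) == interleaved
```

A wrapper that hides this behind one signature pays for the copy in exactly the place where people chose the native library for speed.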
In general, the standard library should contain just the most
basic things; even file system support is tricky for a system
level programming language. For instance, on some cloud
platforms you don't get to read/write parts of a file. You do it
as one big blob.
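To illustrate why that breaks a POSIX-shaped file API: object stores typically expose whole-object get/put, so anything that looks like a seek-and-write has to be emulated by reading, patching, and re-uploading the entire object. A toy sketch (the `BlobStore` class is hypothetical, standing in for any whole-object storage service):

```python
class BlobStore:
    """Toy whole-object store: no seek, no partial write."""

    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        # Replaces the entire object; there is no in-place update.
        self._objects[key] = bytes(data)

    def get(self, key):
        # Returns the entire object; there is no ranged read here.
        return self._objects[key]

def patch_blob(store, key, offset, patch):
    """Emulate a partial write: read-modify-write the whole blob."""
    data = bytearray(store.get(key))
    data[offset:offset + len(patch)] = patch
    store.put(key, data)  # the whole object travels over the wire again

store = BlobStore()
store.put("log.txt", b"hello world")
patch_blob(store, "log.txt", 6, b"there")
assert store.get("log.txt") == b"hello there"
```

A standard library that promises cheap partial writes everywhere is quietly promising this read-modify-write cycle on such platforms, with very different cost characteristics than a local file.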