On Thursday, 11 May 2017 at 15:53:40 UTC, Jonathan M Davis wrote:
On Monday, May 08, 2017 23:15:12 H. S. Teoh via Digitalmars-d wrote:
Recently I've had the dubious privilege of being part of a department wide push on the part of my employer to audit our codebases (mostly C, with a smattering of C++ and other code, all dealing with various levels of network services and running on hardware expected to be "enterprise" quality and "secure") and fix security problems and other such bugs, with the help of some static analysis tools. I have to say that even given my general skepticism about the quality of so-called "enterprise" code, I was rather shaken not only to find lots of confirmation of my gut feeling that there are major issues in our codebase, but even more by just HOW MANY of them there are.

In a way, it's amazing how successful folks can be with software that's quite buggy. A _lot_ of software works just "well enough" that it gets the job done but is actually pretty terrible. And I've had coworkers argue to me before that writing correct software really doesn't matter - it just has to work well enough to get the job done. And sadly, to a great extent, that's true.

However, writing software that works just "well enough" does come at a cost, and if security is a real concern (as it increasingly is), then that sort of attitude is not going to cut it. But since the cost often comes later, I don't think that it's at all clear that we're going to really see a shift towards languages that prevent such bugs. Up front costs tend to have a powerful impact on decision making - especially when the cost that could come later is theoretical rather than guaranteed.

Now, given that D is also a very _productive_ language to write in, it stands to reduce up front costs as well, and that combined with its ability to reduce the theoretical security costs, we could have a real win, but with how entrenched C and C++ are and how much many companies are geared towards not caring about security or software quality so long as the software seems to get the job done, I think that it's going to be a _major_ uphill battle for a language like D to really gain mainstream use on anywhere near the level that languages like C and C++ have. But for those who are willing to use a language that makes it harder to write code with memory safety issues, there's a competitive advantage to be gained.

- Jonathan M Davis

D wasn't ready for mainstream adoption until quite recently, I think. The Phobos documentation when I started looking at D in 2014 was perfectly clear if you were theoretically minded, but not for other people. In a previous incarnation I tried to get one trader who writes Python to look at D, and he was terrified of it because of the docs. I also used to hit compiler crashes regularly, and ldc was always too far behind dmd. If you went looking for commercial users, there didn't seem to be many, and it was hard to point to successful D projects that people would have heard of or could recognise - at least not enough of them. Perception has threshold effects and isn't linear. There wasn't much on the numerical front either. The D Foundation didn't exist, and Andrei played superhero in his spare time.

All that's changed now, in every respect. I can point to the documentation and say we should have docs like that, with runnable tests/examples. Most code builds fine with ldc, there are plenty of numerical libraries - thanks, Ilya - and perception of commercial successes is quite different. Remember that what's really just incremental in reality can be a step change in perception.

I don't think the upfront costs of adopting D are tiny, though. Putting aside the fact that people expect better IDE support than we have, releases are quite frequent (not a bad thing, but it's where we are in maturity), with some of them a bit unfinished and others breaking things for good reasons, and build systems are not that great even for middling projects (200k sloc). Dub is an amazing accomplishment for Sonke as one of many part-time projects, but it's not yet mature as a build tool. We have extern(C++), which is great, and no other language has it. But that's not the same as saying it's trivial to use a C++ library from D (and I don't think it's yet mature bug-wise). There are no STL bindings yet. Even for C, compare the steps involved with LuaJIT's FFI. DStep is a great tool, but not without some friction, and it only works for C.
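To make the extern(C++) point concrete - a minimal sketch, assuming a hypothetical C++ class `Counter` with exactly these signatures is compiled and linked in separately (the names here are illustrative, not from any real library) - the D side might look like this:

```d
// D side of a hypothetical C++ interop. Assumes a separately compiled
// C++ translation unit defines Counter and makeCounter with matching
// signatures and mangling; nothing here is from a real library.
extern(C++)
{
    class Counter
    {
        // Non-final methods of an extern(C++) class map to C++
        // virtual functions, so the C++ side must declare them virtual.
        void increment();
        int value();
    }

    // D cannot `new` a C++ class directly, so the usual pattern is a
    // factory function implemented on the C++ side.
    Counter makeCounter();
}

void main()
{
    auto c = makeCounter();
    c.increment();
    assert(c.value() == 1);
}
```

Even this small sketch shows the friction being described: you still have to mirror the declarations by hand (or via a generator), match virtual-ness and mangling, and route construction through a factory - workable, but not the near-zero ceremony of LuaJIT's FFI for C.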

So one should expect to pay a price for all of this, and I think most of the price is upfront (also because you might want to wrap the libraries you use most often). And the price is paid in having to deal with things people often take for granted, so even if it's small in the scheme of things, it's more noticeable.

A community needs energy coming into it to grow, but too quick an influx of newcomers wouldn't be good either. E.g. if DConf were twice the size, it would be a very different experience, and not only in a positive way.

I think new things often grow not by taking the dominant player head on, but by growing in the interstices. By taking hold in obscure niches nobody cares about, you gain the power to take on bigger niches, and over time it turns out some of those niches weren't so unimportant after all. It's a positive for the health of D that it's dismissed and yet keeps growing; just imagine if Stroustrup had had a revelation, written a memo "the static if tidal wave" (BG 1995), persuaded the committee to deprecate all the features and mistakes that hold C++ back, and stolen all of D's best features in a single language release. A challenger language doesn't want everyone to take it seriously, because it doesn't have the strength to win a direct contest. It just needs more people to take it seriously.

The best measure of the health of the language and its community might be: are more people using the language to get real work done, is it helping them do so more rather than less, and what is the quality of the new people becoming involved? If those things are positive and external conditions are favourable, then I think it bodes well for the future.

And by external conditions I mean that people have gotten used to squandering performance and users' time - see Jonathan Blow on Photoshop, for example. If you have an abundance of a resource and keep squandering it, eventually you run out of abundance. Storage prices are collapsing, data sets are growing, Moore's Law isn't what it was, and even with dirt-cheap commodity hardware it's not necessarily the case that one is I/O bound any more. An NVMe drive does 2.5 GB/sec, and we are happy when we can parse JSON at 200 MB/sec. People who misquote Knuth seem to write slow code, and life is too short to be waiting unnecessarily. At some point people get fed up with slow code.

Maybe it's wrong to think about there being one true inheritor of the mantle of C and C++. Maybe no new language will gain the market share that C has, and if so that's probably a good thing. Mozilla probably never had any moments when they woke up and thought hmm maybe we should have used Go instead, and I doubt people writing network services think maybe Rust would have been better.

I said to Andrei at DConf that principals, rather than agents, are much more likely to be receptive towards the adoption of D. If you take an unconventional decision and it doesn't work out, you look doubly stupid - it didn't work out, and on top of that nobody else made that mistake: what were you thinking? So by far the best strategy - unless you're in a world of pain and desperate for a way out - is to copy what everyone else is doing.

But if you're a principal - i.e. in some way an owner of a business - you haven't got the luxury of fooling yourself, not if you want to survive and flourish. The buck stops with you, so it's a risk to use D, but it's also a risk not to use D - you can't pretend the conventional wisdom is without risk when it may not suit the problem that's before you. And it's your problem today and still your problem tomorrow, and that leads to a different orientation towards the future than being a cog in a vast machine where the top guy is measured by whether he beats earnings next quarter.

The web guys do have a lot of engineers, but they have an inordinate influence on the culture. Lots more code gets written in enterprises, and you never hear about it because it's proprietary and people aren't allowed to discuss it, or don't have time to. And maybe it's simply not even interesting to talk about - which doesn't mean it's not interesting to the people writing it, or economically important.

D covers an enormous surface area - a much larger potential domain set than Go or Rust. Things are more spread out, hence the amusing phenomenon on Reddit and the like of people thinking that, because they personally don't know anyone who uses D, nothing is happening and adoption isn't growing. So assessing things by adoption within the niches where people are chatty is interesting, but doesn't tell you much.

I don't think most users post much on the forum. It's only a subset of people - those who, for intrinsic or instrumental reasons, like posting - that do.

So if I am right about the surface area and the importance of principals, then over time you should see people popping up from areas you had never thought of - people who have the power to make decisions and trust their own judgement because they have to. That's how you know the language is healthy: they start using D, and enough of them have success with it.

Liran at Weka had never heard of D not long before he based his company on it. I had never imagined a ship-design company might use Extended Pascal, let alone that D might be a clearly sensible option for automated code conversion and a great fit for new code.

And I am sure Walter is right about the importance of memory safety. But outside of certain areas, D isn't in a battle with Rust; memory safety is one more appealing modern feature of D. To say it's important to get it right isn't to say it has to defeat Rust. Not that you implied this, but some people at DConf seemed to implicitly think that way.


Laeeth
