On Sunday, 26 August 2018 at 08:40:32 UTC, Andre Pany wrote:
On Saturday, 25 August 2018 at 20:52:06 UTC, Walter Bright wrote:
On 8/25/2018 3:52 AM, Chris wrote:
On Friday, 24 August 2018 at 19:26:40 UTC, Walter Bright wrote:
Every programmer who says this also demands new (and breaking) features.
"Every programmer who..." Really?

You want to remove autodecoding (so do I) and that will break just about every D program in existence. For everyone else, it's something else that's just as important to them.

For example, Shachar wants partially constructed objects to be partially destructed, a quite reasonable request. Ok, but consider the breakage:

  struct S {
    ~this() {}          // destructor is not nothrow
  }

  class C {
    S s;

    this() nothrow {}   // partial destruction would require calling
                        // S's (potentially throwing) destructor here
  }

I.e. a nothrow constructor now must call a throwing destructor. This is not some made-up example; it breaks existing code:

  https://github.com/dlang/dmd/pull/6816

If I fix the bug, I break existing code, and apparently a substantial amount of existing code. What's your advice on how to proceed with this?

In the whole discussion I miss two really important things.

If your product compiles fine with a given DMD version, no one forces you to update to the next DMD version. In the company I work for, we set the DMD version for each project in the build settings. The speed of DMD releases or breaking changes doesn't affect us at all.

Maybe I do not know a lot of open source products, but the amount of work that goes into code quality is extremely high for the compiler, runtime, Phobos and related products. I love to see how much work is invested in unit tests and also code style.

DMD (and LDC and GDC) has greatly improved in recent years in various respects.

But I also see that there is a lot of work to be done. There are definitely problems to be solved. It is sad that people like Dicebot are leaving the D community.

Kind regards
Andre

Dicebot should speak for himself as he wishes. But I was entertained that, at the same time, someone else posted an old blog post of his asking for comments on the early release of dtoh, a tool that, given its design, was intended in time to be integrated into DMD.

I don't think he was very happy about the process around DIP1000, but I am not myself well placed to judge.

In any case, languages aren't in a death match where there can be only one survivor. Apart from anything else, does anyone really think less code will be written in the future than in the past, or that there will be fewer people who write code as part of what they do but aren't career programmers?

I probably have an intermediate experience between you and Jon Degenhardt on the one hand and those complaining about breakage on the other. Some of it was self-inflicted: on the biggest project we have 200k SLoC, a good part of which I wrote myself pretty quickly, and the build system has been improvised since then and could still be better. The Linux builder is a Docker container created nightly; it's taking longer to get something similar on the Windows side, where, funnily enough, the bigger problems are. Often in the past it was little things, like dub turning relative paths into absolute ones, creating huge paths that broke DMD's path limit until we got fed up and decided to fix it ourselves. (Did you know there are six extant ways of handling paths on Windows?)

Dub dependency resolution has been tough. It might be better now. I appreciate it's a hard problem, but somehow Maven, for example, is quick (it might well cheat by solving an easier problem).

And there has been quite a lot of breakage in vibe. But nobody forces you to use vibe, and other options do exist for many things.

Overall though, it's not that bad, depending on your use case. Everything has problems, but everyone also has a different kind of sensitivity to different kinds of problems.

For me, DPP makes a huge difference, because I now know it's pretty likely I can just #include a C library if that's the best option, and in my experience it mostly just works.
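
To give a flavour of what I mean, here is roughly what that looks like. This is only a sketch, assuming zlib's development headers are installed; the file name, and building it with dpp's d++ wrapper (or dub run dpp) while linking against zlib, are illustrative rather than gospel:

  // zlib_version.dpp - pull a C header straight into D via dpp
  #include <zlib.h>

  void main()
  {
      import std.stdio : writeln;
      import std.string : fromStringz;

      // zlibVersion() is the plain C function declared in zlib.h,
      // made callable from D by dpp's translation step
      writeln("zlib version: ", zlibVersion().fromStringz);
  }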

The plasticity, coherence and readability of D code dominate the difficulties for quite a few things I am doing. That might not be the case for others, because everyone is different. The cost of my time in the present context dominates the cost of several programmers' time, but I don't think that's a necessary part of why D makes sense for some things for us. I think by the end of this year we might have eleven people, including me, writing D at least sometimes, up from only me about 18 months ago. That's people working from the office, including practitioners who write code as part of what they do; add to that a handful of remote consultants who write only D for us.

There's no question from my perspective that D is much better than a year ago, and unimaginably better than when I first picked it up in 2014. One can't seriously suggest that D isn't flourishing as far as adoption goes.

The forum really isn't the place to assess how people using the language at work feel. Almost nobody working from our offices is active in the forums, and that's also the impression I get speaking to other enterprise users. People have work to do, unfortunately!

I wonder, if the budget were there, whether it would be possible to find someone even half as productive as Seb Wilzbach to help full-time, because whilst some of the problems raised are very difficult ones, others might just be a matter of (very high quality) manpower. Michael Parker's involvement has also made a huge difference to the public profile of D.

I definitely think a stable version with fixes backported would be great, if feasible.

I don't really get the hate for betterC, even though I don't directly use it myself in the code I write. It's directly useful for lots of things like WebAssembly and embedded, and a side effect of Seb's work on betterC testing for Phobos will, I guess, be that it's much clearer how much of Phobos can be used without depending on the GC, because that's what betterC also implies. Is it really such an expensive effort?
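
For anyone who hasn't tried it, the subset looks something like this. A sketch only, with the file name illustrative; the idea is that it compiles with something like dmd -betterC hello_betterc.d:

  // hello_betterc.d - a minimal sketch of the -betterC subset
  import core.stdc.stdio : printf;

  extern(C) void main()
  {
      // no GC and no DRuntime here: only the C runtime is linked in
      printf("hello from -betterC\n");
  }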

Beyond the real factors, it also helps with perception and social factors. I feel like I have come across more HFT programmers (or people who claim to be such) on Reddit, who say they could never even look too closely at a GC language, than in the industry itself!

It would be great if Manu's work on the STL and extern(C++) comes to fruition. I think DPP will work for much more of C++ in time, though it might be quite some time.
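
Even today, plain extern(C++) declarations go a surprisingly long way for free functions, with no C shim needed. A sketch only: the add() function below is hypothetical, and it assumes a C++ translation unit providing int add(int a, int b) is compiled and linked alongside with a compatible compiler and ABI:

  // cpp_add.d - sketch of linking directly against a C++ free function
  extern(C++) int add(int a, int b);  // hypothetical C++ function

  void main()
  {
      import std.stdio : writeln;
      // the declaration above is mangled and linked as a C++ symbol
      writeln(add(2, 3));
  }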

I wonder if we are approaching the point where enterprise crowd-funding of missing features or capabilities in the ecosystem could make sense. If you look at how Liran managed to find David Nadlinger to help him, it could just be that, in part, a lack of social organisation is preventing the market from addressing unfulfilled mutual coincidences of wants. Lots of capable people would like to work full-time programming in D. Enough firms would like some improvements made. It takes work to organise these things. If I were a student, I might be trying to see if there was an opportunity there.
