On Thursday, 13 March 2014 at 05:13:00 UTC, Manu wrote:

> That's not the way business works; at least, not in my neck of the woods. Having been responsible for rolling out many compiler/toolset/library upgrades personally, that's simply not how it's done.

That may be how the gamedev industry works, because you are always on cutting-edge technology. But big companies in traditional industries (IT, Oil & Gas, Finance, Aviation) always work on a stable tool set; even undocumented features cannot change in an ad hoc manner. I have worked for some of the top IT companies, and I did not use D for my projects primarily because it was not stable.


> It is assumed that infrastructural updates may cause disruption. No sane business just rolls out updates without an initial testing and adaptation period. Time is allocated to the upgrade process, and necessary changes to workflow are made by an expert who performs the upgrade.

That happens for a D1-to-D2 migration, not for D 2.064 to D 2.065. You cannot expect a team to test and validate the compiler tool set every few months.

> In the case of controlled language feature deprecation (as opposed to the std.json example), it should be safe to assume an alternative recommendation is in place and was designed to minimise disruption. In the case we are discussing here, the disruption is small and easily addressed.

>> Languages are adopted by enterprises only when there is long-term stability. C code written 30 years ago in K&R style still compiles without any problem. Please enhance the language, but don't break existing code.


> In my experience, C/C++ is wildly unstable. I've been responsible for managing C compiler updates on many occasions, and they often cause complete catastrophe, with no warning or deprecation path given! Microsoft is notorious for this: basically every version of MSC is incompatible with the prior version in some annoying way.
>
> I personally feel D has a major advantage here, since all three compilers share the same front-end, the language has a proper 'deprecated' concept (only recently introduced to C), and there are better opportunities for compile-time detection and warnings. Frankly, I think all this complaining about breaking changes in D is massively overblown; C is much, much worse! The only difference is that D releases are more frequent than C releases. That will change as the language matures.
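As a concrete illustration of the 'deprecated' concept mentioned above, here is a minimal D sketch (the function names are invented for the example). Code that calls the deprecated symbol still compiles, but the compiler emits the migration message, and `dmd -de` turns it into a hard error:

```d
import std.stdio;

// Hypothetical API: oldSum is kept for a release cycle, but marked
// deprecated so callers get a migration hint at compile time.
deprecated("oldSum will be removed; use sum instead.")
int oldSum(int a, int b) { return a + b; }

int sum(int a, int b) { return a + b; }

void main()
{
    // Calling oldSum here would print a deprecation message by default,
    // and fail outright when compiled with `dmd -de`.
    writeln(sum(2, 3)); // prints "5"
}
```

This is the kind of controlled, staged removal the C toolchains historically lacked.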

>> Also, if something has to be deprecated, it should exist in that deprecated state for at least five years. Currently it is one year, and for enterprise customers that is a very short period.


> This is possibly true. It's a tricky balancing act.
>
> I'd rather see D take a stricter approach here, so that we don't end up in a position where 30-year-old D code exists alongside 'modern' D code written completely differently, each requiring a different set of compiler options. Old codebases should be nudged to update along the way; I would consider it a big mistake to retain ancient-C-style backwards compatibility.

Even I do not want D to have 30 years of backward compatibility, but that is something the C community has got used to. That is why I said we need at least a 5-year deprecation cycle.

Anyway, coming to your original issue: why can't you add "final:" in your class, as suggested by Walter? Doesn't that solve your problem without changing the default behaviour?
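For readers following along, a minimal sketch of that suggestion (class and method names are illustrative, not from the thread). A `final:` label inside a class makes every member declared after it non-virtual, without changing the language-wide default:

```d
// `final:` applies to all members that follow it, so calls to them
// can be dispatched directly rather than through the vtable.
class Vec2
{
final:
    float x, y;
    float dot(Vec2 o) { return x * o.x + y * o.y; }
}

void main()
{
    auto a = new Vec2; a.x = 1; a.y = 2;
    auto b = new Vec2; b.x = 3; b.y = 4;
    assert(a.dot(b) == 11); // 1*3 + 2*4
}
```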

- Sarath
