On Saturday, March 31, 2018 17:32:10 H. S. Teoh via Digitalmars-d wrote:
> On Sat, Mar 31, 2018 at 07:38:06PM -0400, Andrei Alexandrescu via
> Digitalmars-d wrote:
> [...]
> > Once we have that, we can encapsulate desirable abstractions (such as
> > @nogc safe collections that work in pure code), regardless of how
> > difficult their implementations might be. It seems that currently
> > this(this) does not allow us to do that.
>
> What exactly is it about this(this) that blocks us from doing that?
>
> Removing this(this) is going to be a huge breaking change far bigger
> than, say, removing autodecoding ever will be.
Well, as far as const goes, you have the inherent problem that the object you have in this(this) has already been initialized, and you can't mutate const. I think that the other attribute issues are mostly implementation issues.

And as far as auto-decoding goes, removing postblit constructors would actually be far easier, because it's simply a matter of deprecating a particular function. Some traits that check for postblits would have to be adjusted, but all those really care about is that there's extra code beyond the default copying code, and they could easily be made to work with both postblits and whatever the new solution is at the same time until postblits are actually gone. Auto-decoding, on the other hand, affects far more than just the functions in std.range.primitives, and there is no clear deprecation path.

So, while it _might_ be true that deprecating postblits would break more code (and I'm honestly not convinced that it would), postblit constructors would at least have a clean deprecation path. So, while it breaks code, it does so in a way that can easily be managed, whereas the removal of auto-decoding is not straightforward at all. It could be done by just flipping the switch, so to speak, but AFAIK, no one has yet presented a workable deprecation path. And ultimately, I think that _that_ is what's preventing us from fixing auto-decoding. If a clean deprecation path were found, then we could discuss whether the resulting breakage would be worth it, but without one, I don't see how we could ever do it.

Also, unless we can fix postblit constructors (and the evidence thus far is that if we can, it's way too complicated - e.g. Kenji's attempt several years ago was rejected because it was way too complicated), if we don't switch to a different solution, we're talking about permanently not supporting the copying of const types that require user-defined copying.

> A lot of us here have essentially given up on const except for a few
> very narrow cases.
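The const problem can be seen in a small sketch. The type and field names here are mine, purely illustrative - the point is only that this(this) runs *after* the copy has been blitted, so fixing up the copy means mutating an already-initialized object:

```d
// Hypothetical reference-counted struct with a postblit.
struct RC
{
    int* count;

    this(this)
    {
        // By the time this(this) runs, the new object has already been
        // blitted from the source, and all we have access to is `this`.
        // "Fixing up" the copy therefore means mutating this - which is
        // impossible if the object being initialized is const/immutable.
        if (count !is null)
            ++*count;
    }
}

void main()
{
    const RC a;
    // const RC b = a; // rejected: the postblit would have to mutate
    //                 // the already-initialized const object `b`
}
```

A copy constructor doesn't have this problem in principle, because it constructs the new object rather than patching it up after the fact.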
> The transitive nature of const makes it extremely
> difficult to work with in the general case, even though the simplest use
> cases are workable.
>
> One of the biggest stumbling blocks is that whenever ranges are
> involved, const is practically out of the question, because even though
> it can be made to work for most cases, there will almost always be that
> one pathological case where it's impossible / too hard to work around,
> and that ruins it for everything else, so that it's much simpler to just
> avoid it altogether.

I don't think that it's even the case that it can be made to work in most cases - or if it can, it involves a lot of static ifs. The range API does not require const for _anything_ (and really can't, due to how restricted const is), so _no_ generic range-based code can assume that even something like length or empty can be called if the range is const or inout. As such, the only ranges that can mark anything with const are ones that either aren't generic or which have a bunch of static ifs providing const and non-const versions depending on the template arguments, and IMHO, that's just not workable. I've done it before, and it's a mess. As such, ranges and const really don't work together at all. const really only works when you're dealing with a very constrained set of types where you can actually guarantee that they work with const.

> > * pure is difficult
>
> [...]
>
> The one nagging question I've been having about pure is: how much are we
> actually taking advantage of the guarantees provided by pure? We have
> developed very clever ways of extending the traditional definition of
> pure and invented creative ways of making more things pure, which is all
> great. But AFAIK the only place where it's actually taken advantage of
> is to elide some redundant function calls inside a single expression.
> And perhaps infer uniqueness in some cases for implicit casting to
> immutable.
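The static if mess looks something like the following sketch (the wrapper and its names are mine, not from Phobos). popFront must mutate the range, so it can never be const, and every other member can only be const when the wrapped range happens to support it:

```d
import std.range.primitives : isInputRange;

// Illustrative generic wrapper range.
struct Wrapper(R)
    if (isInputRange!R)
{
    R source;

    // The range API is mutation-based: popFront modifies the range,
    // so it cannot be const under any circumstances.
    void popFront() { source.popFront(); }

    @property auto front() { return source.front; }

    // empty *might* be callable on a const R, but generic code cannot
    // assume it, so you end up writing both versions:
    static if (is(typeof((const R r) => r.empty)))
        @property bool empty() const { return source.empty; }
    else
        @property bool empty() { return source.empty; }
}
```

Multiply that static if by every member that could conceivably be const (empty, front, length, save, opIndex, ...), and it's easy to see why most range code just gives up on const entirely.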
>
> While these are indisputably useful, they seem so far to be only
> relatively minor benefits. If pure is indeed so difficult to support
> generically, it begs the question, is it worth the effort just to gain
> these niggling benefits? Whatever happened to larger-scale benefits
> conferred by purity?

Honestly, I think that the main benefit of pure is that if you know that a function is pure, you know that it doesn't access any global, mutable state except through its arguments. The secondary benefit is that it can allow for functions which construct immutable objects using mutable state and without casts (though such functions are usually small in scale and don't require that large portions of the code base be pure).

I think that the idea that pure is going to result in compiler optimizations is mostly a joke. Not only can it only work with strongly pure functions (which most pure functions aren't and can't be), but the way it's currently implemented, it can only elide calls within a single statement (and it might actually be a single expression - I'm not sure which). Going farther than that requires code-flow analysis, which Walter is almost always against, so it's almost certainly not happening. And even if it did, I don't think that it would help much. How often do you call the same function with the same arguments within a single function body? I expect that that's pretty rare. The place where it would likely be of the most benefit would be math code, and even there, I'm not sure that it happens much.

So, I think that the only large-scale benefit that exists for pure - and really can exist for pure - is the fact that you know that the function doesn't access global, mutable state. Everything else it does is just gravy and too limited to be a "large-scale" benefit. Certainly, optimizations are clearly _not_ the main benefit of pure, since they almost don't exist.
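To make the weakly/strongly pure distinction concrete, here's a small sketch (function names are mine, purely illustrative):

```d
// Weakly pure: cannot touch global mutable state, but it may mutate
// its argument through the mutable indirection, so calls to it can
// never be elided.
void fill(int[] arr, int value) pure
{
    foreach (ref e; arr)
        e = value;
}

// Strongly pure: the parameter has no mutable indirections, so a second
// call with the same argument in the same expression may be elided.
int square(int x) pure
{
    return x * x;
}

// Strongly pure with a unique result: nothing else can refer to the
// returned array, so it implicitly converts to immutable - no cast.
int[] makeSequence(int n) pure
{
    auto result = new int[](n);
    foreach (i, ref e; result)
        e = cast(int) i;
    return result;
}

void main()
{
    int y = 3;
    auto z = square(y) + square(y); // second call may be elided

    immutable arr = makeSequence(5); // implicit conversion to immutable
}
```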
But over time, we have managed to add more gravy here and there as we've figured out assumptions that can be made based on pure (like the case where we can convert the result of a pure function to immutable).

- Jonathan M Davis
