On Fri, Jun 21, 2019 at 06:07:59AM +0000, Yatheendra via Digitalmars-d-learn
wrote:
> Am I mistaken in saying that we are conflating:
>    "anything that is logically const should be declared const"

No, not in D.  D does not have logical const; it has "physical" const,
which is a strict subset of logical const, and therefore there are some
uses of logical const for which D's const is unsuitable.

>    // makes perfect sense
>    // e.g. the lowest 2, and some branches of the 3rd and 4th, levels
>    // of members (and a subset of the overall methods) in a 5-deep type 
> hierarchy are const
> with:
>    "most code/data should be declared const"
>    // no! isn't efficient code all about mutation?
>    // no grounds for, e.g.: "ideally, no more than 40% of code should be
> doing mutation"

Actually, optimizers work best when there is minimal mutation *in the
original source*.  The emitted code, of course, is free to use mutation
however it wants.  But the trouble with mutation at the source level is
that it makes many code analyses very complex, which hinders the
optimizer from doing what it might have been able to do in the absence
of mutation (or a reduced usage of mutation).  Aliasing is one example
of what hampers optimizers from emitting optimal code.  Aliasing plus
mutation makes the analysis so complex that the optimizer has a hard
time deciding whether a particular construct can be optimized away or
not.

Having minimal mutation in the original source code allows the optimizer
to make more assumptions, which in turn leads to better optimizations.
It also makes the source code easier to understand.  Paradoxically,
having less mutation in the source code means it's easier for the
compiler to optimize it into mutation-heavy optimal code -- because it
doesn't have to worry about arbitrary mutations in the source code, and
therefore can be free(r) to, e.g., eliminate redundant copies, redundant
movement of data, etc., which ultimately results in in-place
modification of values, i.e., mutation-heavy emitted code.

Conversely, if the source code is heavy on mutations, then the compiler
cannot confidently predict the overall effect of the mutations, and
therefore is forced to err on the safe side of assuming the worst, i.e.,
don't apply aggressive optimizations in case the programmer's mutations
invalidate said optimizations. The result is less optimal code.
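A small sketch of this in D (names and values are just for illustration): an immutable value lets the compiler assume every read yields the same result, so the loop bound can be folded or hoisted without worrying about hidden mutation through an alias.

```d
import std.stdio;

// `limit` can never change, so the compiler may assume every read of it
// is identical -- no reloads, no aliasing analysis needed.
immutable int limit = 100;

int sumToLimit()
{
    int total = 0;
    // The loop bound is known-constant; the optimizer is free to
    // unroll, hoist, or even fold the whole loop at compile time.
    foreach (i; 0 .. limit)
        total += i;
    return total;
}

void main()
{
    writeln(sumToLimit()); // 4950
}
```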

> The inability to have a const caching object seems correct. The way
> around would be to have a wrapper that caches (meh). If that is not
> possible, then maybe caching objects just aren't meant to be const by
> their nature? Isn't memoize a standard library feature? I should look
> at it, but I wouldn't expect it to be const.

It's not as simple as it might seem.  Here's the crux of the problem:
you have an object that logically never changes (assuming no bugs, of
course).  Meaning every time you read it, you get the same value, and so
multiple reads can be elided, etc.  I.e., you want to tell the compiler
that it's OK to assume this object is const (or immutable).

However, it is expensive to initialize, and you'd like it to be
initialized only when it's actually needed, and once initialized you'd
like it to be cached so that you don't have to incur the initialization
cost again.  However, declaring a const object in D requires
initialization, and after initialization it cannot be mutated anymore.
This means you cannot declare it const in the first place if you want
lazy initialization.
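In code, the constraint looks something like this (a minimal sketch; the type and values are hypothetical):

```d
struct Costly
{
    int[] data;
}

void main()
{
    // A const object must be fully initialized at its declaration...
    const Costly c = Costly([1, 2, 3]);

    // ...and can never be mutated afterwards:
    // c.data ~= 4;        // error: cannot modify const expression
    // c = Costly([4]);    // error: cannot modify const variable

    // So there is no later window in which to "fill in" the value
    // lazily -- the initialization cost is paid up front, always.
    assert(c.data.length == 3);
}
```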

It gets worse, though.  Wrappers only work up to a certain point.  But
when you're dealing with generic code, it becomes problematic.  Assume,
for instance, that you have a type Costly that's logically const, but
lazily initialized (and cached).  Since you can't actually declare it
const -- otherwise lazy initialization doesn't work -- you have to
declare it mutable.  Or, in this case, declare a wrapper that holds a
const reference to it, say something like this:

        struct Payload {
                // lazily-initialized data
        }

        struct Wrapper {
                const(Payload)* impl;
        }
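Filled out and exercised, the wrapper idea might look like this (a sketch; the payload contents are made up). The key point is that `impl` can be rebound on first access precisely because the Wrapper itself stays mutable:

```d
struct Payload
{
    int[] data;
}

struct Wrapper
{
    const(Payload)* impl;

    // Lazy initialization: construct the payload on first access and
    // rebind `impl` to it.  This rebinding is only legal while the
    // Wrapper itself is mutable.
    ref const(Payload) get()
    {
        if (impl is null)
            impl = new Payload([1, 2, 3]); // expensive step, done once
        return *impl;
    }
}

void main()
{
    Wrapper w;                      // cheap: nothing constructed yet
    assert(w.impl is null);
    auto len = w.get().data.length; // first access triggers construction
    assert(w.impl !is null && len == 3);
}
```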

However, what if you're applying some generic algorithms to it?  Generic
code generally assumes that given a type T, if you want to declare a
const instance of it, you simply write const(T).  But what do you pass
to the generic function? If you pass Wrapper, const(Wrapper) means
`impl` cannot be rebound, so lazy initialization fails.  OK, then
let's pass const(Payload) directly.  But that means you no longer have a
wrapper, so you can't have lazy initialization (Payload must be
constructed before you can pass it to the function, thus it must be
eagerly initialized at this point).
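The failure mode can be shown concretely (a sketch; `inspect` is a hypothetical generic function standing in for any algorithm that takes const(T)). Because D's const is transitive, const(Wrapper) makes `impl` head-const too, so the lazy rebinding is rejected inside the generic code:

```d
struct Payload
{
    int value;
}

struct Wrapper
{
    const(Payload)* impl;
}

// A generic algorithm written the usual way: it asks for const(T).
void inspect(T)(ref const(T) x)
{
    // const is transitive, so x.impl is itself const here and cannot
    // be rebound -- uncommenting the next line is a compile error:
    // x.impl = new Payload(42); // error: cannot modify const expression
}

void main()
{
    Wrapper w;
    inspect(w); // fine to call, but no lazy init can happen inside
}
```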

It's an impasse.  Cached / lazily-initialized objects and D's const
simply don't mix.  Well, you can try to mix them, but it's like trying
to mix water and oil.  They just don't work well together.
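As for the memoize mentioned above: yes, it is std.functional.memoize, and it illustrates the same point. A minimal sketch of its use (the `costly` function is made up):

```d
import std.functional : memoize;

int costly(int n)
{
    // stands in for an expensive computation
    return n * n;
}

// memoize wraps `costly` with a cache keyed on the arguments.
alias cached = memoize!costly;

void main()
{
    assert(cached(7) == 49); // computed on the first call
    assert(cached(7) == 49); // subsequent calls hit the cache
}
```

Note that memoize's internal cache is mutable state, so the memoized entity itself cannot be const either, which is consistent with the argument above.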


Notwithstanding the eloquent discontent that you have just respectfully 
expressed at length against my verbal capabilities, I am afraid that I must 
unfortunately bring it to your attention that I am, in fact, NOT verbose.
