Rainer Deyke:
> How would this work?
> Hash value calculated on first access and stored in the AA?  Can't do,
> AA is immutable.

You are right, there's a problem here, even once you have added an ".idup" to 
AAs.
A hash value isn't data, it's metadata, so you may have lazily computed 
mutable metadata attached to immutable data. Once computed, the hash value 
essentially becomes immutable.
Consider a situation where two threads want to use such an immutable AA 
(iAA). There's no need to copy the data, because it's immutable. Both threads 
may try to write the mutable hash value, but they write the same value, so no 
synchronization is necessary.
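A rough sketch of the idea follows (in Python rather than D, since the D feature discussed here is hypothetical; `FrozenRecord` and its fields are invented names). The payload is immutable, while the cached hash is mutable metadata whose racy write is benign because every writer stores the same number:

```python
# Sketch: immutable payload with a lazily computed, cached hash.
# The cache is mutable metadata of immutable data; since the hash is a
# deterministic function of the payload, two racing threads would both
# store the identical value, so no locking is needed for correctness.

class FrozenRecord:
    __slots__ = ("_items", "_hash")

    def __init__(self, items):
        self._items = tuple(items)  # the immutable data
        self._hash = None           # mutable metadata, computed on demand

    def __hash__(self):
        if self._hash is None:
            # Computed lazily, on first access only.
            self._hash = hash(self._items)
        return self._hash

    def __eq__(self, other):
        return isinstance(other, FrozenRecord) and self._items == other._items
```

Once `__hash__` has run, the record behaves exactly like a fully immutable value, so it can be used as a key in dicts/sets.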
If you have a pure function you may want to give it an array of such iAAs, 
and the pure function may then put those iAAs into a set/AA internally to 
compute something. Are pure functions allowed to take as arguments (besides 
the data of the iAAs) the immutable future result of a deterministic pure 
computation performed on immutable data? I think such things are named 
"immutable futures". Essentially it's a form of lazy & pure computation, and 
it's often done, for example, in Haskell and Scheme.
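To make "immutable future" concrete, here is a minimal Python sketch (the class name and interface are invented for illustration): a handle to the not-yet-computed result of a pure function applied to immutable inputs. Forcing it twice yields the same value, so caching the result is safe for the same reason the cached hash is:

```python
# Sketch of an "immutable future": the deferred result of a pure
# computation on immutable data. Because the computation is deterministic,
# a racy force() from two threads would store the identical value.

class ImmutableFuture:
    __slots__ = ("_fn", "_args", "_value", "_done")

    def __init__(self, fn, *args):
        self._fn = fn        # assumed pure
        self._args = args    # assumed immutable
        self._value = None
        self._done = False

    def force(self):
        if not self._done:
            # Computed at most once per future; result is then frozen.
            self._value = self._fn(*self._args)
            self._done = True
        return self._value
```

A consumer can pass such futures around freely and only pay for the computation if and when the value is actually needed.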
It's a small extension of the framework of immutability, and it may lead to 
many uses.
For example, in Haskell all data is immutable, but not everything is computed 
up-front. The compiler is allowed to reason about immutable data that hasn't 
been computed yet. This allows, for example, managing an infinite stream of 
(immutable, but lazily computed) prime numbers.
So in Haskell even a generator function like xprimes() of my dlibs can be 
thought of as immutable, even though it doesn't compute all the prime numbers 
at the start.
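The same idea can be sketched with a Python generator (similar in spirit to the xprimes() mentioned above, though this implementation is my own illustration): no prime is computed until a consumer asks for it, yet each prime, once produced, never changes.

```python
# Sketch: a lazy, conceptually immutable infinite stream of primes,
# using a standard incremental sieve.

from itertools import count, islice

def primes():
    """Yields 2, 3, 5, 7, ... one prime at a time, on demand."""
    multiples = {}               # first unmarked composite -> a prime factor
    for n in count(2):
        p = multiples.pop(n, None)
        if p is None:
            yield n              # n was never marked, so it is prime
            multiples[n * n] = n # start marking composites at n^2
        else:
            # n is composite: slide p's marker to its next free multiple
            m = n + p
            while m in multiples:
                m += p
            multiples[m] = p
```

Consumers take only as much of the stream as they need, e.g. `list(islice(primes(), 5))`.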
Such lazily computed immutable values become very useful in a 
language/compiler that has native support for deeply immutable data and pure 
functions.
I don't know if in D2.x you can already have a function with a lazy immutable 
input argument:
pure int foo(immutable lazy x) {...}
The hash value can be thought of as the result of one such pure, immutable, 
lazy function :-)

Bye,
bearophile
