Hi,
While trying to solve a Project Euler problem, I declared something like this:
import tables
const N = 16
type Digit = range['0'..'9']
var counters: array[N, CountTable[Digit]]
var exclusion: array[N, set[Digit]]
. . .
and got the usual warning message when dealing with a range which doesn't
include the default value (here '\0'):
_Warning: Cannot prove that 'counters' is initialized. This will become a
compile time error in the future. [ProveInit]_
Of course, I could use _char_ instead of _Digit_ , but for the sets a _char_
would be less efficient (256 bits for each set instead of 10 bits rounded up
to 16). In this case, it's not the memory cost that would be a problem, but
the performance. And anyway, using _char_ instead of _Digit_ is ugly.
Using {.noInit.} doesn't change anything, so, for now, I have to live with the
warning.
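One thing I considered (untested on my side, so take it as a sketch) is disabling just that one warning around the declarations with a push/pop pragma pair:

```nim
import tables

const N = 16
type Digit = range['0'..'9']

{.push warning[ProveInit]: off.}  # silence only ProveInit for these declarations
var counters: array[N, CountTable[Digit]]
var exclusion: array[N, set[Digit]]
{.pop.}                           # restore the previous warning settings
```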
But is there a better way to do this?