"bearophile" wrote in message news:[email protected]...
> What defines "possible"? A compiler switch that lets you define the
> amount of compile time?
At the moment, those are the criteria I listed in the bug report. Beyond
this, it will be limited by what constfolding is capable of. The exact
limitations of constfolding are an evolving part of the language.
>> I feel like you're confusing this pull request with another enhancement.
>> The discussion should be about whether this exact feature is worthwhile,
>> not if some other feature would solve some other problems.
> The comparison of the two ideas is useful, because they try to solve the
> same problem.
I don't think they do. If you force evaluation of preconditions at compile
time, you force arguments (or properties of arguments) to be known at
compile time. With best-effort checking, you simply get to promote run-time
errors to compile-time errors.
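To make the distinction concrete, here is a hypothetical sketch (checkedSqrt is a made-up example, not from the pull request). The precondition is an ordinary run-time check; best-effort checking would simply let the compiler report the violation early when the argument happens to be a compile-time constant:

```d
import std.math : sqrt;

// An ordinary function with a run-time precondition.
double checkedSqrt(double x)
in { assert(x >= 0, "checkedSqrt: negative argument"); }
body {
    return sqrt(x);
}

void main()
{
    // Today this compiles and fails with an AssertError at run time.
    // With best-effort checking, the compiler could reject it outright,
    // because the argument is a compile-time constant.
    auto r = checkedSqrt(-1.0);
}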
> I think that adding an undefined, growing, compiler-dependent list of
> static checks that can cause unpredictable compilation times is not a good
> idea. A more general solution has some advantages:
> - It allows the user to choose what to run at compile-time and what at
> run-time;
Again, these are orthogonal. I don't care if you define conditions that
must be met at compile time; this feature is about promoting run-time
conditions to compile time.
> - You can define _exactly_ what it does in a book, unlike the growing
> list;
IMO this is only a mild problem. If they can't be caught at compile time,
they will still be caught at run-time.
> - Will not grow in features and implementation complexity as compilers
> improve (and I think it's sufficiently simple);
I don't understand this. You don't want static checking to improve over
time? This is simply a type of static checking.
> - It's going to be mostly the same in all D compilers;
Again, it's a promotion. It doesn't matter if some frontends don't
implement it; lesser frontends will just mean you catch it at run time.
> - It allows you to specify and run arbitrary tests in CTFE instead of a
> fixed set of predefined simple tests. So you can use it for SafeIntegers
> literals, and many other future and unpredictable purposes;
> - It's explicitly visible in the code, because there is a syntax to denote
> such static testing parts in the code.
This has nothing to do with promoting run-time checks to compile-time.
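For readers following along, the "arbitrary tests in CTFE" idea is roughly what D already allows when the user explicitly forces compile-time evaluation. A rough sketch (checkedUbyte is an illustrative name, not part of either proposal):

```d
// Any function usable in CTFE can serve as a user-defined static check.
int checkedUbyte(int v)
{
    assert(v >= 0 && v <= 255, "value does not fit in a ubyte");
    return v;
}

// 'enum' forces the initializer through CTFE, so a failed assert inside
// checkedUbyte becomes a compile-time error rather than a run-time one.
enum ok = checkedUbyte(200);      // evaluated at compile time, passes
//enum bad = checkedUbyte(4095);  // uncomment: fails to compile
```

The visible `enum` (or `static assert`) is what makes the compile-time evaluation explicit in the source, which is the point of bearophile's last two bullets.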
> Implementing a special-cased solution instead of a more general and
> principled solution could make the implementation of the better solution
> less likely :-)
Maybe. But I think you should be seeing this as a stepping stone, not
competition.
E.g. all developers like it when the compiler points out guaranteed-wrong
code:
ubyte x = 0xFFF;
but many (including me) find it very annoying when the compiler makes you
alter code that it thinks _might_ be wrong:
ubyte x = y; // where y is a uint, that you know to be < 256
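Spelled out as compilable code (the commented-out lines are the ones the compiler rejects today; variable names are illustrative):

```d
void main()
{
    // Guaranteed wrong: 0xFFF (4095) cannot fit in 8 bits, and the
    // compiler rightly rejects it.
    //ubyte a = 0xFFF;        // Error: cannot implicitly convert

    uint y = 200;             // we happen to know y < 256
    //ubyte b = y;            // rejected: the compiler can't prove it fits
    ubyte b = cast(ubyte) y;  // so it forces an explicit cast
}
```

The annoyance is that the cast silences the check entirely, so if y later grows past 255 the truncation goes unreported.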