On Thursday, 31 July 2014 at 19:12:04 UTC, Walter Bright wrote:
> Integers are sortable, period. That is not "input".
Ok, so sorted ints are not "input"; what else is not "input"? Where do I draw the line? And if I do use assert on an input, what is that? Undefined behavior? I thought D was not supposed to have undefined behavior.
> You're denying the existence of mathematical identities applied to symbolic math.
I am just going off of what you said and pointing out the consequences. You said it is a misuse of assert to verify inputs. So what counts as an input?
int x = getIntFromSomeWhere(); // this is an input
int y = x;                     // Is y an input?
int z = x/2;                   // Is z an input?
real w = sin(x);               // Is w an input?
int a = (x == 2)?1:0;          // Is a an input?
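
For reference, here is a minimal sketch of how I understand the distinction being drawn: enforce (from std.exception) for validating external data, assert for internal invariants. getIntFromSomeWhere is a hypothetical stand-in here, since nothing in the thread defines it.

import std.exception : enforce;

// Hypothetical stand-in for an external input source; it just returns
// a fixed value so the example compiles.
int getIntFromSomeWhere() { return 42; }

int half(int x)
{
    // Program-logic invariant: callers promise x is non-negative.
    // assert is stripped in -release builds (and, under the proposed
    // change, may become an optimizer assumption), so it should never
    // be the only check on data that crosses the program boundary.
    assert(x >= 0, "internal error: negative value reached half()");
    return x / 2;
}

void main()
{
    int x = getIntFromSomeWhere();   // external input
    // Input validation: enforce throws on failure and is never
    // compiled out, so it is the usual tool for external data.
    enforce(x >= 0, "input must be non-negative");
    int z = half(x);                 // from here on, x >= 0 is program logic
}

The open question remains exactly where the derived values y, z, w, and a above fall on that line.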
> I suggest revisiting the notion of program logic correctness vs input verification.
The proposed optimizations to assert have called the logical correctness of assert into question; that is what this whole thread is about (well, not originally, but that is what it is about now; sorry, bearophile). Until that is resolved, I am no longer going to use assert.
