On Saturday, 14 April 2012 at 10:38:45 UTC, Silveri wrote:
On Saturday, 14 April 2012 at 07:52:51 UTC, F i L wrote:
On Saturday, 14 April 2012 at 06:43:11 UTC, Manfred Nowak wrote:
F i L wrote:

4) use hardware signalling to overcome some of the limitations
imposed by 3).

4) I have no idea what you just said... :)

On Saturday, 14 April 2012 at 07:58:44 UTC, F i L wrote:
That's interesting, but what effect does appending an invalid char to a valid one have? Does the resulting string end up being "NaS" (Not a String)? Because if not, I'm not sure that's a fair comparison.

The initialization values chosen are also determined by the underlying hardware implementation of the type. Signaling NaNs (http://en.wikipedia.org/wiki/NaN#Signaling_NaN) can be used with floats because they are implemented by the CPU, but in the case of integers or strings there aren't really equivalent values.
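To illustrate why NaN works as an "uninitialized" sentinel for floats while integers have no spare bit pattern: a NaN poisons every arithmetic result it touches, so a forgotten initialization eventually surfaces. A minimal sketch in Python (the IEEE 754 semantics are the same ones D's float.init relies on; the variable names are illustrative, not from the thread):

```python
import math

# A quiet NaN, analogous to D's float.init default.
uninitialized = float("nan")

# NaN propagates through arithmetic, so the forgotten
# initialization shows up in the final result.
result = uninitialized * 2.0 + 1.0
assert math.isnan(result)

# An integer has no such sentinel: every bit pattern,
# including 0, is a legitimate value, so a "forgotten"
# zero just participates silently in the computation.
forgotten_int = 0
assert forgotten_int * 2 + 1 == 1
```

Note that this demonstrates a *quiet* NaN, which propagates silently; a *signaling* NaN would additionally raise a hardware floating-point exception on first use, which is the point being made above.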

I'm sure the hardware can just as easily signal zeros.


On Saturday, 14 April 2012 at 07:45:58 UTC, F i L wrote:
My original post was inspired by me showing my D code to another C# guy earlier, and coming up with poor explanations as to why floats were required to be defaulted in my math lib. His reaction was along the lines of my first post.

I think the correct mindset when working in D is that "all variables should be initialized"; if you get incorrect calculations with zero values, division-by-zero errors, or NaN errors, the most likely cause is that this guideline was not followed.

Like I said before, this is backwards thinking. At the end of the day, you _can_ use default values in D. Given that ints are defaulted to usable values, FP values should be as well, for the sake of consistency and convenience.

You can't force new D programmers to follow a 'guideline' no matter how loudly the documentation shouts it (which it barely does at this point), so said guideline is not a dependable practice all D programmers will follow (unless it's statically enforced)... nor _should_ the learning curve be steepened by enforcing awareness of this idiosyncrasy.

The correct mindset from the compiler's perspective should be: "people create variables to use them. What do they want if they didn't specify a value?"

Therefore our mindset can be: "I defined a variable to use. It should be zero, so I don't need to set it."
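To make the zero-default argument concrete: under a zero default, the common accumulator pattern works without any explicit start value, whereas a NaN default forces one. A sketch in Python, with literals standing in for what D would supply implicitly (0 for int, NaN for float):

```python
import math

# With a zero default, an accumulator needs no explicit start value.
total = 0  # what D gives an int implicitly
for x in [1, 2, 5]:
    total += x
assert total == 8

# With a NaN default, the same pattern fails (loudly, in the sense
# that the NaN survives into the result instead of a wrong-but-
# plausible number).
nan_total = float("nan")  # what D gives a float implicitly
for x in [1.0, 2.0, 5.0]:
    nan_total += x
assert math.isnan(nan_total)
```

Which behavior is "correct" is exactly the disagreement in this thread: the zero default is convenient, while the NaN default makes the missing initialization detectable.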
