On Thursday, 20 November 2014 at 16:34:12 UTC, flamencofantasy wrote:
> My experience is totally the opposite of his. I have been using
> unsigned for lengths, widths, heights for the past 15 years in
> C, C++, C# and more recently in D with great success. I don't
> pretend to be any kind of authority though.
C# doesn't encourage the use of unsigned types and warns that they
are not CLS-compliant; you're going against established practice
there. Signed types for numbers work wonders in C# without any
notable problems, and they make reasoning about code easier because
you don't have to manually check for unsigned conversion bugs
everywhere.
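
To illustrate the kind of conversion bug I mean, here is a minimal
D sketch (not taken from any real code, just an illustration; it
relies on the usual C-style integral promotions that D inherits,
and the names are made up):

import std.stdio;

void main()
{
    int[] data = [10, 20, 30];
    int offset = -1;  // e.g. a "not found" sentinel from a search

    // The signed offset is implicitly converted to size_t for the
    // comparison, so -1 becomes size_t.max and the bounds check
    // silently answers the wrong way.
    if (offset < data.length)
        writeln("in range");
    else
        writeln("out of range");  // this branch runs
}

With a signed length the comparison would behave arithmetically;
you'd still want an offset >= 0 check, but there is no silent
wraparound to reason about.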
> The article you point to is totally flawed and kinda wasteful
> in terms of having to read it; the very first code snippet is
> obviously buggy.
That's the whole point: mixing signed with unsigned is bug-prone.
Worse, it's inevitable if you force unsigned types everywhere.
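
To be concrete (this is not the article's snippet, just a minimal
D sketch of the bug class):

import std.stdio;

void main()
{
    int[] data;  // empty slice; .length is size_t, i.e. unsigned

    // With length == 0, data.length - 1 wraps around to size_t.max,
    // so this innocent-looking loop doesn't run zero times; it keeps
    // going until it trips an out-of-bounds index.
    for (size_t i = 0; i < data.length - 1; ++i)
    {
        writeln(data[i]);
    }
}

If the length were a plain int, length - 1 would just be -1, the
condition would be false and the loop would not run at all.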