"Russel Winder" <rus...@russel.org.uk> wrote in message news:mailman.3055.1301814881.4748.digitalmar...@puremagic.com... >On Sat, 2011-04-02 at 18:20 -0500, Andrei Alexandrescu wrote: >> On 4/2/11 5:27 PM, ulrik.mikaels...@gmail.com wrote: >> > A D-newbie would probably be able to guess 0o for octal, but hardly >> > octal!. octal! breaks the rule of least surprise. >> >> I fail to infer how using the word "octal" for an octal literal is >> surprising at all. > >The problem is not that it is a poor solution in isolation, it is the >conflict between 0b... and 0x.. versus octal!... Why is octal being >discriminated against compared to binary and hexadecimal? >
To be perfectly blunt, it's because octal...well, sucks. It's useless compared to binary and hex and just isn't really deserving of a special notation anyway. Even more so since the rare uses of it are possible in D without it being part of the language. Binary and hex, OTOH, are useful (I use hex very frequently, and I would get a lot of use out of binary if I actually had time to do low-level work like I used to).

That said, I wouldn't have a problem with 0o... being used instead (although I'd actually prefer 0c...). But I have a hard time understanding why people are making such a big deal out of something that practically no one ever uses anyway. Consistency is nice, sure, but when it's such a trivial corner case as octal, why even care? This isn't the color of the bikeshed, this is the *shade* of color on the underside of the trim on a window that's one foot off the ground and completely blocked by a big tree.
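
For the record, here's roughly what the library route looks like with Phobos's octal template (std.conv.octal) -- just a minimal sketch, and the permission values below are made-up examples:

    import std.conv : octal;
    import std.stdio : writeln;

    void main()
    {
        // Octal written via the library template instead of a 0... prefix.
        enum filePerm = octal!"644"; // string-literal form
        auto mask = octal!777;       // integer-literal form

        writeln(filePerm); // prints 420 (decimal value of octal 644)
        writeln(mask);     // prints 511 (decimal value of octal 777)
    }

So the handful of real-world octal uses (Unix permission bits, mostly) are covered without any literal syntax at all.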