> i suspect the rationale was that, finally, C provided a way
> outside the preprocessor to give symbolic names to constants.
> why restrict that to int?
Because enums have been ints since their inception?
I'm sympathetic to the underlying need, but making a fundamental
type of the language suddenly become variable-sized does not seem
to be the right way to go about this.
E.g., what is the type of:
enum {
        a = 1,
        b = 2.44000000000000000000618549L,
        c = 2.44F,
        d = "this is weird",
        e = 1LL<<62,
} foo;
How on earth do you switch() on it? And what's its sizeof()?
--lyndon