> So what is supposed to be wrong with using a manifest constant
> instead of hard-coding "8" in various places?  As I recall,
> The Elements of Programming Style recommended this approach.

i see two problems with this sort of indirection.  if i see NBBY
in the code, i have to look up its value.  NBBY doesn't mean anything
to me.  this layer of mental gymnastics makes the code harder
to read and understand.  on the other hand, 8 means something to me.

more importantly, it implies that the code would work with NBBY
of 10 or 12.  (the c standard says it can't be < 8; §5.2.4.2.1.)
i'd bet there are many things in the code that depend on the size
of a byte that don't reference NBBY.

so this define goes 0 fer 2.  it can't be changed and it is not informative.

> Similar definitions have been in Unix system headers for
> decades.  CHAR_BIT is defined in <limits.h>. (Yes, I know
> there is a difference between a char and a byte.  Less well
> known, there is a difference between a byte and an octet.)

this might not be the right place to defend a practice by saying
"unix systems have been doing it for years."

- erik
