On 2014-03-31 10:31:00 +0300, Heikki Linnakangas wrote:
> On 03/23/2014 06:07 PM, Greg Stark wrote:
> > On Sun, Mar 23, 2014 at 1:22 PM, Heikki Linnakangas
> > <[email protected]> wrote:
> >> The reason for the typedef is precisely that an enum is not guaranteed
> >> to be one byte. Tom suggested getting rid of the typedef, but it's
> >> needed to make sure it's stored as one byte.
> >>
> >> I'll go add a comment to it, explaining why it's needed.
> >
> > If we're not using the enum type perhaps it would be better to make it a
> > bunch of #defines? The main advantage of enum types is that the debugger
> > knows what values are legal and can decode them for you.
>
> Yeah, I think you're right. Changed it to use #defines, renamed the
> typedef to GinTernaryValue as that's more descriptive, and added a brief
> comment mentioning that a GinTernaryValue is compatible with a boolean.
It doesn't matter much in this case, but in general I find the reasoning here less than convincing. For one, debuggers have a much easier time showing the symbolic name of an enum value (e.g. p WHATEVER_ENUM_VAL) than of plain #defines. Also, it's perfectly possible to use the enum type in switch()es and benefit from compiler warnings about unhandled values; using #defines prohibits that...

Greetings,

Andres Freund

--
Andres Freund                     http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Training & Services
