When Enum was being designed, one of the questions considered was where
to start autonumbering: zero or one.

As I remember the discussion, we chose not to start with zero because we
didn't want an enum member to be False by default, and having a member
with the value 0 evaluate as True seemed discordant. So the functional
API starts with 1 unless overridden. In fact, according to the Enum
docs:

    The reason for defaulting to ``1`` as the starting number and
    not ``0`` is that ``0`` is ``False`` in a boolean sense, but
    enum members all evaluate to ``True``.
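
To make that concrete, here is a quick sketch (the Color names are just
for illustration):

    from enum import Enum

    # The functional API autonumbers from 1 by default ...
    Color = Enum('Color', 'RED GREEN BLUE')
    print(Color.RED.value)    # 1
    print(bool(Color.RED))    # True -- pure Enum members are always truthy

    # ... unless overridden with the ``start`` keyword.
    Color0 = Enum('Color0', 'RED GREEN BLUE', start=0)
    print(Color0.RED.value)   # 0
    print(bool(Color0.RED))   # still True, even though the value is 0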

However, if the Enum is combined with some other type (str, int, float,
etc.), then most behaviour is determined by that type -- including
boolean evaluation. So the empty string, a value of 0, etc., will cause
such an Enum member to evaluate as False.
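
For example (illustrative class names only):

    from enum import Enum, IntEnum

    class Answer(IntEnum):
        NO = 0
        YES = 1

    class Label(str, Enum):
        EMPTY = ''
        FULL = 'full'

    print(bool(Answer.NO))     # False -- int's __bool__ is used
    print(bool(Answer.YES))    # True
    print(bool(Label.EMPTY))   # False -- str's __bool__ is used
    print(bool(Label.FULL))    # True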

So the question now is: for a standard Enum (meaning no other type
besides Enum is involved), should __bool__ look to the value of the
Enum member to determine True/False, or should members always be True
by default, with the Enum creator adding their own __bool__ if they
want something different?
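
Note that the second option is already possible today; a minimal sketch
of what an Enum creator can do (class name made up):

    from enum import Enum

    class Switch(Enum):
        OFF = 0
        ON = 1

        def __bool__(self):
            # Defer to the member's value instead of the
            # "always True" default.
            return bool(self.value)

    print(bool(Switch.OFF))   # False
    print(bool(Switch.ON))    # True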

On the one hand we have backwards compatibility, which means any change
would take at least a version to roll out.

On the other hand we have a pretty basic difference in how zero/empty is
handled between "pure" Enums and "mixed" Enums.
On the gripping hand we have . . .

Please respond with your thoughts on changing pure Enums to match mixed
Enums, or with any experience you have had:

- relying on the "always True" behaviour;
- implementing your own __bool__ to match the standard True/False
  meanings; or
- implementing your own __bool__ to match some other scheme entirely.

--
~Ethan~