https://gcc.gnu.org/bugzilla/show_bug.cgi?id=96921

Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |jakub at gcc dot gnu.org

--- Comment #1 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
On:
_Bool
foo (_Bool a, _Bool b)
{
  int c = 1 - a;
  int d = 1 - b;
  int e = c & d;
  return 1 - e;
}

_Bool
bar (_Bool a, _Bool b)
{
  int c = 1 - a;
  int d = 1 - b;
  _Bool e = c & d;
  return 1 - e;
}

_Bool
baz (_Bool a, _Bool b)
{
  _Bool c = 1 - a;
  _Bool d = 1 - b;
  _Bool e = c & d;
  return 1 - e;
}
we are able to optimize just baz.  Rather than duplicating the bit_not vs.
bit_and/bit_ior etc. simplifications, I wonder if it wouldn't be better to
perform type demotion here: see that 1 - x is used in a boolean context and
that x is ssa_name_has_boolean_range, and turn it into a bit_not on _Bool.
Similarly for a bit_and/ior/xor whose result is used in a boolean context and
where both operands are ssa_name_has_boolean_range.  Thoughts on that?
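
For illustration, a minimal sketch of the expected result of such demotion
(the foo_demoted name and the folded form below are an assumption about the
outcome, not current compiler output): with a and b boolean, 1 - a is !a, so
after demotion the existing De Morgan simplification ~(~a & ~b) -> a | b
would reduce all three functions to:

_Bool
foo_demoted (_Bool a, _Bool b)
{
  /* c = 1 - a and d = 1 - b demote to bit_not on _Bool, e = c & d
     stays a bit_and on _Bool, 1 - e demotes to bit_not again, and
     ~(~a & ~b) then folds to a | b.  */
  return a | b;
}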
