https://gcc.gnu.org/bugzilla/show_bug.cgi?id=115154

--- Comment #16 from GCC Commits <cvs-commit at gcc dot gnu.org> ---
The trunk branch has been updated by Andrew Pinski <pins...@gcc.gnu.org>:

https://gcc.gnu.org/g:49c87d22535ac4f8aacf088b3f462861c26cacb4

commit r15-755-g49c87d22535ac4f8aacf088b3f462861c26cacb4
Author: Andrew Pinski <quic_apin...@quicinc.com>
Date:   Mon May 20 00:16:40 2024 -0700

    match: Disable `(type)zero_one_valuep*CST` for 1bit signed types [PR115154]

    The problem here is the pattern added in r13-1162-g9991d84d2a8435
    assumes that it is well defined to multiply zero_one_valued_p by the
    truncated converted integer constant. It is well defined for all types
    except for signed 1-bit types, where `a * -1` is produced, which is
    undefined. So disable this pattern for 1-bit signed types.

    Note the pattern added in r14-3432-gddd64a6ec3b38e is able to work
    around the undefinedness except when `-fsanitize=undefined` is turned
    on, which is why I added a testcase for that.

    Bootstrapped and tested on x86_64-linux-gnu with no regressions.

            PR tree-optimization/115154

    gcc/ChangeLog:

            * match.pd (convert (mult zero_one_valued_p@1 INTEGER_CST@2)):
            Disable for 1-bit signed types.

    gcc/testsuite/ChangeLog:

            * c-c++-common/ubsan/signed1bitfield-1.c: New test.
            * gcc.c-torture/execute/signed1bitfield-1.c: New test.

    Signed-off-by: Andrew Pinski <quic_apin...@quicinc.com>
