https://gcc.gnu.org/bugzilla/show_bug.cgi?id=87917
Richard Biener <rguenth at gcc dot gnu.org> changed:
           What    |Removed                     |Added
----------------------------------------------------------------------------
             Status|UNCONFIRMED                 |NEW
   Last reconfirmed|                            |2018-11-08
                 CC|                            |rguenth at gcc dot gnu.org
     Ever confirmed|0                           |1
--- Comment #1 from Richard Biener <rguenth at gcc dot gnu.org> ---
I think we have some duplicates here.  Dependence analysis' use of
int_cst_value is fishy: it doesn't properly guard any of its inputs against
its internal use of 'int' as the underlying type for dependence distances
(think of overflows, etc.):
static tree
initialize_matrix_A (lambda_matrix A, tree chrec, unsigned index, int mult)
{
  gcc_assert (chrec);

  switch (TREE_CODE (chrec))
    {
    case POLYNOMIAL_CHREC:
      A[index][0] = mult * int_cst_value (CHREC_RIGHT (chrec));
      return initialize_matrix_A (A, CHREC_LEFT (chrec), index + 1, mult);
If CHREC_RIGHT doesn't fit in an 'int', or if multiplying by mult overflows,
we're doomed (wrong-code, etc.).  Making the lambda matrix entries GMP values
or widest_ints might be the (expensive) solution here.
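To illustrate the hazard, here is a standalone C sketch (not GCC code; the
helper matrix_entry is hypothetical) mimicking what the assignment above does
when the chrec constant is wider than 'int':

#include <stdint.h>
#include <stdio.h>

/* Hypothetical stand-in for the A[index][0] computation: truncate a
   (possibly 64-bit) chrec constant to 'int' and scale it by mult.  */
static int
matrix_entry (int64_t chrec_right, int mult)
{
  /* Truncation to 'int' silently changes out-of-range values, and the
     multiplication can wrap on top of that.  */
  return mult * (int) chrec_right;
}

int
main (void)
{
  int64_t dist = 5000000000LL;          /* does not fit in a 32-bit int */
  long long real = dist * 2;            /* the true scaled distance */
  int stored = matrix_entry (dist, 2);  /* what the matrix would hold */

  /* The stored entry bears no relation to the real distance, which is
     exactly the kind of silent wrong-code input described above.  */
  printf ("real: %lld, stored: %d\n", real, stored);
  return 0;
}

On a typical ILP32/LP64 target this prints real: 10000000000, stored:
1410065408, i.e. the matrix entry is garbage rather than an error.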
int_cst_value needs a better name btw...
As said, there are dups.