Hi,

I recently did some analysis of GCC's automatic vectorization and found that a signed char reduction cannot be vectorized in the following code.

---
#define ITERATIONS 1000000

#if defined(do_reduce_signed_char)
#define TYPE signed char
#elif defined(do_reduce_unsigned_char)
#define TYPE unsigned char
#else
#error bad define
#endif

#define SIZE (16384/sizeof(TYPE))

static TYPE x[SIZE] __attribute__ ((aligned (16)));

void obfuscate(void *a, ...);

static void __attribute__((noinline)) do_one(void)
{
    unsigned long i;
    TYPE a = 0;

    obfuscate(x);

    for (i = 0; i < SIZE; i++)
        a += x[i];

    obfuscate(x, a);
}

int main(void)
{
    unsigned long i;

    for (i = 0; i < ITERATIONS; i++)
        do_one();

    return 0;
}
---
If we use the following command line

gcc reduce.c -Ddo_reduce_unsigned_char -Ofast -c -S -fdump-tree-vect-details

we can see that the loop is vectorized when the data type is unsigned char.
If we use the following command

gcc reduce.c -Ddo_reduce_signed_char -Ofast -c -S -fdump-tree-vect-details

we can see that the loop is not vectorized when the data type is signed char.
I found that for signed char, the statement
---
a += x[i];
---
is rewritten into something like the following conversion:
---
a = (signed char) ((unsigned char) x[i] + (unsigned char) a);
---
As a result, the reduction is no longer recognized by the vectorizer.
Can code like the above be vectorized when the data type is signed char?

Thanks,
Lijia He
