I've been fiddling around with some MD5 code in coreutils,
and discovered this weirdness with gcc:


#include <stdio.h>

void showsize(unsigned char arg[16])
{
        unsigned char stack[16];
        /* sizeof yields a size_t, so %zu is the matching format */
        fprintf(stderr,"[sizeof(arg) = %zu]\n",sizeof(arg));
        fprintf(stderr,"[sizeof(stack) = %zu]\n",sizeof(stack));
}

int main()
{
        unsigned char m[16];
        showsize(m);
        return 0;
}


The output:

[sizeof(arg) = 4]
[sizeof(stack) = 16]

Can anyone explain why the size of arg differs from the array
on the stack?

Is C treating arg as an unsigned char *, so that sizeof(arg) is
really sizeof(unsigned char *)?


cheers rickw


--
_________________________________
Rick Welykochy || Praxis Services

When choosing between two evils, I always like to take the one I haven't tried 
before.
     -- Mae West

--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
