I've been fiddling around with some MD5 code in coreutils,
and discovered this weirdness with gcc:


#include <stdio.h>

void showsize(unsigned char arg[16])
{
        unsigned char stack[16];
        fprintf(stderr,"[sizeof(arg) = %zu]\n",sizeof(arg));
        fprintf(stderr,"[sizeof(stack) = %zu]\n",sizeof(stack));
}

int main()
{
        unsigned char m[16];
        showsize(m);
        return 0;
}
The output:
[sizeof(arg) = 4]
[sizeof(stack) = 16]

Can anyone explain why the size of arg differs from the array
on the stack?

Is C treating arg as a char* so that sizeof(arg) is sizeof(char*)?


You are right in assuming that arg is being treated as a char*, whereas stack is treated as a genuine block of 16 bytes. It does look odd at first, but there is a rationale: the sizeof operator simply reports the size of the object it is given, and in the case of arg that object is a pointer, so the result is sizeof(unsigned char *). Remember that C passes only values and pointers into functions; it never passes the size of an array along with it. (Incidentally, sizeof yields a size_t, so %zu is the correct printf format for it, not %d.)

You may think that by writing
   void showsize(unsigned char arg[16])
you are supplying the size of the array to the function, but you are
not!  In fact the function is seeing only
    void showsize(unsigned char *arg)
and it is only a strange syntactical laxness in C that permits the first
form as an alternative to the second.

Chris

--
SLUG - Sydney Linux User's Group Mailing List - http://slug.org.au/
Subscription info and FAQs: http://slug.org.au/faq/mailinglists.html
