From: "Dave Carrigan" [EMAIL PROTECTED]
Subject: Re: UIntxx question

"Richard Burmeister" [EMAIL PROTECTED] writes:

And if you pass in something of the wrong size to a function, how will
the function know how to interpret the bits that are on the stack?

Conclusion: check function prototypes in the Palm OS Reference and
always use the correct data types.
Of course, you are right, but you are counting on the compiler to
coerce the 16 bits to 8, which obviously can cause data loss, and you
are assuming your compiler does this coercion, which I never rely on
since I regularly program in 5 or 6 different environments. It's much
safer to
I have a question: what's the difference between a UInt16, 32, or 8? If a
function needs or returns a UInt16, for instance, can I just define it as a
UInt? (i.e.

UInt mode = dmModeReadWrite;
UInt creator = myCrid;
UInt type = 'dataType';
DmOpenDatabaseByTypeCreator(type, creator, mode);
From: [EMAIL PROTECTED]
Subject: UIntxx question
I have a question: what's the difference between a UInt16, 32, or 8? If a
function needs or returns a UInt16, for instance, can I just define it as a
UInt? (i.e.-
UInt32 is 32 bits == 4 bytes
UInt16 is 16 bits == 2 bytes
UInt8 is 8 bits == 1 byte