From: "Dr. Vesselin Bontchev" <[EMAIL PROTECTED]>
> Folks, note that all this still doesn't answer Scott's original question.
> Even if he's computing the string size incorrectly, why does his program
> behave differently under the simulator and on the real device? It ought to
> either fail in both cases or work correctly in both cases.

Not if the miscalculation is used as the basis for a string
operation.  If, for example, that length were used in a
StrNCopy or MemCopy and produced a non-terminated
string, the behaviour of any code that used that string would depend
on what happened to follow it in memory.  The memory allocator
might return clean memory on the Simulator and leftover
garbage on a device (although I'm less confident about this
with the Simulator than I was with the Emulator, because the
Emulator was known to allocate memory with
certain characteristics, while the Simulator is, in theory,
running the same OS code as the device).

Still, when it comes to how an app behaves when the wrong
length is given for a buffer, it's easy for the answer to depend
on trivia such as whether the device was reset before the test,
whether the ROM used in the simulator is *exactly* the same
version as the code on the device, and whether the protocol behaves
exactly the same (this is, after all, a hang that follows from giving
the OS the wrong length for the data being sent, and protocol settings
might determine whether the connection survives this or gets into
an invalid state).  My bet is that it would be possible to identify
the source of the difference, but that it might take weeks of work,
some seriously low-level debugging, and disassembly of the
NetLib code on both the device and the Simulator.

Chris Tutty


-- 
For information on using the Palm Developer Forums, or to unsubscribe, please 
see http://www.palmos.com/dev/support/forums/
