Alright, my first foray into using the serial port is driving me insane.  I
am being told by SerReceiveCheck() that I have data waiting in the receive
queue, but when I call SerReceive() to get it, I get 0 bytes back and
serErrTimeOut.
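
In case it helps, here is roughly the sequence I am using (error handling
trimmed, and portRefNum / rcvBuf are just placeholder names for my own
variables):

    /* Roughly what I am doing -- error handling trimmed; portRefNum
     * and rcvBuf are placeholders for my own globals. */
    UInt32 bytesPending = 0;
    UInt32 bytesRead;
    Err    err;

    err = SerReceiveCheck(portRefNum, &bytesPending);
    if (err == 0 && bytesPending > 0)
    {
        /* Ask only for the bytes SerReceiveCheck says are queued. */
        bytesRead = SerReceive(portRefNum, rcvBuf, bytesPending, 0, &err);
        /* bytesRead comes back 0 and err == serErrTimeOut. */
    }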

One thing I am curious about is the timeout parameter to SerReceive().  It
is documented as "Interbyte timeout in ticks, 0 for none, -1 forever".  What
is the difference between no timeout and forever?  I have tried all kinds of
values here, and always get the same result.  The example I am working from
uses 0.
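
For reference, this is the SerReceive() prototype as I read it from the SDK
header (from memory, so the exact type names may not match my SDK version):

    UInt32 SerReceive(UInt16 refNum, void *rcvBufP, UInt32 count,
                      Int32 timeout, Err *errP);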

I am clearly not understanding something.  If I am only asking SerReceive()
for the number of bytes already known to be in the receive queue, when does
a timeout come into play?  Even with a timeout, shouldn't I receive at least
some number of characters from the queue?

Thanks for any advice :)

-- 
Michael Pearce <[EMAIL PROTECTED]>                           +1 314 386 0663
Coreth Consulting, Inc.                              St. Louis, Missouri, USA
