Ok, I am debugging this further.

It seems like my client sends a data packet to the server.
The server reads it properly.

But when my server sends data back to the client, the client instead reads its 
own previously-sent packet, truncated to the length of the server's reply.

So I am sending from client:
123000

When the server sends any packet back to the client, say "A",
the client reads in "1" instead of "A".

If it sends "ABCD",
the client reads in "1230".

Is there something I should know about ChannelHandlerContext?
Is it possible for ChannelHandlerContext.write(ByteBuf) to be corrupted by 
input somehow?
I do not write to the ChannelHandlerContext anywhere else.

The whole system seems like it runs fine except for when I send in one 
particular place.
If I send the same packet twice, it echoes just once and then resumes sending 
normally. So I could just give up, call it an anomaly, send the packet twice, 
and my program would work.

But I want to track this down while I can so it doesn't pop up elsewhere.

What could be causing an echo?  An echo that only happens when I call .write() 
with different data, and only up to the length of that data?

-- 
You received this message because you are subscribed to the Google Groups 
"Netty discussions" group.