I think I understand you; let me clarify this with an example:
When using "LengthPrefixedBinaryTCPClientImpl" as the "TCPClient classname",
the input in "Text to send" should be a hex-encoded string.
i.e. if you want to send a 1-byte message 'A' to the server, the "Text to send"
should be "41" (the hex representation of 'A').
What JMeter does is:
1. Calculate the actual length of the content in "Text to send", which is 2/2 = 1
('A' occupies 1 byte, but its representation "41" occupies 2 characters, so 2/2).
2. Write that length value (here, 1) into 2 bytes and send it to your server
(assuming you are using the default, i.e. tcp.binarylength.prefix.length=2 in
jmeter.properties).
3. Send the 1-byte message body to your server.
That's all.
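To illustrate the three steps above, here is a minimal sketch in Java of how such a frame could be built. This is not JMeter's actual source; the class and method names (LengthPrefixedFrame, frame) are made up for illustration, and it assumes the default 2-byte, big-endian length prefix:

```java
import java.io.ByteArrayOutputStream;

// Hypothetical sketch of length-prefixed framing, NOT JMeter's real code.
public class LengthPrefixedFrame {
    // hexText: the hex-encoded "Text to send", e.g. "41" for one byte 'A'
    // prefixLen: number of prefix bytes (tcp.binarylength.prefix.length, default 2)
    static byte[] frame(String hexText, int prefixLen) {
        int bodyLen = hexText.length() / 2;             // step 1: 2 hex chars = 1 byte
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        for (int i = prefixLen - 1; i >= 0; i--) {      // step 2: big-endian length prefix
            out.write((bodyLen >> (8 * i)) & 0xFF);
        }
        for (int i = 0; i < hexText.length(); i += 2) { // step 3: decoded message body
            out.write(Integer.parseInt(hexText.substring(i, i + 2), 16));
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] f = frame("41", 2);
        System.out.println(f.length);                    // 3: 2-byte prefix + 1-byte body
        System.out.printf("%02X %02X %02X%n", f[0], f[1], f[2]); // 00 01 41
    }
}
```

So for "41" the wire bytes are 00 01 41: a 2-byte length of 1, followed by the single body byte. Note that even though prefix and body form one logical message, TCP gives no guarantee they arrive in a single recv call on the server side.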




At 2014-06-13 00:14:26, "-Vinoth raj" <[email protected]> wrote:
>Hi All,
>
>I am trying to send a message to my server application developed in C++
>using LengthPrefixedBinaryTCPClientImpl class.
>
>On the server side I get the length and data in two recv (socket function)
>calls.
>I presume that normally this is not the case as the first two bytes of the
>data itself should contain the data length.
>
>EXAMPLE:
>For example, if I send 131 bytes data Jmeter should send 133 bytes with the
>first two bytes containing the data length.
>On enabling debug mode for the TCP sampler I found that the actual data
>sent is logged as 131.
>
>Also, strangely if I run both Jmeter and the server application on same
>machine I see the expected behaviour. I actually get 133 bytes in a single
>recv call.
>
>I would appreciate it if anyone can shed light on the actual behaviour of
>JMeter.
>
>Vinoth