Hello All!
This is my first posting to [email protected]!
I have a StarTech '1 Port RS232 Serial over IP Ethernet Device Server,'
StarTech ID: NETRS232_1... Problem scenario (long!) and question coming up...
I have a laboratory instrument that sends data via an RS-232 serial
communication port, and that data must be received by a Laboratory Information
System (LIS) Web application on our Ethernet TCP/IP network with its own IP
address ...
In my testing on a TCP/IP connection I use Mina's NioSocketConnector,
SocketSessionConfig and InetSocketAddress. I have a test app simulating our lab
instrument on my Mac on the Ethernet that sends "frames" of data varying in
size from 20 to 240 bytes directly to our LIS on the Ethernet. The LIS
receives everything, no problem. But ...
I have a similar test app on a PC simulating our lab machine using Mina's
SerialConnector, SerialSessionConfig and SerialAddress. It sends "frames" of
data in varying size from 20 to 240 bytes through the COM1 serial
communications port, through the StarTech device and on to our LIS on the
Ethernet ...
This time, if the "frames" are longer than 55 bytes they appear to be broken
up into "chunks" of at most 55 bytes, received as such by the LIS, which is a
problem. The only differences between the TCP/IP communication and the serial
communication are:
a) The test apps slightly differ. One makes a TCP socket connection with the
LIS (no data problems) and the other makes a Serial Communications connection
(data problems)
b) The NETRS232_1 StarTech device.
I have debugged (a) and so far cannot determine what the problem is. I
contacted StarTech tech support and we have eliminated their device as the
cause.
I do believe buffering is the problem. Mina's socket connection is not a
problem, so it has to be Mina's serial communications connection. The buffer
sizes I am using should be more than ample. I have the IoBuffer set to auto
expand. Here is some example code:
    final IoConnector connector = new SerialConnector();
    final SerialSessionConfig serialSessionConfig =
            (SerialSessionConfig) connector.getSessionConfig();
    serialSessionConfig.setOutputBufferSize( 4096 );
    serialSessionConfig.setReadBufferSize( 4096 );
    serialSessionConfig.setMinReadBufferSize( 4096 );
    serialSessionConfig.setIdleTime( IdleStatus.BOTH_IDLE, 10 ); // 10 seconds
    serialSessionConfig.setInputBufferSize( 4096 );
    connector.setConnectTimeoutMillis( TimeUnit.SECONDS.toMillis( 30L ) ); // 30 seconds
    connector.setHandler( myIoHandler );

    final SerialAddress portAddress = new SerialAddress(
            "COM1",
            9600, // baud rate
            SerialAddress.DataBits.DATABITS_8,
            SerialAddress.StopBits.BITS_1,
            SerialAddress.Parity.NONE,
            SerialAddress.FlowControl.NONE );

    final ConnectFuture connectFuture = connector.connect( portAddress );
    connectFuture.awaitUninterruptibly();
    final IoSession ioSession = connectFuture.getSession();
    ioSession.getConfig().setUseReadOperation( true );
Here is where the IoBuffer is allocated in my ProtocolEncoderAdapter:
    final IoBuffer ioBuffer = IoBuffer.allocate( frameBytes.length, false ); // Gets a heap buffer
    ioBuffer.setAutoExpand( true );
If a "frame" of 21 bytes is sent, 21 are received, no problem.
If a "frame" of 47 bytes is sent, 47 are received, no problem.
If a "frame" of 60 bytes is sent, 55 are received in one "chunk" and then the
remaining 5 bytes are received in a second "chunk."
If a "frame" of 106 bytes is sent, 55 are received in one "chunk" and then the
remaining 51 bytes are received in a second "chunk."
Basically, any frame longer than 55 bytes is being split up.
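In case it helps show what I mean by the "chunks," here is a self-contained
sketch (plain Java, no Mina involved, and assuming a hypothetical
carriage-return frame terminator, which is not necessarily our real protocol)
of how a receiver could stitch arbitrarily split chunks back into whole frames:

```java
import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Reassembles a byte stream into frames delimited by a terminator byte.
// The 0x0D (CR) terminator is only an assumption for illustration.
public class FrameReassembler {
    private static final byte TERMINATOR = 0x0D;
    private final ByteArrayOutputStream pending = new ByteArrayOutputStream();

    // Feed one received "chunk"; returns any complete frames seen so far.
    public List<byte[]> feed(byte[] chunk) {
        List<byte[]> frames = new ArrayList<>();
        for (byte b : chunk) {
            if (b == TERMINATOR) {
                frames.add(pending.toByteArray());
                pending.reset();
            } else {
                pending.write(b);
            }
        }
        return frames;
    }

    public static void main(String[] args) {
        FrameReassembler r = new FrameReassembler();
        // Simulate a 60-byte frame arriving as a 55-byte chunk then 5 bytes,
        // exactly the split I am seeing through the serial path.
        byte[] frame = new byte[60];
        for (int i = 0; i < 59; i++) frame[i] = (byte) ('A' + (i % 26));
        frame[59] = TERMINATOR;
        byte[] chunk1 = Arrays.copyOfRange(frame, 0, 55);
        byte[] chunk2 = Arrays.copyOfRange(frame, 55, 60);
        System.out.println("after chunk 1: " + r.feed(chunk1).size() + " frame(s)");
        List<byte[]> done = r.feed(chunk2);
        System.out.println("after chunk 2: " + done.size() + " frame(s), "
                + done.get(0).length + " payload bytes");
    }
}
```

Feeding the 55-byte chunk followed by the 5-byte remainder yields the original
frame payload intact, so I am wondering whether something along these lines
belongs in a ProtocolDecoder on the receiving side rather than relying on the
buffer sizes.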
Any ideas on what I should look for in code? Is my example code above wrong
somewhere?
Thank you in advance for any help!
Garry Archer
Systems Programmer
Department of Pathology
Yale School of Medicine