Hi Vladyslav,

oh ... a custom driver. In that case it will definitely be tricky to help you 
unless we can have a look at the code.

Is this something you consider bringing into the PLC4X project, or something 
that's meant to stay outside of the project?

I guess this is the first time such a question has come up ;-)

With integrations, I was referring to the Camel, Kafka, Edgent, NiFi, ... 
integrations that PLC4X provides. But I guess you've already answered that 
question, and you're not using any of them.

The connection pool does a little more. Before returning a connection it checks 
whether it's still alive, and if it's not, it creates a new one.
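
Roughly, the usage looks like this (just a sketch from memory, so please 
double-check the exact class and package names for 0.7.0 and use your own 
connection string):

    import org.apache.plc4x.java.api.PlcConnection;
    import org.apache.plc4x.java.utils.connectionpool.PooledPlcDriverManager;

    public class PooledExample {
        public static void main(String[] args) throws Exception {
            // One pooled manager for the lifetime of the application.
            PooledPlcDriverManager driverManager = new PooledPlcDriverManager();

            // Borrow a connection per unit of work; the pool checks that it is
            // still alive (and replaces a dead one) before handing it out, and
            // close() returns it to the pool instead of tearing it down.
            try (PlcConnection connection = driverManager.getConnection("your-driver://host")) {
                // ... build and execute read/write requests here ...
            }
        }
    }

So a connection that was reset by the peer should get replaced transparently 
the next time you borrow one from the pool.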

Chris



On 16.09.20, 17:39, "Vladyslav Milutin" <v.milu...@aegas.io> wrote:

    Hi Christofer,

    Thanks for your quick response.
    I'm using a custom driver which extends GeneratedDriverBase; to connect I
    simply call .connect (see the sketch below). I know that you have the
    PooledDriverManager, but wouldn't it have the same issue if the connection
    was reset by peer, since it just looks up the specific connection?
    As for integrations: plc4j-transport-tcp, plc4j-api, plc4j-spi,
    plc4j-connection-pool, and other code-generation and build utilities. Or by
    integrations do you mean frameworks? If so, the Spring Framework.
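
    For reference, the connection setup is essentially this (simplified; the
    driver variable, logger and the connection URL are placeholders):

        import org.apache.plc4x.java.api.PlcConnection;
        import org.apache.plc4x.java.api.exceptions.PlcConnectionException;

        // customDriver extends GeneratedDriverBase and is registered under a
        // custom URL scheme; connecting is a single explicit call to connect().
        try {
            PlcConnection connection = customDriver.getConnection("custom:tcp://192.168.0.1:102");
            connection.connect();
        } catch (PlcConnectionException e) {
            log.error("Failed to connect", e);
        }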

    Kind regards,
    Vlad

    On Wed, 16 Sep 2020 at 17:28, Christofer Dutz <christofer.d...@c-ware.de> wrote:

    > Hi Vladyslav,
    >
    > could you please tell us which driver and which version you are using?
    > Also, it would be interesting to know whether you are using any integration modules.
    >
    > Chris
    >
    > On 16.09.20, 14:36, "Vladyslav Milutin" <v.milu...@aegas.io> wrote:
    >
    >     Hello guys,
    >
    >     I'm writing to you in the hope that you can help me with exception
    >     handling. Currently, after the connection has been open for a long
    >     time, it can be reset by peer. See the stacktrace below.
    >
    >     I've tried to add a custom ChannelHandler which overrides
    >     exceptionCaught() and to add it in Driver#initializePipeline() (see the
    >     code below). I've also tried adding it to the channel that can be
    >     obtained from DefaultNettyPlcConnection. In neither case was the
    >     handler actually added to the pipeline where this exception was thrown.
    >
    >     PLC4X version: 0.7.0
    >
    >     Stacktrace:
    >     2020-09-16 13:50:03.340 WARN  [nioEventLoopGroup-58-1]
    >     [io.netty.channel.DefaultChannelPipeline] onUnhandledInboundException -
    >     An exceptionCaught() event was fired, and it reached at the tail of the
    >     pipeline. It usually means the last handler in the pipeline did not
    >     handle the exception.
    >     java.io.IOException: Connection reset by peer
    >       at java.base/sun.nio.ch.FileDispatcherImpl.read0(Native Method)
    >       at java.base/sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39)
    >       at java.base/sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:276)
    >       at java.base/sun.nio.ch.IOUtil.read(IOUtil.java:233)
    >       at java.base/sun.nio.ch.IOUtil.read(IOUtil.java:223)
    >       at java.base/sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:358)
    >       at io.netty.buffer.PooledByteBuf.setBytes(PooledByteBuf.java:253)
    >       at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:1133)
    >       at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:350)
    >       at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:148)
    >       at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
    >       at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
    >       at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
    >       at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
    >       at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
    >       at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
    >       at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    >       at java.base/java.lang.Thread.run(Thread.java:834)
    >
    >     Driver#initializePipeline:
    >             try {
    >                 final Channel channel = channelFactory.createChannel(this.handler);
    >                 channelFactory.initializePipeline(channel.pipeline());
    >             } catch (PlcConnectionException e) {
    >                 log.error("Failed to create channel");
    >             }
    >
    >     ChannelHandler:
    >         @Override
    >         public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
    >             log.warn("ExceptionCaught in worker: ctx = [{}], cause = [{}, {}], workerName = [{}]",
    >                     ctx, cause.getClass(), cause.getMessage(), workerName);
    >             if (cause instanceof ConnectTimeoutException) {
    >                 log.warn("ConnectionTimeout caught: workerName = [{}]", workerName);
    >             }
    >             if ((cause instanceof IOException) &&
    >                     cause.getMessage().contains("Connection reset by peer")) {
    >                 log.warn("Connection reset by peer caught: workerName = [{}]", workerName);
    >             } else {
    >                 log.info("Unexpected exception caught: workerName = [{}]", workerName);
    >             }
    >
    >             this.callback.accept(cause);
    >         }
    >
    >     DefaultNettyPlcConnection#channel:
    >             log.info("Trying to get connection channel: worker name = [{}]", this.workerName);
    >             final Channel channel = ((DefaultNettyPlcConnection) this.connection).getChannel();
    >             log.info("Channel obtained successfully. Adding custom channelHandler to it: channel = [{}], workerName = [{}]",
    >                     channel, this.workerName);
    >             channel.pipeline().addLast(this.channelHandler);
    >             log.info("ChannelHandler added: channel = [{}], channelHandler = [{}], workerName = [{}]",
    >                     channel, this.channelHandler, this.workerName);
    >
    >     Kind regards,
    >     Vlad
    >
    >
