brampurnot commented on PR #4562:
URL: https://github.com/apache/eventmesh/pull/4562#issuecomment-1840450150
I'm not 100% sure what the problem is, but the connector basically stops without producing any more logs when I run the JAR from the dist/plugin/connectors folder. When I compile it from scratch and run it directly from within VS Code, it works fine. I still need to figure out what the problem is, since I want to run it in a container.

Now that the connector is up and running, I get an NPE when I try to post a CloudEvent over the HTTP channel. This is what I see in the EventMesh server:
```
2023-12-05 10:56:10,417 INFO [eventMesh-tcp-worker-3]
message(AbstractTCPServer.java:358) -
pkg|c2eventMesh|cmd=HEARTBEAT_REQUEST|pkg=org.apache.eventmesh.common.protocol.tcp.Package@200eb500|user=UserAgent{env='PRD',
subsystem='5034', group='slackSink', path='/', pid=44210, host='127.0.0.1',
port=52420, version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,418 DEBUG [eventMesh-tcp-worker-3] Codec(Codec.java:62)
- Encoder
pkg={"header":{"cmd":"HEARTBEAT_RESPONSE","code":0,"desc":"success","seq":"6452883240","properties":{},"command":"HEARTBEAT_RESPONSE"}}
2023-12-05 10:56:10,419 INFO [eventMesh-tcp-worker-3]
message(Utils.java:126) -
pkg|eventMesh2c|cmd=HEARTBEAT_RESPONSE|pkg=org.apache.eventmesh.common.protocol.tcp.Package@2473a30e|user=UserAgent{env='PRD',
subsystem='5034', group='slackSink', path='/', pid=44210, host='127.0.0.1',
port=52420, version='2.0', idc='FT', purpose='sub', unack='0'}|wait=1ms|cost=3ms
2023-12-05 10:56:10,426 WARN [eventMesh-tcpNettyNio-Boss-1]
ServerBootstrap(AbstractBootstrap.java:464) - Unknown channel option
'SO_TIMEOUT' for channel '[id: 0xff22decc, L:/127.0.0.1:10000 -
R:/127.0.0.1:52462]'
2023-12-05 10:56:10,429 INFO [eventMesh-tcp-worker-6]
AbstractTCPServer(AbstractTCPServer.java:407) -
client|tcp|channelRegistered|remoteAddress=127.0.0.1:52462|msg=
2023-12-05 10:56:10,430 INFO [eventMesh-tcp-worker-6]
AbstractTCPServer(AbstractTCPServer.java:421) -
client|tcp|channelActive|remoteAddress=127.0.0.1:52462|msg=
2023-12-05 10:56:10,431 DEBUG [eventMesh-tcp-worker-6]
Codec(LogUtils.java:90) - Decode
headerJson={"cmd":"HELLO_REQUEST","code":0,"seq":"8583524761","properties":{},"command":"HELLO_REQUEST"}
2023-12-05 10:56:10,432 DEBUG [eventMesh-tcp-worker-6]
Codec(LogUtils.java:90) - Decode
bodyJson={"env":"PRD","subsystem":"5034","path":"/","pid":44210,"host":"localhost","port":8362,"version":"2.0","username":"slackSinkUser","password":"slackPassWord","idc":"FT","group":"slackSink","purpose":"sub","unack":0}
2023-12-05 10:56:10,433 INFO [eventMesh-tcp-worker-6]
message(LogUtils.java:130) -
pkg|c2eventMesh|cmd=HELLO_REQUEST|pkg=org.apache.eventmesh.common.protocol.tcp.Package@3308da4f
2023-12-05 10:56:10,433 INFO [eventMesh-tcp-task-handle-5]
ClientSessionGroupMapping(ClientSessionGroupMapping.java:112) - createSession
client[127.0.0.1:52462]
2023-12-05 10:56:10,434 INFO [eventMesh-tcp-task-handle-5]
sessionLogger(ClientSessionGroupMapping.java:116) -
session|open|succeed|user=UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=44210, host='127.0.0.1', port=52462,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,435 DEBUG [eventMesh-tcp-worker-6] Codec(Codec.java:62)
- Encoder
pkg={"header":{"cmd":"HELLO_RESPONSE","code":0,"desc":"success","seq":"8583524761","properties":{},"command":"HELLO_RESPONSE"}}
2023-12-05 10:56:10,436 INFO [eventMesh-tcp-worker-6]
message(Utils.java:126) -
pkg|eventMesh2c|cmd=HELLO_RESPONSE|pkg=org.apache.eventmesh.common.protocol.tcp.Package@5e1bc345|user=UserAgent{env='PRD',
subsystem='5034', group='slackSink', path='/', pid=44210, host='127.0.0.1',
port=52462, version='2.0', idc='FT', purpose='sub', unack='0'}|wait=1ms|cost=4ms
2023-12-05 10:56:10,440 DEBUG [eventMesh-tcp-worker-6]
Codec(LogUtils.java:90) - Decode
headerJson={"cmd":"SUBSCRIBE_REQUEST","code":0,"seq":"4607853627","properties":{},"command":"SUBSCRIBE_REQUEST"}
2023-12-05 10:56:10,442 DEBUG [eventMesh-tcp-worker-6]
Codec(LogUtils.java:90) - Decode
bodyJson={"topicList":[{"topic":"TEST-TOPIC-SLACK","mode":"CLUSTERING","type":"ASYNC"}]}
2023-12-05 10:56:10,444 INFO [eventMesh-tcp-worker-6]
message(AbstractTCPServer.java:358) -
pkg|c2eventMesh|cmd=SUBSCRIBE_REQUEST|pkg=org.apache.eventmesh.common.protocol.tcp.Package@2f10e82|user=UserAgent{env='PRD',
subsystem='5034', group='slackSink', path='/', pid=44210, host='127.0.0.1',
port=52462, version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,446 DEBUG [StandaloneConsumerThread-9]
Subscribe(Subscribe.java:55) - execute subscribe task, topic: TEST-TOPIC-SLACK,
offset: null
2023-12-05 10:56:10,446 INFO [eventMesh-tcp-task-handle-6]
ClientGroupWrapper(LogUtils.java:136) - Cache session success, group:slackSink
topic:TEST-TOPIC-SLACK client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=44210, host='127.0.0.1', port=52462,
version='2.0', idc='FT', purpose='sub', unack='0'}
sessionId:13ca23fc-7a5b-44bc-9cec-eb8d8b171c98
2023-12-05 10:56:10,446 INFO [eventMesh-tcp-task-handle-6]
subscribeLogger(Session.java:140) -
subscribe|succeed|topic=TEST-TOPIC-SLACK|user=UserAgent{env='PRD',
subsystem='5034', group='slackSink', path='/', pid=44210, host='127.0.0.1',
port=52462, version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,446 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=44210, host='127.0.0.1', port=52443,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,446 INFO [eventMesh-tcp-task-handle-6]
SubscribeProcessor(LogUtils.java:130) - SubscribeTask
succeed|user=UserAgent{env='PRD', subsystem='5034', group='slackSink',
path='/', pid=44210, host='127.0.0.1', port=52462, version='2.0', idc='FT',
purpose='sub', unack='0'}|topics=[SubscriptionItem{topic=TEST-TOPIC-SLACK,
mode=CLUSTERING, type=ASYNC}]
2023-12-05 10:56:10,446 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=44210, host='127.0.0.1', port=52421,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,446 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=43436, host='127.0.0.1', port=52339,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,446 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=43642, host='127.0.0.1', port=52392,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,446 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=43436, host='127.0.0.1', port=52318,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,447 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=43436, host='127.0.0.1', port=52304,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,447 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=43436, host='127.0.0.1', port=52266,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,447 WARN [StandaloneConsumerThread-9]
Session(Session.java:282) - session is not available because session has been
closed,topic:TEST-TOPIC-SLACK,client:UserAgent{env='PRD', subsystem='5034',
group='slackSink', path='/', pid=43642, host='127.0.0.1', port=52374,
version='2.0', idc='FT', purpose='sub', unack='0'}
2023-12-05 10:56:10,447 WARN [StandaloneConsumerThread-9]
FreePriorityDispatchStrategy(LogUtils.java:152) - all sessions can't downstream
msg
2023-12-05 10:56:10,447 DEBUG [eventMesh-tcp-worker-6] Codec(Codec.java:62)
- Encoder
pkg={"header":{"cmd":"SUBSCRIBE_RESPONSE","code":0,"desc":"success","seq":"4607853627","properties":{},"command":"SUBSCRIBE_RESPONSE"}}
2023-12-05 10:56:10,447 WARN [StandaloneConsumerThread-9]
ClientGroupWrapper(ClientGroupWrapper.java:515) - handle msg exception when no
session found
java.lang.NullPointerException: null
at java.util.Objects.requireNonNull(Objects.java:203) ~[?:1.8.0_392]
at
org.apache.eventmesh.runtime.core.protocol.tcp.client.group.ClientGroupWrapper.lambda$initClientGroupPersistentConsumer$0(ClientGroupWrapper.java:483)
~[eventmesh-runtime-1.9.0-release.jar:1.9.0-release]
at
org.apache.eventmesh.storage.standalone.broker.task.Subscribe.subscribe(Subscribe.java:90)
[eventmesh-storage-standalone-1.9.0-release.jar:1.9.0-release]
at
org.apache.eventmesh.storage.standalone.broker.task.SubscribeTask.run(SubscribeTask.java:38)
[eventmesh-storage-standalone-1.9.0-release.jar:1.9.0-release]
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[?:1.8.0_392]
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[?:1.8.0_392]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_392]
```
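For what it's worth, the server-side trace points at `Objects.requireNonNull` inside the consumer lambda in `ClientGroupWrapper`, right after the "all sessions can't downstream msg" warning. My guess (not verified against the EventMesh source) is that the dispatch strategy finds no available session for the group, so the lambda ends up calling `requireNonNull` on a null session. A minimal stdlib-only sketch of that failure shape, with hypothetical class and method names of my own:

```java
import java.util.List;
import java.util.Objects;
import java.util.Optional;

public class DownstreamSketch {

    // Hypothetical stand-in for a TCP client session; not EventMesh's real class.
    static class Session {
        final boolean available;
        Session(boolean available) { this.available = available; }
    }

    // Mimics a dispatch strategy: pick the first available session, or null if none.
    static Session select(List<Session> sessions) {
        return sessions.stream().filter(s -> s.available).findFirst().orElse(null);
    }

    public static void main(String[] args) {
        // All sessions closed, like the "session is not available" warnings above.
        List<Session> sessions = List.of(new Session(false), new Session(false));
        Session chosen = select(sessions); // -> null

        try {
            // The failure shape in the log: requireNonNull on a null session.
            Objects.requireNonNull(chosen, "no available session");
        } catch (NullPointerException e) {
            System.out.println("NPE: " + e.getMessage());
        }

        // A null-tolerant guard would let the caller park or retry the message instead.
        Optional<Session> safe = Optional.ofNullable(chosen);
        System.out.println("session present: " + safe.isPresent());
    }
}
```

That would also explain why the warnings list stale clients (old pids/ports): the group still holds sessions from my earlier, dead connector runs.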
And this is the NPE I'm getting in the connector:
```
2023-12-05 11:17:41,183 INFO [main] Application(Application.java:99) -
connector slackSink started
2023-12-05 11:17:42,151 DEBUG [nioEventLoopGroup-3-1]
Codec(LogUtils.java:90) - Decode
headerJson={"cmd":"ASYNC_MESSAGE_TO_CLIENT","code":1,"desc":"[org.apache.eventmesh.runtime.core.protocol.tcp.client.session.push.SessionPusher.push(SessionPusher.java:105),
org.apache.eventmesh.runtime.core.protocol.tcp.client.session.Session.downstreamMsg(Session.java:168),
org.apache.eventmesh.runtime.core.protocol.tcp.client.group.ClientGroupWrapper.lambda$initClientGroupPersistentConsumer$0(ClientGroupWrapper.java:529),
org.apache.eventmesh.storage.standalone.broker.task.Subscribe.subscribe(Subscribe.java:90),
org.apache.eventmesh.storage.standalone.broker.task.SubscribeTask.run(SubscribeTask.java:38),
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149),
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624),
java.lang.Thread.run(Thread.java:750)]","seq":"10","properties":{},"command":"ASYNC_MESSAGE_TO_CLIENT"}
2023-12-05 11:17:42,157 INFO [nioEventLoopGroup-3-1]
AbstractEventMeshTCPSubHandler(AbstractEventMeshTCPSubHandler.java:48) -
|receive|type=ASYNC_MESSAGE_TO_CLIENT|msg=org.apache.eventmesh.common.protocol.tcp.Package@2c16930b
2023-12-05 11:17:42,251 INFO [nioEventLoopGroup-3-1]
TcpClient(LogUtils.java:130) - exceptionCaught, close connection.|remote
address=/127.0.0.1:10000
java.lang.NullPointerException: null
at
org.apache.eventmesh.client.tcp.impl.cloudevent.CloudEventTCPSubClient$CloudEventTCPSubHandler.getProtocolMessage(CloudEventTCPSubClient.java:158)
~[main/:?]
at
org.apache.eventmesh.client.tcp.impl.cloudevent.CloudEventTCPSubClient$CloudEventTCPSubHandler.getProtocolMessage(CloudEventTCPSubClient.java:1)
~[main/:?]
at
org.apache.eventmesh.client.tcp.impl.AbstractEventMeshTCPSubHandler.channelRead0(AbstractEventMeshTCPSubHandler.java:55)
~[main/:?]
at
org.apache.eventmesh.client.tcp.impl.AbstractEventMeshTCPSubHandler.channelRead0(AbstractEventMeshTCPSubHandler.java:1)
~[main/:?]
at
io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
~[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327)
[netty-codec-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299)
[netty-codec-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
[netty-transport-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
[netty-common-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
[netty-common-4.1.79.Final.jar:4.1.79.Final]
at
io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
[netty-common-4.1.79.Final.jar:4.1.79.Final]
at java.lang.Thread.run(Thread.java:750) [?:1.8.0_392]
```
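On the connector side, note that the decoded `ASYNC_MESSAGE_TO_CLIENT` header has `code:1` and the server's stack trace stuffed into `desc`, so the package presumably arrived without a message body. If `CloudEventTCPSubClient.getProtocolMessage` deserializes the body without checking it, a null body would produce exactly this NPE. A sketch of that shape and a possible guard, again with stand-in classes of my own (not EventMesh's real types):

```java
import java.nio.charset.StandardCharsets;

public class SubHandlerSketch {

    // Hypothetical stand-in for the TCP Package; not EventMesh's real class.
    static class Package {
        final String cmd;
        final byte[] body; // may be null on error responses (code != 0)
        Package(String cmd, byte[] body) { this.cmd = cmd; this.body = body; }
    }

    // NPE-prone shape suggested by the trace: use the body without checking it.
    static String getProtocolMessageUnsafe(Package pkg) {
        return new String(pkg.body, StandardCharsets.UTF_8); // NPE if body is null
    }

    // Guarded variant: treat a missing body as an error payload, not a CloudEvent.
    static String getProtocolMessageSafe(Package pkg) {
        if (pkg.body == null) {
            return null; // caller should surface the header's code/desc instead
        }
        return new String(pkg.body, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        Package errorPkg = new Package("ASYNC_MESSAGE_TO_CLIENT", null);
        try {
            getProtocolMessageUnsafe(errorPkg);
        } catch (NullPointerException e) {
            System.out.println("NPE on null body, as in the connector log");
        }
        System.out.println("safe result: " + getProtocolMessageSafe(errorPkg));
    }
}
```

So the client NPE looks like a symptom of the server-side failure rather than a separate bug: the server's error response carries no event body, and the client doesn't handle that case.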
I assume it's related to the message I'm posting, but I'm still trying to figure out exactly what's wrong.
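In case it helps anyone comparing notes: the required context attributes (`specversion`, `id`, `source`, `type`) are fixed by the CloudEvents 1.0 spec, so this is roughly the smallest structured-mode JSON body I'd expect to be valid. Whether the EventMesh HTTP endpoint accepts exactly this shape is an assumption on my side; the subject/type/data values are just placeholders for my POC:

```java
public class MinimalCloudEvent {

    // Builds a minimal structured-mode CloudEvent JSON per the CloudEvents 1.0 spec.
    // The subject, type, and data values below are made up for this POC.
    static String minimalEvent(String id) {
        return "{"
            + "\"specversion\":\"1.0\","                 // required
            + "\"id\":\"" + id + "\","                    // required
            + "\"source\":\"/slack-poc\","               // required
            + "\"type\":\"com.example.slack.message\","  // required
            + "\"subject\":\"TEST-TOPIC-SLACK\","
            + "\"datacontenttype\":\"application/json\","
            + "\"data\":{\"text\":\"hello from the POC\"}"
            + "}";
    }

    public static void main(String[] args) {
        System.out.println(minimalEvent("event-1"));
    }
}
```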
Bram
PS: Sorry guys, I know I'm asking/trying a lot, but I'm just trying to set up a simple POC :)