Re: Re: Re: sink jdbc timeout issue (no heartbeat packets)

2021-12-19 Thread Michael Ran
There is no heartbeat; the only related option is the retry parameter sink.max-retries.

On 2021-12-20 12:14:37, "Jeff" wrote:
> "It will"? What do you mean? I'm on 1.13.2 and don't see it, and there is no related configuration either.
>
> On 2021-12-20 10:43:05, "Michael Ran" wrote:
>> It will check whether the connection is still valid, and retry the operation.
>> On 2021-12-20 11:39:23, "Jeff" wrote:
>>> After the JDBC sink has had no data written for a long time, the next write fails with a connection timeout. Does the flink jdbc
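For reference, the retry parameter mentioned above is set in the sink table's DDL. A minimal sketch, assuming the Flink 1.13 JDBC SQL connector against MySQL; the table and column names are illustrative:

```sql
CREATE TABLE sink_table (
  id BIGINT,
  name STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'jdbc',
  'url' = 'jdbc:mysql://localhost:3306/mydb',
  'table-name' = 'sink_table',
  -- retry a failed batch write up to 5 times (the default is 3)
  'sink.max-retries' = '5',
  -- maximum time to wait when (re)validating or establishing a connection
  'connection.max-retry-timeout' = '60s'
);
```

These retries re-establish the connection on a failed write, which is why there is no separate keep-alive heartbeat in the connector.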

Re: Re: sink jdbc timeout issue (no heartbeat packets)

2021-12-19 Thread Jeff
"It will"? What do you mean? I'm on 1.13.2 and don't see it, and there is no related configuration either.

On 2021-12-20 10:43:05, "Michael Ran" wrote:
> It will check whether the connection is still valid, and retry the operation.
> On 2021-12-20 11:39:23, "Jeff" wrote:
>> After the JDBC sink has had no data written for a long time, the next write fails with a connection timeout. Does the flink jdbc connector have a heartbeat feature like a connection pool has?

Re: sink jdbc timeout issue (no heartbeat packets)

2021-12-19 Thread Michael Ran
It will check whether the connection is still valid, and retry the operation.

On 2021-12-20 11:39:23, "Jeff" wrote:
> After the JDBC sink has had no data written for a long time, the next write fails with a connection timeout. Does the flink jdbc connector have a heartbeat feature like a connection pool has?

sink jdbc timeout issue (no heartbeat packets)

2021-12-19 Thread Jeff
After the JDBC sink has had no data written for a long time, the next write fails with a connection timeout. Does the flink jdbc connector have a heartbeat feature like a connection pool has?

Re: Low CPU utilization in flink on native k8s mode

2021-12-19 Thread Jeff
Upgrading the version won't help; I hit the same problem on flink 1.13.2. The cause is that the pod's CPU request and limit are set to the same value, so I ended up changing the source code, which you can use as a reference: https://github.com/jeff-zou/flink.git. I mainly modified the KubernetesUtils.java class, using an external resource to pass in a parameter that replaces the request.

On 2021-12-18 09:15:06, "casel.chen" wrote:
> The flink version in use is 1.12.5. When deploying the job to native k8s, no matter what is set
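To illustrate the constraint being described: in stock Flink, a single config value drives both the CPU request and the CPU limit of the generated pod spec, which is what the patch above works around. A sketch of the relevant flink-conf.yaml keys (values illustrative; check your version's documentation, as later releases reportedly added limit-factor options that make patching KubernetesUtils.java unnecessary):

```yaml
# Stock behavior: this one value is used for BOTH the CPU request and the
# CPU limit of the TaskManager pod.
kubernetes.taskmanager.cpu: 4.0

# Newer Flink releases expose limit factors, so that
# limit = cpu * factor while the request stays at the configured cpu value.
kubernetes.taskmanager.cpu.limit-factor: 4.0
kubernetes.jobmanager.cpu.limit-factor: 2.0
```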

Re: Problem running test cases in the kafka source code

2021-12-19 Thread Hang Ruan
You should run mvn install first to publish it to the local repository; only then can this dependency be found.

Yuepeng Pan wrote on Fri, Dec 17, 2021 at 20:28:
> Hi, Chen.
> If you are running it in IDEA, you can try checking the scope of the dependencies in the pom.
>
> Best,
> Roc.
>
> On 2021-12-17 17:41:32, "陈卓宇" <2572805...@qq.com.INVALID> wrote:
>> Hello community:
>>
>> I am running the test cases for the Kafka connector part of the flink source code.
>>
>> Error log:
>>
>> [ERROR]
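The suggested install step can be done for just the connector and its upstream modules rather than the whole build. A sketch, assuming a recent flink source layout (the module path below is an assumption and differs between Flink versions):

```shell
# From the flink source root: build the Kafka connector module and every
# module it depends on (-am), and install them into the local ~/.m2
# repository so test runs can resolve them. Tests are skipped for speed.
mvn clean install -DskipTests -pl flink-connectors/flink-connector-kafka -am
```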

Re: Kryo EOFException: No more bytes left

2021-12-19 Thread Dan Hill
I'll retry the job to see if it's reproducible. The serialized state is bad, so that run keeps failing.

On Sun, Dec 19, 2021 at 4:28 PM Zhipeng Zhang wrote:
> Hi Dan,
>
> Could you provide the code snippet such that we can reproduce the bug here?
>
> Dan Hill wrote on Mon, Dec 20, 2021 at 07:18:
>> Hi.
>>

Re: Kryo EOFException: No more bytes left

2021-12-19 Thread Zhipeng Zhang
Hi Dan,

Could you provide the code snippet such that we can reproduce the bug here?

Dan Hill wrote on Mon, Dec 20, 2021 at 07:18:
> Hi.
>
> I was curious if anyone else has hit this exception. I'm using the IntervalJoinOperator to join two streams of protos. I registered the protos with a kryo serializer.

Kryo EOFException: No more bytes left

2021-12-19 Thread Dan Hill
Hi.

I was curious if anyone else has hit this exception. I'm using the IntervalJoinOperator to join two streams of protos. I registered the protos with a kryo serializer. I started hitting this issue, which looks like the operator is trying to deserialize a bad set of bytes that it had serialized. I'm
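For context, the usual way to route protobuf-generated classes through a dedicated Kryo serializer (rather than Kryo's default field serializer, which does not handle protobuf messages well) is chill-protobuf, as described in Flink's third-party serializer docs. A sketch, not a fix for the EOFException; `MyProto` is a placeholder for a protoc-generated message class:

```java
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import com.twitter.chill.protobuf.ProtobufSerializer;

public class RegisterProtos {
    public static void main(String[] args) {
        StreamExecutionEnvironment env =
            StreamExecutionEnvironment.getExecutionEnvironment();
        // Serialize all instances of the generated message class with
        // chill-protobuf's ProtobufSerializer instead of Kryo's default.
        env.getConfig().registerTypeWithKryoSerializer(
            MyProto.class, ProtobufSerializer.class);
    }
}
```

This requires the com.twitter:chill-protobuf dependency. Note that changing serializer registration generally makes previously written state unreadable, so it needs to be in place before the state in question is written.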

Re: question on jar compatibility - log4j related

2021-12-19 Thread David Morávek
Hi Eddie,

the APIs should be binary compatible across patch releases, so there is no need to re-compile your artifacts.

Best,
D.

On Sun 19. 12. 2021 at 16:42, Colletta, Edward wrote:
> If I have jar files built using flink version 11.2 in dependencies, and I upgrade my cluster to 11.6, is it

question on jar compatibility - log4j related

2021-12-19 Thread Colletta, Edward
If I have jar files built using flink version 11.2 in dependencies, and I upgrade my cluster to 11.6, is it safe to run the existing jars on the upgraded cluster, or should I rebuild all jobs against 11.6?

Thanks,
Eddie Colletta

Re: How do I determine which hardware device and software has log4j zero-day security vulnerability?

2021-12-19 Thread Turritopsis Dohrnii Teo En Ming
I realised there is an Apache Log4j mailing list.

Regards,

Mr. Turritopsis Dohrnii Teo En Ming
Targeted Individual in Singapore
19 Dec 2021 Sunday

On Fri, 17 Dec 2021 at 00:29, Arvid Heise wrote:
>
> I think this is meant for the Apache log4j mailing list [1].
>
> [1]

Re: How do I determine which hardware device and software has log4j zero-day security vulnerability?

2021-12-19 Thread Turritopsis Dohrnii Teo En Ming
Hi,

Please refer to this link.

Article: Log4j zero-day flaw: What you need to know and how to protect yourself
Link: https://www.zdnet.com/article/log4j-zero-day-flaw-what-you-need-to-know-and-how-to-protect-yourself/

The article says:

[QUOTE] WHAT DEVICES AND APPLICATIONS ARE AT RISK?

Re: Alternatives of KafkaDeserializationSchema.isEndOfStream()

2021-12-19 Thread Arvid Heise
Hi Dong,

I see your point. The main issue with dynamic EOF is that we can't run in batch mode. That may be desired in the case of Ayush, but there may be other use cases where it's not. Additionally, it's quite a bit of code if you'd implement a KafkaRecordDeserializationSchema from scratch. There
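For context on the amount of code involved: in the new Kafka source, KafkaRecordDeserializationSchema replaces the legacy KafkaDeserializationSchema (and its isEndOfStream()). A minimal value-only implementation looks roughly like this, sketched against the Flink 1.14-era API:

```java
import java.nio.charset.StandardCharsets;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.connector.kafka.source.reader.deserializer.KafkaRecordDeserializationSchema;
import org.apache.flink.util.Collector;

// Deserializes only the record value as a UTF-8 string. There is no
// isEndOfStream() hook here; stream termination is instead configured
// on the source itself (e.g. via bounded offsets).
public class StringRecordDeserializer
        implements KafkaRecordDeserializationSchema<String> {

    @Override
    public void deserialize(ConsumerRecord<byte[], byte[]> record,
                            Collector<String> out) {
        out.collect(new String(record.value(), StandardCharsets.UTF_8));
    }

    @Override
    public TypeInformation<String> getProducedType() {
        return TypeInformation.of(String.class);
    }
}
```

The closest built-in alternative to a dynamic EOF is a bounded stop condition such as KafkaSource's setBounded(...) with an OffsetsInitializer, but that implies batch-style termination, which is exactly the tension described above.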