Re: use flink 1.19 JDBC Driver can't find jdbc connector

2024-05-13 post by kellygeorg...@163.com
Unsubscribe

Replied message | From: abc15...@163.com | Date: 05/10/2024 12:26 | To: user-zh@flink.apache.org | Subject: Re: use flink 1.19 JDBC Driver can't find jdbc connector
I've solved it. You need to register the connector jar with the gateway. But this is
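The fix described above (making the connector jar visible to the gateway) can be sketched as follows. This is a hedged example, not taken from the thread: the jar names, versions, and the assumption that the SQL Gateway loads connectors from $FLINK_HOME/lib are all illustrative.

```shell
# Hypothetical example: put the JDBC connector and a matching JDBC driver
# on the SQL Gateway's classpath. Paths and versions are assumptions;
# adjust them to your installation.
cp flink-connector-jdbc-3.2.0-1.19.jar "$FLINK_HOME/lib/"
cp mysql-connector-j-8.4.0.jar "$FLINK_HOME/lib/"

# Restart the gateway so the newly added jars are picked up.
"$FLINK_HOME/bin/sql-gateway.sh" stop
"$FLINK_HOME/bin/sql-gateway.sh" start
```

After the restart, a `CREATE TABLE ... WITH ('connector' = 'jdbc', ...)` statement submitted through the gateway should be able to resolve the connector factory.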

How can a Flink cluster write its logs directly into Elasticsearch?

2024-03-13 post by kellygeorg...@163.com
Is there a convenient, quick solution?
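The thread contains no answer. A common pattern (an assumption here, not something confirmed on this list) is to keep Flink's normal log4j2 file logging and ship the log files into Elasticsearch with Filebeat. A minimal hypothetical filebeat.yml sketch; paths, hosts, and index names are all assumptions:

```yaml
# Hypothetical Filebeat config: tail Flink's log files and index them
# into Elasticsearch. Adjust paths and hosts to your deployment.
filebeat.inputs:
  - type: filestream
    id: flink-logs
    paths:
      - /opt/flink/log/*.log
    parsers:
      # Join multi-line Java stack traces into a single event:
      # lines not starting with a date are appended to the previous line.
      - multiline:
          type: pattern
          pattern: '^\d{4}-\d{2}-\d{2}'
          negate: true
          match: after

output.elasticsearch:
  hosts: ["http://elasticsearch:9200"]
  index: "flink-logs-%{+yyyy.MM.dd}"

setup.template.name: "flink-logs"
setup.template.pattern: "flink-logs-*"
```

The advantage of shipping files rather than adding a custom appender is that the JobManager and TaskManager processes need no extra jars on their classpath, and log delivery failures cannot affect the job itself.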

Re: flink operator HA job intermittently errors with unable to update ConfigMapLock

2024-03-11 post by kellygeorg...@163.com
Can any expert offer a pointer or two??? Waiting online.

Replied message | From: kellygeorg...@163.com | Date: 2024-03-11 20:29 | To: user-zh | Subject: flink operator HA job intermittently errors with unable to update ConfigMapLock
The jobmanager error is shown below; what could be the cause? Exception occurred while renewing lock: Unable to update ConfigMapLock Caused

flink operator HA job intermittently errors with unable to update ConfigMapLock

2024-03-11 post by kellygeorg...@163.com
The jobmanager error is shown below; what could be the cause?

Exception occurred while renewing lock: Unable to update ConfigMapLock
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: Operation: [replace] for kind: [ConfigMap] with name: [flink-task-xx-configmap] in namespace: [default]
Caused by:
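The thread ends without a resolution. One cause consistent with this stack trace (an assumption, not confirmed in the thread) is that the JobManager's service account lacks permission to update ConfigMaps, which Flink's Kubernetes HA uses for leader-election locks. A hypothetical RBAC sketch granting the needed verbs; all names and the namespace are illustrative:

```yaml
# Hypothetical RBAC for Flink Kubernetes HA: the JobManager must be able
# to read and update the leader-election ConfigMaps in its namespace.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: flink-ha-configmaps
  namespace: default
rules:
  - apiGroups: [""]
    resources: ["configmaps"]
    verbs: ["get", "list", "watch", "create", "update", "patch", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: flink-ha-configmaps
  namespace: default
subjects:
  - kind: ServiceAccount
    name: flink        # assumed service account used by the JobManager pod
    namespace: default
roleRef:
  kind: Role
  name: flink-ha-configmaps
  apiGroup: rbac.authorization.k8s.io
```

If RBAC is already correct, occasional renew failures can also come from transient API-server conflicts (HTTP 409) under load; those are normally retried by the leader elector and are only a problem if they persist past the lease duration.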