Hi Team,
Is there any approach to get the Flink SQL runtime via an API?
Any help would be appreciated.
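If "runtime" here means running Flink SQL programmatically, one common option is the Table API's `TableEnvironment`, which can execute SQL statements directly from a Java program. A minimal sketch, with the table name and the `datagen` connector chosen by me purely for illustration:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.TableResult;

public class SqlViaApi {
    public static void main(String[] args) {
        // Create a streaming TableEnvironment (no DataStream program needed).
        TableEnvironment tEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().inStreamingMode().build());

        // Register a source table via DDL; 'datagen' produces random rows.
        tEnv.executeSql(
                "CREATE TABLE src (x DOUBLE) WITH ('connector' = 'datagen')");

        // Run a query and consume its result from the same process.
        TableResult result = tEnv.executeSql("SELECT x FROM src");
        result.print();
    }
}
```

The same `executeSql` call also accepts `INSERT INTO` statements, so a whole SQL pipeline can be submitted from Java this way.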
Hi Jose,
Sorry, my previous response may have been misleading.
I have confirmed that Flink 1.15 only supports Hadoop 2.8.5 and above,
so you should use Hadoop 2.8.5 or higher.
https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15/#upgrade-the-minima
Hi,
I have stumbled upon the next Flink SQL problem - but I am sure you can
help me :)
I have an extremely simple table called "bla" which just has one column of
type double. Now I want to sink that table into a Kafka topic. This is how
I do it:
CREATE TABLE bla_sink (
total DOUBLE,
PRIMARY KEY (t
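For reference, a complete version of such a sink might look like the sketch below. The topic name, broker address, key column, and formats are assumptions on my part, not taken from the thread. Note that a Kafka sink table with a PRIMARY KEY needs the `upsert-kafka` connector (with the key declared NOT ENFORCED); the plain `kafka` connector does not accept primary keys.

```sql
-- Sketch only: topic, bootstrap servers, key column, and formats are assumed.
CREATE TABLE bla_sink (
  total DOUBLE,
  -- Flink requires primary keys on tables to be declared NOT ENFORCED.
  PRIMARY KEY (total) NOT ENFORCED
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'bla_topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'json',
  'value.format' = 'json'
);

-- Write the source table into the sink.
INSERT INTO bla_sink SELECT total FROM bla;
```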
Thanks for the above reply.
In that respect, I note the following: in tests performed with
flink-shaded-hadoop-2-uber-2.8.3-10.0.jar, Hadoop 2.8.3, and Flink 1.10.0,
everything works fine. But when I upgrade Flink to 1.15.0, I encounter
errors. According to the documentation, Flink 1.15.0 only wor
Hi Ralf,
Have you tried _only_ adding flink-sql-connector-kafka-1.17.1.jar to the
lib directory of your Flink deployment? It is a fat JAR, with all the
dependencies shaded into it, so when you add them separately again, they
will clash.
Best
On Wed, Oct 11, 2023 at 4:26 AM Ralph Matth
Hi Flink users and developers,
Currently, Flink does not generate documentation for deprecated options. This
might confuse users when upgrading from an older version of Flink: they have
to either carefully read the release notes or check the source code for
upgrade guidance on deprecated options.
I pro
Thanks a lot for your response Mason.
Is there any FLIP planned to expose the context in the Reader in the future?
Regards,
Prateek Kohli
On Wed, Oct 11, 2023 at 6:03 AM Mason Chen wrote:
> Hi Prateek,
>
> I agree, the reader should ideally expose the context to record metrics
> about deserialization. One
Hi Team,
Do we have any plans to update Flink to curl 8.4.0, given that earlier
versions have severe vulnerabilities?
Thanks & Regards,
Ankur Singhal