Difference between Session Mode and Session Job (Flink Operator)

2022-07-06 Thread bat man
Hi, I want to understand the difference between session mode and the new deployment mode - Flink Session Job - which I believe was newly introduced as part of the Flink Operator (1.15) release. What's the benefit of using this mode as opposed to session mode, as both run sessions to which flink jobs

Re: Flink interval join watermark question

2022-07-06 Thread yidan zhao
A timestamp is a value assigned to each element (input record). The watermark is generated at the source and then propagated downstream. Taking windowing as an example: a record's timestamp decides which window the record is assigned to, while the watermark decides when a window closes. For instance, for a 0-5s window, the window closes once the watermark reaches 5s. Since data may not arrive on time, you can set watermark = max(timestamp) - 30s; the 30s is the degree of out-of-orderness you are willing to tolerate. lxk wrote on Wed, Jul 6, 2022 at 13:36: > > When using interval
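The bounded-out-of-orderness rule described above (watermark = max(timestamp) - 30s) can be sketched in plain Java. This is only an illustration of the arithmetic, not Flink's actual WatermarkStrategy API; the class and method names are made up:

```java
import java.util.List;

/** Illustrative sketch: a bounded-out-of-orderness watermark. */
public class WatermarkSketch {
    // 30s of tolerated disorder, as in the reply above
    static final long MAX_OUT_OF_ORDERNESS_MS = 30_000;

    /** watermark = max(timestamp seen so far) - allowed out-of-orderness */
    static long watermarkFor(List<Long> timestampsMs) {
        long max = Long.MIN_VALUE;
        for (long t : timestampsMs) {
            max = Math.max(max, t);
        }
        return max - MAX_OUT_OF_ORDERNESS_MS;
    }

    public static void main(String[] args) {
        // A window covering [0s, 5s) closes once the watermark reaches 5s,
        // i.e. only after an event with timestamp >= 35s has been seen.
        System.out.println(watermarkFor(List.of(10_000L, 36_000L, 12_000L)));
    }
}
```

In real Flink code the equivalent behavior comes from a watermark strategy with bounded out-of-orderness; the sketch only shows why late events up to 30s behind the maximum seen timestamp are still assigned to their window before it closes.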

Re: Is Flink able to read a CSV file or just like in Blink this function does not work?

2022-07-06 Thread Alexander Fedulov
Hi Mike, I do not see any issues with your code. With a sample csv file like this a,1.0 b,2.0 c,3.0 it produces the expected result +--+ | some_value | +--+ |3 | +--+ 1 row in set Process finished with

Is Flink able to read a CSV file or just like in Blink this function does not work?

2022-07-06 Thread podunk
If I'm reading the Flink manual correctly (and this is not simple - no examples), this code should read a CSV file:     package flinkTest2; import org.apache.flink.table.api.EnvironmentSettings; import org.apache.flink.table.api.TableEnvironment;   public class flinkTest2 {     public static
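For reference, a minimal Flink SQL sketch of reading a CSV file via the filesystem connector, consistent with the sample data in the reply above. The path, table name, and column names are assumptions; the connector options (`connector`, `path`, `format`) are the standard filesystem-connector options:

```sql
-- Hypothetical table over a local CSV file with rows like: a,1.0
CREATE TABLE csv_source (
  name STRING,
  val  DOUBLE
) WITH (
  'connector' = 'filesystem',
  'path'      = 'file:///tmp/input.csv',
  'format'    = 'csv'
);

-- e.g. count the rows
SELECT COUNT(*) AS some_value FROM csv_source;
```

The same table can also be registered programmatically through `TableEnvironment.executeSql(...)` from the Java Table API shown in the truncated snippet.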

Re: Restoring a job from a savepoint

2022-07-06 Thread Alexander Fedulov
Hi John, use $ bin/flink run -s s3://my_bucket/path/to/savepoints/ (no trailing slash, including schema), where the savepoint path should contain a valid _metadata file. You should see logs like this: INFO o.a.f.r.c.CheckpointCoordinator [] - Starting job foobar from savepoint

Re: Restoring a job from a savepoint

2022-07-06 Thread Yaroslav Tkachenko
Hi John, I've been using a path like this: s3:savepoint- (no trailing slash). I'm pretty sure you need to specify the full path. Yes, you can see savepoint restore in logs. It's also fairly easy to see it in the Flink UI, under the Checkpoints section (it shows the information about the

Restoring a job from a savepoint

2022-07-06 Thread John Tipper
Hi all, The docs on restoring a job from a savepoint (https://nightlies.apache.org/flink/flink-docs-master/docs/ops/state/savepoints/#resuming-from-savepoints) state that the syntax is: $ bin/flink run -s :savepointPath [:runArgs] and where "you may give a path to either the savepoint’s
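Pulling the replies above together, a concrete invocation might look like the following. The bucket name, savepoint directory name, and jar name are all made-up placeholders; the key point from the replies is that `-s` must point at the concrete savepoint directory (the one containing `_metadata`), with the scheme included and no trailing slash:

```shell
# Hypothetical paths - substitute your own bucket, savepoint dir, and jar
bin/flink run -s s3://my-bucket/savepoints/savepoint-abc123-0123456789ab my-job.jar

# On a successful restore the JobManager logs a line like:
#   INFO o.a.f.r.c.CheckpointCoordinator [] - Starting job ... from savepoint ...
# and the Flink UI shows the restore under the job's Checkpoints section.
```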

Lookup support in the Elasticsearch connector

2022-07-06 Thread 李宇彬
Hi all, I see that the existing Elasticsearch connector only supports sink. Are there plans to support lookup, or is there already a JIRA for it?

Re: Parsing Kafka data with Flink SQL

2022-07-06 Thread RS
Hi, when defining the fields of table ccc_test_20220630_2: if the structure is fixed, you can declare the field type as ARRAY of ROW, e.g. abc ARRAY<ROW<...>>; if the inner structure is dynamic, you can declare it as MAP. If the structure is complex or the fields are not well defined, write a custom UDF instead. Thanks. On 2022-06-30 15:02:55, "小昌同学" wrote: Hi all, a question: my Kafka data is in a nested format, with ROW types nested inside an ARRAY. I'd like to get the innermost field values directly in a Flink SQL CREATE TABLE statement. I searched and found
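A sketch of the ARRAY-of-ROW approach suggested above. The table name, column names, and Kafka options are assumptions; the flattening uses Flink SQL's `CROSS JOIN UNNEST` to reach the innermost fields:

```sql
-- Assumed schema: a Kafka payload containing an array of rows
CREATE TABLE ccc_test (
  abc ARRAY<ROW<id STRING, amount DOUBLE>>
) WITH (
  'connector' = 'kafka',
  -- topic, bootstrap servers, etc. omitted; fill in for your setup
  'format'    = 'json'
);

-- Flatten the array so each inner ROW becomes its own result row
SELECT t.id, t.amount
FROM ccc_test
CROSS JOIN UNNEST(abc) AS t (id, amount);
```

If the nesting is fixed, individual elements can also be addressed directly (e.g. `abc[1].id`, noting that Flink SQL array indexing is 1-based).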

Re: How to mock new DataSource/Sink

2022-07-06 Thread David Jost
> On 5. Jul 2022, at 01:48, Alexander Fedulov wrote: > > Hi David, > > I started working on FLIP-238 exactly with the concerns you've mentioned in > mind. It is currently in development, feel free to join the discussion [1]. > If you need something ASAP and are not interested in rate-limiting

Re: Re: Re: When will Java 17 be supported?

2022-07-06 Thread jiangjiguang719
Hi Xuyang, thanks, but FLINK-15736 has made no progress. Good news: after two days of hard work, I have upgraded to Java 17, and now I am happy to code. On 2022-07-05 10:01:05, "Xuyang" wrote: >Hi, the community already has an issue [1] working toward Java 17 support; you can follow it. [1] >https://issues.apache.org/jira/browse/FLINK-15736

RE: how to connect to the flink-state store and use it as cache to serve APIs.

2022-07-06 Thread Schwalbe Matthias
Hi Laxmi, Did you consider Apache Flink Table Store [1], which was introduced a short time ago? Yours sounds like a case for early integration … Sincere greetings Thias [1] https://nightlies.apache.org/flink/flink-table-store-docs-release-0.1/ From: laxmi narayan Sent: Wednesday, July 6, 2022

Re: Question about Flink job submission methods

2022-07-06 Thread RS
Hi, when submitting via the command line, you can capture the stdout of flink run, which contains the job id, and then extract it with a regular expression or by string slicing. Thanks. On 2022-07-04 17:50:07, "sherlock zw" wrote: >I currently need to monitor Flink jobs that have already been submitted, but when submitting via the command line I can't get the job id; it can only be filtered out of INFO-level logs, and the log level in our environment is WARN, so the job id never shows up in the log output. Is there any way to submit jobs other than the command line, for example a jar-based submission similar to Spark's SparkLaunch? Thanks for any pointers.
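The regex extraction suggested above can be sketched as follows. The sample log line ("Job has been submitted with JobID ...") is what recent Flink CLI versions print on submission, but verify it against the output of your Flink version; the class name and sample id are made up:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/** Sketch: pull the job id out of captured `flink run` stdout. */
public class JobIdParser {
    // A Flink JobID is rendered as 32 lowercase hex characters
    private static final Pattern JOB_ID = Pattern.compile("JobID ([0-9a-f]{32})");

    /** Returns the first job id found in the output, or null if none. */
    static String extractJobId(String flinkRunOutput) {
        Matcher m = JOB_ID.matcher(flinkRunOutput);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String out = "Job has been submitted with JobID a1b2c3d4e5f607182930415263748596";
        System.out.println(extractJobId(out));
    }
}
```

For programmatic submission without parsing stdout, the REST API (`POST /jars/:jarid/run`) returns the job id directly in its JSON response, which may be closer to what the SparkLaunch-style workflow in the question is after.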