Re: Flink Elastic Sink

2020-05-30 Thread Leonard Xu
Hi, aj > I was confused before as I was thinking the sink builder is called only once > but it gets called for every batch request, correct me if my understanding is > wrong. You're right: the sink builder should be called only once, not for every batch request. Could you post some code

Flink s3 streaming performance

2020-05-30 Thread venkata sateesh` kolluru
Hello, I have posted the same on Stack Overflow but didn't get any response, so I am posting it here for help. https://stackoverflow.com/questions/62068787/flink-s3-write-performance-optimization?noredirect=1#comment109814428_62068787 Details: I am working on a Flink application on Kubernetes (EKS)

Re: Auto adjusting watermarks?

2020-05-30 Thread Theo Diefenthal
Hi Congxian, Thanks for your feedback. You raised a point I had also already thought about. As "assignTimestampsAndWatermarks" creates an operator extending the standard AbstractUdfStreamOperator, I can also implement a RichFunction watermark assigner with full state access. In my case, I was
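For context, a plain periodic watermark assigner in the Flink 1.10 API looks like the sketch below. Note that it keeps its out-of-orderness bound in an ordinary field rather than in checkpointed state, which is exactly why the thread discusses a RichFunction variant with state access; the `Event` type and the 5-second bound are illustrative assumptions.

```java
// Sketch of a periodic watermark assigner (Flink 1.10 API). The bound is a
// plain field here; an auto-adjusting version would derive it from observed
// timestamp skew and keep it in operator state.
public class AdjustingAssigner implements AssignerWithPeriodicWatermarks<Event> {

    private long maxTimestamp = Long.MIN_VALUE;
    private long boundMillis = 5_000;   // hypothetical out-of-orderness bound

    @Override
    public long extractTimestamp(Event element, long previousTimestamp) {
        maxTimestamp = Math.max(maxTimestamp, element.timestamp);
        return element.timestamp;
    }

    @Override
    public Watermark getCurrentWatermark() {
        // watermark trails the highest timestamp seen by the current bound
        return new Watermark(maxTimestamp - boundMillis);
    }
}
```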

Re: Data Stream Enrichement

2020-05-30 Thread Lasse Nedergaard
Hi. If you can cache the data in state, that's the preferred way: read all your values from a store, do a keyBy, and store them in state in a CoProcessFunction. If you need to do a lookup for every row, you have to use the AsyncIO function. Med venlig hilsen / Best regards Lasse Nedergaard
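The state-based approach described above can be sketched as follows; `Event`, `Reference`, and `EnrichedEvent` are hypothetical types standing in for the main stream, the reference data, and the joined output.

```java
// Sketch: enrichment via keyed state in a CoProcessFunction. The reference
// stream populates state; the main stream reads it on every element.
public class EnrichFunction
        extends CoProcessFunction<Event, Reference, EnrichedEvent> {

    private transient ValueState<Reference> refState;

    @Override
    public void open(Configuration parameters) {
        refState = getRuntimeContext().getState(
            new ValueStateDescriptor<>("reference", Reference.class));
    }

    @Override
    public void processElement1(Event event, Context ctx,
                                Collector<EnrichedEvent> out) throws Exception {
        Reference ref = refState.value();   // may be null if not yet received
        out.collect(new EnrichedEvent(event, ref));
    }

    @Override
    public void processElement2(Reference ref, Context ctx,
                                Collector<EnrichedEvent> out) {
        refState.update(ref);               // cache the latest reference record
    }
}
```

Both streams would be keyed on the same enrichment key before `connect(...).process(new EnrichFunction())`.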

Re: Data Stream Enrichement

2020-05-30 Thread Taher Koitawala
The open() method would be a great fit! And the close() method could close the connection when the operator closes! Also, for external calls, AsyncIO is a great operator. Give that a look. Regards, Taher Koitawala On Sat, May 30, 2020, 10:17 PM Aissa Elaffani wrote: > Hello Guys, > I want to enrich a data stream with
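The open()/close() pattern suggested above can be sketched like this, assuming the MongoDB Java driver; the connection string, database, and collection names are placeholders.

```java
// Sketch: one MongoDB client per parallel subtask, created in open() and
// released in close(). MongoClients/Filters come from the MongoDB Java driver.
public class MongoEnrichFlatMap extends RichFlatMapFunction<String, String> {

    private transient MongoClient client;
    private transient MongoCollection<Document> collection;

    @Override
    public void open(Configuration parameters) {
        client = MongoClients.create("mongodb://localhost:27017");
        collection = client.getDatabase("mydb").getCollection("lookup");
    }

    @Override
    public void flatMap(String key, Collector<String> out) {
        Document doc = collection.find(Filters.eq("_id", key)).first();
        if (doc != null) {
            out.collect(key + "," + doc.toJson());
        }
    }

    @Override
    public void close() {
        if (client != null) {
            client.close();   // release the connection when the operator shuts down
        }
    }
}
```

A synchronous per-record lookup like this blocks the task thread, which is why the thread also points to AsyncIO for external calls.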

Data Stream Enrichement

2020-05-30 Thread Aissa Elaffani
Hello Guys, I want to enrich a data stream with some MongoDB data, and I am willing to use the RichFlatMapFunction, but I am lost: I don't know where to configure the connection to my MongoDB. Can anyone help me with this? Best, Aissa

State expiration in Flink

2020-05-30 Thread Vasily Melnik
Hi. I'm a bit confused by this point in the State TTL documentation: "By default, expired values are explicitly removed on read, such as ValueState#value, and periodically garbage collected in the background if supported by the configured state backend." Does it mean that if I have only one
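For reference, State TTL is configured per state descriptor; a minimal sketch with the Flink 1.10 API (the one-day TTL and state name are illustrative):

```java
// Sketch: TTL config attached to a state descriptor. By default expired
// values are removed on read; cleanupFullSnapshot() additionally drops
// expired entries when taking a full snapshot.
StateTtlConfig ttlConfig = StateTtlConfig
    .newBuilder(Time.days(1))
    .setUpdateType(StateTtlConfig.UpdateType.OnCreateAndWrite)
    .setStateVisibility(StateTtlConfig.StateVisibility.NeverReturnExpired)
    .cleanupFullSnapshot()
    .build();

ValueStateDescriptor<Long> descriptor =
    new ValueStateDescriptor<>("lastSeen", Long.class);
descriptor.enableTimeToLive(ttlConfig);
```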

Re: Re: Re: Flink Window with multiple trigger condition

2020-05-30 Thread aj
Thanks Yun. I have converted the code to use a KeyedProcessFunction rather than a flatMap, and using a registered timer it worked. On Fri, May 29, 2020 at 11:13 AM Yun Gao wrote: > Hi, > > I think you could use *timer* to achieve that. In *processFunction* > you could register a timer at
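The timer pattern described above can be sketched as follows; the `Event` type and the 10-minute timeout are illustrative, not taken from the thread.

```java
// Sketch: a KeyedProcessFunction that registers a processing-time timer per
// element; onTimer() fires when the timeout elapses, acting as the second
// trigger condition.
public class TimeoutFunction extends KeyedProcessFunction<String, Event, String> {

    @Override
    public void processElement(Event event, Context ctx, Collector<String> out)
            throws Exception {
        long fireAt = ctx.timerService().currentProcessingTime()
                + 10 * 60 * 1000;   // hypothetical 10-minute timeout
        ctx.timerService().registerProcessingTimeTimer(fireAt);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx, Collector<String> out) {
        // emit the timeout result for the key whose timer fired
        out.collect("timeout for key " + ctx.getCurrentKey());
    }
}
```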

Re: Flink Elastic Sink

2020-05-30 Thread aj
Thanks, it worked. I was confused before as I was thinking the sink builder is called only once, but it gets called for every batch request; correct me if my understanding is wrong. On Fri, May 29, 2020 at 9:08 AM Leonard Xu wrote: > Hi, aj > > In the implementation of ElasticsearchSink,
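To illustrate the distinction discussed in this thread: with the Flink Elasticsearch connector the builder runs once when the job graph is constructed, while the per-element sink function runs for every record and batching happens inside the sink. A minimal sketch (host, index, and flush settings are placeholders):

```java
// Sketch: ElasticsearchSink.Builder is configured once; the
// ElasticsearchSinkFunction lambda is invoked once per record, and
// bulk flushing (not the builder) controls batch requests.
List<HttpHost> hosts = new ArrayList<>();
hosts.add(new HttpHost("localhost", 9200, "http"));

ElasticsearchSink.Builder<String> builder = new ElasticsearchSink.Builder<>(
    hosts,
    (String element, RuntimeContext ctx, RequestIndexer indexer) -> {
        Map<String, Object> json = new HashMap<>();
        json.put("data", element);
        indexer.add(Requests.indexRequest().index("my-index").source(json));
    });
builder.setBulkFlushMaxActions(1000);   // batch size of bulk requests
stream.addSink(builder.build());
```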

Re: re: Submitted Flink job fails to create a TaskManager; the Flink web UI stays in the CREATED state; the error log is below

2020-05-30 Thread 程龙
The job was submitted via code; the JobManager loads fine, but when the TaskManager starts, that directory is never created. The UI is as described, and the error log is the one I pasted below. On 2020-05-30 19:16:57, "462329521" <462329...@qq.com> wrote: > What is your submit command? It looks like the configuration file cannot be loaded > > -- Original message -- > From: "程龙"<13162790...@163.com; > Sent: 2020-05-30 19:13 > To: "user-zh" Subject:

re: Submitted Flink job fails to create a TaskManager; the Flink web UI stays in the CREATED state; the error log is below

2020-05-30 Thread 462329521
What is your submit command? It looks like the configuration file cannot be loaded -- Original message -- From: "程龙"<13162790...@163.com; Sent: 2020-05-30 19:13 To: "user-zh"

Submitted Flink job fails to create a TaskManager; the Flink web UI stays in the CREATED state; the error log is below

2020-05-30 Thread 程龙
2020-05-30 19:07:31,418 INFO org.apache.flink.yarn.YarnTaskExecutorRunner - 2020-05-30 19:07:31,418 INFO org.apache.flink.yarn.YarnTaskExecutorRunner - Registered UNIX signal

Reply: Re: java.lang.AbstractMethodError when implementing KafkaSerializationSchema

2020-05-30 Thread wangl...@geekplus.com.cn
It was caused by a jar conflict, and I have fixed it. I had put flink-connector-kafka_2.11-1.10.0.jar in the Flink lib directory, while my project pom file also had the flink-connector-kafka dependency and was built as a fat jar. Thanks, Lei wangl...@geekplus.com.cn From: Leonard Xu Sent:
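One common way to resolve this kind of conflict is to keep the connector jar in Flink's lib directory and exclude it from the fat jar by marking the dependency as provided; a sketch of the pom fragment (versions match those mentioned above):

```xml
<!-- Sketch: with the connector jar already in flink/lib, mark the dependency
     as provided so it is compiled against but not bundled into the fat jar,
     avoiding duplicate classes at runtime. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.10.0</version>
    <scope>provided</scope>
</dependency>
```

The opposite fix (bundling the connector in the fat jar and removing it from lib) works equally well; the point is to have exactly one copy on the classpath.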

Re: Sorting Bounded Streams

2020-05-30 Thread Benchao Li
Hi Satyam, You are correct. The Blink planner is built on top of DataStream, both for batch and streaming. Hence you cannot transform a Table into a DataSet if you are using the Blink planner. AFAIK, the community is working on the unification of batch and streaming, and the unification will be Table
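Concretely, running the Blink planner in batch mode (Flink 1.10) means staying within the Table API and emitting results through table sinks rather than converting to DataSet; a sketch, with query and sink names as placeholders:

```java
// Sketch: Blink planner in batch mode. Bounded input is sorted/aggregated
// via SQL, and results go to a registered table sink instead of toDataSet().
EnvironmentSettings settings = EnvironmentSettings.newInstance()
    .useBlinkPlanner()
    .inBatchMode()
    .build();
TableEnvironment tEnv = TableEnvironment.create(settings);

// assumes a source table "src" and a sink table "sinkTable" are registered
Table result = tEnv.sqlQuery(
    "SELECT id, COUNT(*) AS cnt FROM src GROUP BY id ORDER BY id");
result.insertInto("sinkTable");
tEnv.execute("batch job");
```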

Re: Sorting Bounded Streams

2020-05-30 Thread Satyam Shekhar
Thanks for your reply, Benchao Li. While I can use the Blink planner in batch mode, I'd still have to work with DataSet. Based on my limited reading, it appears to me that DataStream is being extended to support both batch and streaming use cases with the `isBounded` method in the StreamTableSource