Re: create savepoint on bounded source in streaming mode

2022-01-26 Thread Shawn Du
DataStream API -- Sender:Guowei Ma Sent At:2022 Jan. 26 (Wed.) 21:51 Recipient:Shawn Du Cc:user Subject:Re: create savepoint on bounded source in streaming mode Hi, Shawn Thank you for your sharing. Unfortunately I do

Re: create savepoint on bounded source in streaming mode

2022-01-26 Thread Shawn Du
right! -- Sender:Guowei Ma Sent At:2022 Jan. 26 (Wed.) 19:50 Recipient:Shawn Du Cc:user Subject:Re: create savepoint on bounded source in streaming mode Hi, Shawn You want to use the correct state(n-1) for day n-1

Re: create savepoint on bounded source in streaming mode

2022-01-26 Thread Shawn Du
but with state. We want to use the same code for replaying, thus we need to persist the state for the next job. Any ideas? Thanks, Shawn -- Sender:Guowei Ma Sent At:2022 Jan. 26 (Wed.) 15:39 Recipient:Shawn Du Cc:user

Re: create savepoint on bounded source in streaming mode

2022-01-26 Thread Shawn Du
cs-release-1.14/docs/libs/state_processor_api/ Best, Guowei On Wed, Jan 26, 2022 at 2:17 PM Shawn Du wrote: Our application is stateful. Processing live events depends on the state, but for various reasons we need to rebuild the state, and it would be very costly to replay all the data.
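The State Processor API linked above is the usual route for this kind of bootstrap: replay the historical files in a batch job, turn them into keyed state, and write the result out as a savepoint that the streaming job can start from. Below is a minimal sketch against the Flink 1.14-era, DataSet-based API; the Event type, the "stateful-op" uid, and the ValueState descriptor are illustrative assumptions, not details from this thread.

```java
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.runtime.state.memory.MemoryStateBackend;
import org.apache.flink.state.api.BootstrapTransformation;
import org.apache.flink.state.api.OperatorTransformation;
import org.apache.flink.state.api.Savepoint;
import org.apache.flink.state.api.functions.KeyedStateBootstrapFunction;

public class BootstrapSavepointJob {

    // Illustrative event type; in practice this is whatever the replayed files contain.
    public static class Event {
        public String key;
        public long count;
        public Event() {}
        public Event(String key, long count) { this.key = key; this.count = count; }
    }

    // Writes one ValueState entry per key into the savepoint.
    public static class CountBootstrapper extends KeyedStateBootstrapFunction<String, Event> {
        private transient ValueState<Long> countState;

        @Override
        public void open(Configuration parameters) {
            countState = getRuntimeContext().getState(
                new ValueStateDescriptor<>("count", Long.class));
        }

        @Override
        public void processElement(Event value, Context ctx) throws Exception {
            countState.update(value.count);
        }
    }

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Stand-in for replaying and aggregating the historical files in a batch job.
        DataSet<Event> history = env.fromElements(new Event("a", 3L), new Event("b", 7L));

        BootstrapTransformation<Event> transformation = OperatorTransformation
            .bootstrapWith(history)
            .keyBy(e -> e.key)
            .transform(new CountBootstrapper());

        // "stateful-op" must match the uid() of the stateful operator in the streaming job.
        Savepoint.create(new MemoryStateBackend(), 128)
            .withOperator("stateful-op", transformation)
            .write("file:///tmp/bootstrap-savepoint");

        env.execute("bootstrap-savepoint");
    }
}
```

The streaming job would then presumably be started from that savepoint (e.g. flink run -s file:///tmp/bootstrap-savepoint ...), so that day n resumes from the state built for day n-1.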

Re: create savepoint on bounded source in streaming mode

2022-01-25 Thread Shawn Du
rebuild the state from a point. We call this a bootstrap process. Any ideas? Thanks. -- Sender:Guowei Ma Sent At:2022 Jan. 26 (Wed.) 14:04 Recipient:Shawn Du Cc:user Subject:Re: create savepoint on bounded source

create savepoint on bounded source in streaming mode

2022-01-25 Thread Shawn Du
Hi experts, assume I have several files and I want to replay these files in order in streaming mode and create a savepoint when the files are played to the end. Is it possible? I wrote a simple test app, and the job finishes when the source reaches the end, so I have no chance to create a savepoint. Please help.

Flink shutdown with exception when run in idea IDE

2022-01-24 Thread Shawn Du
Hi experts, I am new to Flink and just ran a simple job in the IDE, but there are many exceptions thrown when the job finishes (see below). The job source is bounded, reads from a local file, and runs in streaming mode. There is a custom sink as well that simply writes to a local file. It seems that each time I run, I got
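For reference, the setup described here might look roughly like the following sketch; the file path and the trivial inline sink are assumptions, and setRuntimeMode requires Flink 1.12 or later.

```java
import org.apache.flink.api.common.RuntimeExecutionMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.SinkFunction;

public class BoundedSourceStreamingJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Bounded input processed with streaming semantics; the job ends when the file is consumed.
        env.setRuntimeMode(RuntimeExecutionMode.STREAMING);

        env.readTextFile("/tmp/input.txt")
           .addSink(new SinkFunction<String>() {
               @Override
               public void invoke(String value, Context context) {
                   // Trivial stand-in for the custom sink that writes to a local file.
                   System.out.println(value);
               }
           });

        env.execute("bounded-source-in-streaming-mode");
    }
}
```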

Re: Flink Table API schema doesn't support nested object in ObjectArray

2019-08-07 Thread Jacky Du
Thanks Fabian, I created a Jira ticket with a code sample: https://issues.apache.org/jira/projects/FLINK/issues/FLINK-13603?filter=allopenissues I think if the root cause I found is correct, fixing this issue could be pretty simple. Thanks Jacky Du Fabian Hueske wrote on Fri, Aug 2, 2019 at 12:07 PM

Flink Table API schema doesn't support nested object in ObjectArray

2019-08-02 Thread Jacky Du
The one below is working: payload : Row(arraylist : ObjectArrayTypeInfo) This issue happens on 1.6.x, 1.7.x, and 1.8.x, but works on 1.5.x. Thanks Jacky Du

issue on Flink 1.6.2 for Table API when table schema contains nested ObjectArray

2019-07-30 Thread Jacky Du
Hi, I get a column-not-found exception when running a simple Flink query against the Flink Table API on Flink 1.6.2. Exception log: Caused by: org.apache.calcite.sql.validate.SqlValidatorException: Column 'data.interaction.action_type' not found in table 'mycable' When I changed the Flink version
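To make the shape of such a schema concrete, here is a small sketch of the nested type information involved, with field names following the 'data.interaction.action_type' path above (the actual schema from the thread is not shown):

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.ObjectArrayTypeInfo;
import org.apache.flink.types.Row;

public class NestedObjectArraySchema {
    public static void main(String[] args) {
        // Innermost row, holding e.g. action_type.
        TypeInformation<Row> interaction = Types.ROW_NAMED(
            new String[] {"action_type"},
            Types.STRING);

        // Object array whose elements are themselves rows -- the nesting reported as broken.
        TypeInformation<Row[]> interactions =
            ObjectArrayTypeInfo.getInfoFor(Row[].class, interaction);

        // Outer row, i.e. the 'data' column of the table.
        TypeInformation<Row> data = Types.ROW_NAMED(
            new String[] {"interaction"},
            interactions);

        System.out.println(data);
    }
}
```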

Re: Hive Integration

2018-08-13 Thread Will Du
This is a missing piece in Flink. Flink currently does not provide Spark SQL-like integration with Hive. However, you can write ORC/Avro files which can be read by Hive later. Thanks, Will > On Aug 13, 2018, at 7:34 PM, Renjie Liu wrote: > > Hi, yuvraj: > > Do you mean querying Hive with SQL? Or
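As a hedged sketch of the "write files that Hive can read later" idea, assuming a later Flink release that ships StreamingFileSink with flink-avro's bulk AvroWriters (the record class, path, and checkpoint interval are made up):

```java
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.avro.AvroWriters;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.filesystem.StreamingFileSink;

public class AvroFilesForHive {

    // Illustrative record; a generated Avro class would work here as well.
    public static class PageView {
        public String userId;
        public long ts;
        public PageView() {}
        public PageView(String userId, long ts) { this.userId = userId; this.ts = ts; }
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Bulk formats only roll and finalize files on checkpoints.
        env.enableCheckpointing(60_000);

        StreamingFileSink<PageView> sink = StreamingFileSink
            .forBulkFormat(new Path("file:///tmp/warehouse/pageviews"),
                           AvroWriters.forReflectRecord(PageView.class))
            .build();

        env.fromElements(new PageView("u1", 1L), new PageView("u2", 2L))
           .addSink(sink);

        env.execute("avro-files-for-hive");
    }
}
```

An external Hive table declared over that directory (e.g. STORED AS AVRO) should then be able to read the files.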

Re: flink 1.5 Rest API issues

2018-07-12 Thread Will Du
t; > Best, > Gary > > [1] https://developer.mozilla.org/en-US/docs/Tools/Network_Monitor > >> On Wed, Jul 11, 2018 at 5:39 PM, Will Du wrote: >> Hi folks, >> I have program working in FLink 1.3 and tested working in v1.5.0 command >> line too. However, it h

Re: flink JPS result changes

2018-07-11 Thread Will Du
page.action?pageId=65147077> > https://docs.google.com/document/d/1zwBP3LKnJ0LI1R4-9hLXBVjOXAkQS5jKrGYYoPz-xRk/edit#heading=h.giuwq6q8d23j > > > > On We

flink JPS result changes

2018-07-11 Thread Will Du
Hi folks, do we have any information about the process changes after v1.5.0? I used to see the JobManager and TaskManager processes once start-cluster.sh was called, but it shows the below in v1.5.0 once started. Everything works, but I have no idea where the JobManager is. $jps 2523 TaskManagerRunner

Kafka Avro Table Source

2018-07-02 Thread Will Du
Hi folks, I am working on mapping an Avro table source to a Kafka source. Looking at the example, I think the current Flink v1.5.0 connector is not flexible enough. I wonder if I have to specify the Avro record class to read from Kafka. For withSchema, the schema can be obtained from the schema
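For what it's worth, later releases made this more flexible: with the connect()/descriptor API, sketched below roughly as it looked around Flink 1.9 (the topic, schema, and field names are invented), the Avro format can be declared either from a generated record class via recordClass(...) or from a plain schema string via avroSchema(...), so a record class is not strictly required.

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Avro;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;

public class KafkaAvroTableSourceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        tableEnv
            .connect(new Kafka()
                .version("universal")
                .topic("transactions")
                .property("bootstrap.servers", "localhost:9092")
                .startFromEarliest())
            // avroSchema(...) takes a schema string; recordClass(...) is the alternative.
            .withFormat(new Avro()
                .avroSchema("{\"type\":\"record\",\"name\":\"Txn\",\"fields\":["
                    + "{\"name\":\"user_id\",\"type\":\"string\"},"
                    + "{\"name\":\"amount\",\"type\":\"double\"}]}"))
            .withSchema(new Schema()
                .field("user_id", Types.STRING)
                .field("amount", Types.DOUBLE))
            .inAppendMode()
            .registerTableSource("transactions");

        Table result = tableEnv.sqlQuery("SELECT user_id, amount FROM transactions");
    }
}
```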

Re: [DISCUSS] Flink 1.6 features

2018-06-17 Thread Will Du
Agreed, two missing pieces that I think could make Flink more competitive against Spark SQL/Streaming and Kafka Streams: 1. Flink over Hive, or a Flink SQL Hive table source and sink; 2. Flink ML on streams. > On Jun 17, 2018, at 8:34 AM, zhangminglei <18717838...@163.com> wrote: > > Actually, I have been an

Re: Ask for SQL using kafka in Flink

2018-06-04 Thread Will Du
Yes, I am also looking for Kafka Avro table examples in Java and on the command line. Also, a Kafka Avro table sink is still missing. In addition, once we have a Kafka topic, the API should read the schema directly from a schema file or the schema registry. The current API's support lacks

Re: hadoop

2017-08-16 Thread Will Du
Is the Kerberos token expiring without being renewed? > On Aug 16, 2017, at 7:48 PM, Raja.Aravapalli > wrote: > > > Hi, > > I triggered a Flink yarn-session on a running Hadoop cluster… and am triggering a > streaming application on that. > > But, I see after a few days

Re: Flink monitor rest API question

2017-07-19 Thread Will Du
> I will create a JIRA for this. > > Regards, > Chesnay > >> On 19.07.2017 13:31, Will Du wrote: >> Hi folks, >> I am using a Java REST client - the unirest lib - to GET from the Flink REST API to >> get a job status. I got an exception: unsupported content encodin

Flink monitor rest API question

2017-07-19 Thread Will Du
Hi folks, I am using a Java REST client - the unirest lib - to GET a job status from the Flink REST API. I got an exception: unsupported content encoding - UTF8. Do you know how to resolve it? The Postman client works fine. Thanks, Will
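If the client keeps rejecting the non-standard Content-Encoding header, one possible workaround is to drop down to plain HttpURLConnection, which does not try to decode the body based on that header. A minimal sketch (host, port, and the job-ID argument are assumptions):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class FlinkJobStatusClient {
    public static void main(String[] args) throws Exception {
        String jobId = args[0]; // pass the job ID on the command line
        URL url = new URL("http://localhost:8081/jobs/" + jobId);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                body.append(line);
            }
            // The JSON response contains the job's state, vertices, etc.
            System.out.println(body);
        } finally {
            conn.disconnect();
        }
    }
}
```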

Flink read Avro from Kafka Connect issue

2016-11-02 Thread Will Du
Hi folks, I am trying to consume Avro data from Kafka in Flink. The data is produced by Kafka Connect using AvroConverter. I have created an AvroDeserializationSchema.java used by the Flink consumer. Then, I use the following code to
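Data written by Kafka Connect's AvroConverter carries the Confluent Schema Registry framing (a magic byte and a schema ID in front of the Avro payload), so a plain Avro deserializer typically cannot parse it as-is. A hedged sketch of one way to consume such a topic, assuming a much newer Flink than the one in this 2016 thread (the universal Kafka connector plus flink-avro-confluent-registry); the topic name, reader schema, and addresses are placeholders:

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaConnectAvroConsumer {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Reader schema; in practice this matches the schema registered by the connector.
        Schema readerSchema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Event\","
            + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}");

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "flink-avro-test");

        // The deserializer strips the Confluent framing and fetches the writer
        // schema from the registry using the embedded schema ID.
        FlinkKafkaConsumer<GenericRecord> consumer = new FlinkKafkaConsumer<>(
            "connect-topic",
            ConfluentRegistryAvroDeserializationSchema.forGeneric(readerSchema, "http://localhost:8081"),
            props);

        env.addSource(consumer).print();

        env.execute("kafka-connect-avro");
    }
}
```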