[jira] [Created] (FLINK-34873) [Bug] After starting Streaming ELT from MySQL to StarRocks using Flink CDC 3.0, the newly created tables are not being synchronized.

2024-03-20 Thread Flink CDC Issue Import (Jira)
Flink CDC Issue Import created FLINK-34873: -- Summary: [Bug] After starting Streaming ELT from MySQL to StarRocks using Flink CDC 3.0, the newly created tables are not being synchronized. Key: FLINK-34873

[jira] [Created] (FLINK-34865) [enhancement] [StarRocks] When synchronizing tables using Flink CDC 3.x, is it possible to include the comments of the source table's fields when creating the target table

2024-03-20 Thread Flink CDC Issue Import (Jira)
Flink CDC Issue Import created FLINK-34865: -- Summary: [enhancement] [StarRocks] When synchronizing tables using Flink CDC 3.x, is it possible to include the comments of the source table's fields when creating the target table

[jira] [Created] (FLINK-32418) ClassNotFoundException when using flink-protobuf with sql-client

2023-06-22 Thread Michael Kreis (Jira)
Michael Kreis created FLINK-32418: - Summary: ClassNotFoundException when using flink-protobuf with sql-client Key: FLINK-32418 URL: https://issues.apache.org/jira/browse/FLINK-32418 Project: Flink

[jira] [Created] (FLINK-30453) Fix 'can't find CatalogFactory' error when using FLINK sql-client to add table store bundle jar

2022-12-19 Thread yuzelin (Jira)
yuzelin created FLINK-30453: --- Summary: Fix 'can't find CatalogFactory' error when using FLINK sql-client to add table store bundle jar Key: FLINK-30453 URL: https://issues.apache.org/jira/browse/FLINK-30453

[jira] [Created] (FLINK-30359) Encountered NoClassDefFoundError when using flink-sql-connector-elasticsearch6

2022-12-10 Thread Weijie Guo (Jira)
Weijie Guo created FLINK-30359: -- Summary: Encountered NoClassDefFoundError when using flink-sql-connector-elasticsearch6 Key: FLINK-30359 URL: https://issues.apache.org/jira/browse/FLINK-30359 Project

[jira] [Created] (FLINK-30259) Using flink Preconditions Util instead of uncertain Assert keyword to do checking

2022-12-01 Thread Ran Tao (Jira)
Ran Tao created FLINK-30259: --- Summary: Using flink Preconditions Util instead of uncertain Assert keyword to do checking Key: FLINK-30259 URL: https://issues.apache.org/jira/browse/FLINK-30259 Project

[jira] [Created] (FLINK-28414) Using flink to query the hive table, an exception occurred, SQL validation failed. Failed to get table schema from deserializer

2022-07-05 Thread xcrossed (Jira)
xcrossed created FLINK-28414: Summary: Using flink to query the hive table, an exception occurred, SQL validation failed. Failed to get table schema from deserializer Key: FLINK-28414 URL: https://issues.apache.org

Require help regarding possible issue/bug I'm facing while using Flink

2022-03-06 Thread Chia De Xun .
Greetings, I'm facing a difficult issue/bug while working with Flink. Would definitely appreciate some official expert help on this issue. I have posted my problem on StackOverflow, but have no

[jira] [Created] (FLINK-26102) connector test by using 'flink run --python'

2022-02-13 Thread waittting (Jira)
waittting created FLINK-26102: - Summary: connector test by using 'flink run --python' Key: FLINK-26102 URL: https://issues.apache.org/jira/browse/FLINK-26102 Project: Flink Issue Type: Bug

[jira] [Created] (FLINK-25182) NoClassDefFoundError of PulsarAdminImpl by using flink-connector-pulsar:1.14 on k8s flink cluster

2021-12-06 Thread HeYe (Jira)
HeYe created FLINK-25182: Summary: NoClassDefFoundError of PulsarAdminImpl by using flink-connector-pulsar:1.14 on k8s flink cluster Key: FLINK-25182 URL: https://issues.apache.org/jira/browse/FLINK-25182

[jira] [Created] (FLINK-23655) Custom transformation name displayed on the web, when using Flink Table & SQL API, just like the name method of SingleOutputStreamOperator , or remove the SQL content display directly.

2021-08-05 Thread liwei li (Jira)
liwei li created FLINK-23655: Summary: Custom transformation name displayed on the web, when using Flink Table & SQL API, just like the name method of SingleOutputStreamOperator , or remove the SQL content display directly.

[jira] [Created] (FLINK-23567) Hive 1.1.0 failed to write using flink sql 1.3.1 because the JSON class was not found

2021-07-30 Thread wuyang (Jira)
wuyang created FLINK-23567: -- Summary: Hive 1.1.0 failed to write using flink sql 1.3.1 because the JSON class was not found Key: FLINK-23567 URL: https://issues.apache.org/jira/browse/FLINK-23567 Project

Re: Out of Memory Error-Heap when storing parquet files using Flink Table API (Flink version-1.12.0) in Google Cloud Storage

2021-03-25 Thread Xintong Song
On Fri, Mar 26, 2021 at 6:48 AM Sivaraman Venkataraman, Aswin Ram < aswin.ram.sivaraman.venkatara...@sap.com> wrote: > Hi Everyone, > Hope you are doing well. We are currently using Flink Table API (Flink > Version-1.12.0) to stream data from Kafka and store it in Google

Re: Out of Memory Error-Heap when storing parquet files using Flink Table API (Flink version-1.12.0) in Google Cloud Storage

2021-03-25 Thread Sivaraman Venkataraman, Aswin Ram
Hi Everyone, Hope you are doing well. We are currently using Flink Table API (Flink Version-1.12.0) to stream data from Kafka and store it in Google Cloud Storage. The file format we are using to store data is Parquet. Initially the Flink job worked perfectly fine and we were able to stream
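A minimal sketch of the kind of pipeline this thread describes (Kafka source, Parquet files on object storage, Flink 1.12 Table API), not the poster's actual job; the schema, topic name, bucket path, and connector options below are placeholder assumptions.

    // Hedged sketch, not the poster's job: schema, topic, and path are placeholders.
    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.TableEnvironment;

    public class KafkaToParquetSketch {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                    EnvironmentSettings.newInstance().inStreamingMode().build());

            // Kafka source table (JSON payload assumed for illustration).
            tEnv.executeSql(
                    "CREATE TABLE events (id STRING, payload STRING) WITH ("
                            + " 'connector' = 'kafka',"
                            + " 'topic' = 'events',"
                            + " 'properties.bootstrap.servers' = 'localhost:9092',"
                            + " 'scan.startup.mode' = 'earliest-offset',"
                            + " 'format' = 'json')");

            // Filesystem sink writing Parquet files to a GCS bucket (placeholder path).
            tEnv.executeSql(
                    "CREATE TABLE events_parquet (id STRING, payload STRING) WITH ("
                            + " 'connector' = 'filesystem',"
                            + " 'path' = 'gs://my-bucket/events',"
                            + " 'format' = 'parquet')");

            tEnv.executeSql("INSERT INTO events_parquet SELECT id, payload FROM events");
        }
    }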

[jira] [Created] (FLINK-20380) Problems with creating iceberg Catalog (Catalog -type=hive) using flink-SQL shell

2020-11-26 Thread tianyu guo (Jira)
tianyu guo created FLINK-20380: -- Summary: Problems with creating iceberg Catalog (Catalog -type=hive) using flink-SQL shell Key: FLINK-20380 URL: https://issues.apache.org/jira/browse/FLINK-20380

Re: Any python example with json data from Kafka using flink-statefun

2020-06-16 Thread Tzu-Li (Gordon) Tai
(forwarding this to user@ as it is more suited to be located there) Hi Sunil, With remote functions (using the Python SDK), messages sent to / from them must be Protobuf messages. This is a requirement since remote functions can be written in any language, and we use Protobuf as a means for

Re: Any python example with json data from Kafka using flink-statefun

2020-06-16 Thread Sunil
Checking to see if this is possible currently: read JSON data from Kafka topic => process using statefun => write out to Kafka in JSON format. I could have a separate process to read the source JSON data and convert it to Protobuf into another Kafka topic, but that sounds inefficient. e.g. Read json

Re: Any python example with json data from Kafka using flink-statefun

2020-06-15 Thread Sunil Sattiraju
Thanks Igal, I don't have control over the data source inside Kafka (the current Kafka topic contains either JSON or Avro formats only; I am trying to reproduce this scenario using my test data generator). Is it possible to convert the JSON to proto at the receiving end of the statefun application?

Re: Any python example with json data from Kafka using flink-statefun

2020-06-15 Thread Igal Shilman
Hi, The values must be valid encoded Protobuf messages [1], while in your attached code snippet you are sending utf-8 encoded JSON strings. You can take a look at this example with a generator that produces Protobuf messages [2][3] [1]
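A hedged Java sketch of what Igal describes (the thread itself uses Python): the record value written to Kafka must be the serialized bytes of a Protobuf message, not a UTF-8 JSON string. GreetRequest stands in for a generated Protobuf class, and the topic name and broker address are placeholders.

    // Hedged sketch: GreetRequest is a hypothetical generated Protobuf class;
    // topic and bootstrap servers are placeholders. The key point is that the
    // record value is request.toByteArray() (Protobuf bytes), not a JSON string.
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.ByteArraySerializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProtobufEventGenerator {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", ByteArraySerializer.class.getName());

            try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
                GreetRequest request = GreetRequest.newBuilder()
                        .setWho("user-1")
                        .build();
                producer.send(new ProducerRecord<>("names", request.getWho(), request.toByteArray()));
            }
        }
    }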

Any python example with json data from Kafka using flink-statefun

2020-06-15 Thread Sunil Sattiraju
Hi, Based on the example from https://github.com/apache/flink-statefun/tree/master/statefun-examples/statefun-python-greeter-example I am trying to ingest JSON data in Kafka, but am unable to achieve this based on the examples. event-generator.py def produce(): request = {} request['id'] =

[jira] [Created] (FLINK-16791) Could not deploy Yarn job cluster when using flink-s3-fs-hadoop-1.10.0.jar

2020-03-25 Thread yanxiaobin (Jira)
yanxiaobin created FLINK-16791: -- Summary: Could not deploy Yarn job cluster when using flink-s3-fs-hadoop-1.10.0.jar Key: FLINK-16791 URL: https://issues.apache.org/jira/browse/FLINK-16791 Project

Re: Best coding practises guide while programming using flink apis

2019-10-10 Thread Yun Tang
Thanks Terry. I would need some volunteers to speak about their use cases and the best practices they have been following around Flink. ―DK On Sun, 22 Sep 2019 at 5

Re: Best coding practises guide while programming using flink apis

2019-09-22 Thread Terry Wang
Hi, Deepak~ I appreciate your idea and cc to dev mail too. Best, Terry Wang > On Sep 22, 2019, at 2:12 PM, Deepak Sharma wrote: > > Hi All > I guess we need to put some examples in the documentation around best coding > practices, concurrency, non-blocking IO and design patterns while writing >

[jira] [Created] (FLINK-12206) cannot query nested fields using Flink SQL

2019-04-15 Thread Yu Yang (JIRA)
Yu Yang created FLINK-12206: --- Summary: cannot query nested fields using Flink SQL Key: FLINK-12206 URL: https://issues.apache.org/jira/browse/FLINK-12206 Project: Flink Issue Type: Bug

[jira] [Created] (FLINK-10660) Failed index data to Elastic search using Flink-connector-elasticsearch with JUNIT test

2018-10-24 Thread JIRA
孙达 created FLINK-10660: -- Summary: Failed index data to Elastic search using Flink-connector-elasticsearch with JUNIT test Key: FLINK-10660 URL: https://issues.apache.org/jira/browse/FLINK-10660 Project: Flink

Re: Need help regarding MongoDB oplog tailing using flink.

2018-06-05 Thread Chesnay Schepler
Ezmlm, I have gone through the Flink documentation and found it quite interesting, but I am stuck on one task, i.e. MongoDB oplog streaming using Flink. Can you help me figure this out? --- *Amol Suryawanshi* Java Developer am...@iprogrammer.com

Need help regarding MongoDB oplog tailing using flink.

2018-06-05 Thread Amol S - iProgrammer
Hello Ezmlm, I have gone through the Flink documentation and found it quite interesting, but I am stuck on one task, i.e. MongoDB oplog streaming using Flink. Can you help me figure this out? --- *Amol Suryawanshi* Java Developer am...@iprogrammer.com

Re: [Dev] Issue related to using Flink DataSet methods

2017-03-01 Thread Pawan Manishka Gunarathna
and that was done. So before going to do processing using Flink, I need to read that table data. That's what I mean there. My nextRecord() method will return our own data type called *Record*. The following are some sample Records in my data table

Re: [Dev] Issue related to using Flink DataSet methods

2017-03-01 Thread Xingcan Cui
Hi Pawan, @Fabian was right and I thought it was stream environment. Sorry for that. What do you mean by `read the available records of my datasource`? How do you implement the nextRecord() method in DASInputFormat? Best, Xingcan On Wed, Mar 1, 2017 at 4:45 PM, Fabian Hueske

Re: [Dev] Issue related to using Flink DataSet methods

2017-03-01 Thread Fabian Hueske
Hi Pawan, in the DataSet API DataSet.print() will trigger the execution (you do not need to call ExecutionEnvironment.execute()). The DataSet will be printed on the standard out of the process that submits the program. This only works for small DataSets. In general print() should only be used
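A small sketch of the behaviour Fabian describes (assumed class name, element values, and output path, not code from the thread): print() is itself an action that runs the program and prints on the client, while file sinks only become part of the plan and need an explicit execute().

    // Hedged sketch of the print()-vs-execute() behaviour described above.
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class DataSetPrintSketch {
        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
            DataSet<String> records = env.fromElements("a", "b", "c");

            // print() triggers execution itself and prints on the submitting client;
            // no env.execute() is needed (and it is only suitable for small results).
            records.print();

            // File sinks only add operators to the plan; execute() actually runs it.
            records.writeAsText("/tmp/records-out");
            env.execute("write records");
        }
    }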

Re: [Dev] Issue related to using Flink DataSet methods

2017-02-28 Thread Pawan Manishka Gunarathna
Hi, So how can I read the available records of my datasource? I saw in some examples that the print() method will print the available data of that datasource (like files). Thanks, Pawan On Wed, Mar 1, 2017 at 11:30 AM, Xingcan Cui wrote: > Hi Pawan, > > in Flink, most of

Re: [Dev] Issue related to using Flink DataSet methods

2017-02-28 Thread Xingcan Cui
Hi Pawan, in Flink, most of the methods for DataSet (including print()) will just add operators to the plan but not really run it. If the DASInputFormat has no error, you can run the plan by calling environment.execute(). Best, Xingcan On Wed, Mar 1, 2017 at 12:17 PM, Pawan Manishka Gunarathna

[Dev] Issue related to using Flink DataSet methods

2017-02-28 Thread Pawan Manishka Gunarathna
Hi, I have implemented a Flink InputFormat interface related to my datasource. It has our own data type as *Record*. So my class looks as follows: public class DASInputFormat implements InputFormat { } So when I executed the print() method, my console shows the Flink
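For illustration only, a hedged sketch along the lines Pawan describes, using GenericInputFormat (which already handles splits) rather than implementing InputFormat directly; the stubbed String rows stand in for the datasource's own Record type.

    // Hedged sketch, not the actual DASInputFormat: GenericInputFormat only needs
    // reachedEnd() and nextRecord(); the stubbed rows replace the real datasource.
    import org.apache.flink.api.common.io.GenericInputFormat;
    import org.apache.flink.api.java.ExecutionEnvironment;

    public class DASInputFormatSketch extends GenericInputFormat<String> {
        private final String[] rows = {"row-1", "row-2", "row-3"};
        private int next;

        @Override
        public boolean reachedEnd() {
            return next >= rows.length;
        }

        @Override
        public String nextRecord(String reuse) {
            return rows[next++];
        }

        public static void main(String[] args) throws Exception {
            ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();
            // print() triggers execution and shows the three stub records on the client.
            env.createInput(new DASInputFormatSketch()).print();
        }
    }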

Using Flink

2016-10-03 Thread Govindarajan Srinivasaraghavan
Hi, I have a few questions on how I need to model my use case in Flink. Please advise. Thanks for the help. - I'm currently running my Flink program on 1.2-SNAPSHOT with a Kafka source and I have checkpointing enabled. When I look at the consumer offsets in Kafka they appear to be stagnant and there

Re: Using Flink Streaming to write to multiple output files in HDFS

2015-11-09 Thread Robert Metzger
Hey Andra, were you able to answer your questions from Aljoscha's and Fabian's links? Flink's streaming file sink is quite unique (compared to Flume) because it supports exactly-once semantics. Also, the performance compared to Storm is probably much better, so you can save a lot of resources.

Re: Using Flink Streaming to write to multiple output files in HDFS

2015-11-09 Thread Nyamath Ulla Khan
Hi Andra, You can find a very interesting example for Flink streaming with Kafka (input/output): https://flink.apache.org/news/2015/02/09/streaming-example.html. http://dataartisans.github.io/flink-training/exercises/ (contains most of the different operator examples)

Using Flink Streaming to write to multiple output files in HDFS

2015-10-21 Thread Andra Lungu
Hey guys, Long time, no see :). I recently started a new job and it involves performing a set of real-time data analytics using Apache Kafka, Storm and Flume. What happens, on a very high level, is that a set of signals is collected, stored into a Kafka topic, and then Storm is used to filter

Re: Using Flink Streaming to write to multiple output files in HDFS

2015-10-21 Thread Aljoscha Krettek
Hi, the documentation has a guide about the Streaming API: https://ci.apache.org/projects/flink/flink-docs-master/apis/streaming_guide.html This also contains a section about the rolling (HDFS) FileSystem sink:
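A hedged sketch of the rolling HDFS sink that guide covers, assuming the flink-connector-filesystem module of that era (RollingSink was later superseded by BucketingSink/StreamingFileSink); the host, port, base path, bucket format, and batch size are placeholders.

    // Hedged sketch of the rolling (HDFS) file sink; path and bucketing are placeholders.
    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.fs.DateTimeBucketer;
    import org.apache.flink.streaming.connectors.fs.RollingSink;

    public class RollingSinkSketch {
        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
            DataStream<String> signals = env.socketTextStream("localhost", 9999);

            // One bucket (directory) per hour under the base path; files roll at ~384 MB.
            RollingSink<String> sink = new RollingSink<>("hdfs:///data/signals");
            sink.setBucketer(new DateTimeBucketer("yyyy-MM-dd--HH"));
            sink.setBatchSize(1024 * 1024 * 384);

            signals.addSink(sink);
            env.execute("rolling sink sketch");
        }
    }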

Re: Using Flink Streaming to write to multiple output files in HDFS

2015-10-21 Thread Fabian Hueske
There are also training slides and programming exercises (incl. reference solutions) for the DataStream API at --> http://dataartisans.github.io/flink-training/ Cheers, Fabian 2015-10-21 14:03 GMT+02:00 Aljoscha Krettek : > Hi, > the documentation has a guide about the

[jira] [Created] (FLINK-2702) Add an implementation of distributed copying utility using Flink

2015-09-18 Thread Stephan Ewen (JIRA)
Stephan Ewen created FLINK-2702: --- Summary: Add an implementation of distributed copying utility using Flink Key: FLINK-2702 URL: https://issues.apache.org/jira/browse/FLINK-2702 Project: Flink