[jira] [Created] (FLINK-28820) Pulsar Connector PulsarSink performance issue when delivery guarantee is not NONE

2022-08-04 Thread Simon Su (Jira)
Simon Su created FLINK-28820: Summary: Pulsar Connector PulsarSink performance issue when delivery guarantee is not NONE Key: FLINK-28820 URL: https://issues.apache.org/jira/browse/FLINK-28820 Project…
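
For context on where this knob lives: with the connector's builder API, the delivery guarantee is configured on the sink itself, and anything other than NONE adds per-checkpoint flush (or transaction) work on the write path. A minimal sketch, assuming the Flink 1.15-era Pulsar connector API; the broker URLs and topic name are placeholders, not values from the ticket:

    import org.apache.flink.api.common.serialization.SimpleStringSchema;
    import org.apache.flink.connector.base.DeliveryGuarantee;
    import org.apache.flink.connector.pulsar.sink.PulsarSink;
    import org.apache.flink.connector.pulsar.sink.writer.serializer.PulsarSerializationSchema;

    PulsarSink<String> sink = PulsarSink.builder()
            .setServiceUrl("pulsar://localhost:6650")   // placeholder broker URL
            .setAdminUrl("http://localhost:8080")       // placeholder admin URL
            .setTopics("my-topic")                      // placeholder topic
            .setSerializationSchema(
                    PulsarSerializationSchema.flinkSchema(new SimpleStringSchema()))
            // NONE skips the checkpoint-time flush; AT_LEAST_ONCE and
            // EXACTLY_ONCE pay for it at every checkpoint
            .setDeliveryGuarantee(DeliveryGuarantee.AT_LEAST_ONCE)
            .build();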

Checkpoint in FlinkSQL

2019-11-04 Thread Simon Su
Hi All, does the current Flink support setting checkpoint properties while using Flink SQL? For example, the state backend choice, the checkpoint interval, and so on... Thanks, Simon
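
For the archives: around Flink 1.9 these settings were not exposed through SQL statements themselves; one common pattern was to configure the StreamExecutionEnvironment before creating the table environment. A sketch against the 1.9-era API, with an arbitrary local checkpoint path:

    import org.apache.flink.runtime.state.filesystem.FsStateBackend;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.table.api.java.StreamTableEnvironment;

    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    env.enableCheckpointing(60_000L);                                   // checkpoint interval: 60s
    env.setStateBackend(new FsStateBackend("file:///tmp/checkpoints")); // state backend choice
    StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
    // SQL submitted via tEnv now inherits the checkpoint settings above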

Re: RemoteEnvironment cannot execute job from local.

2019-10-31 Thread Simon Su
…on a remote cluster from the IDE you need to first build the jar containing your user code. This jar needs to be passed to createRemoteEnvironment() so that the Flink client knows which jar to upload. Hence, please make sure that /tmp/myudf.jar contains your user code. Cheers, Till On Thu, Oct 31, 20…
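
A minimal end-to-end sketch of that advice; the host is illustrative, 8081 is the default REST port, and /tmp/myudf.jar must be the jar that actually contains the user code:

    import org.apache.flink.api.java.ExecutionEnvironment;

    public class RemoteJob {
        public static void main(String[] args) throws Exception {
            // replace the host with the VM's address; the jar list tells the
            // client which user-code jars to upload to the cluster
            ExecutionEnvironment env = ExecutionEnvironment.createRemoteEnvironment(
                    "192.168.1.10", 8081, "/tmp/myudf.jar");
            // print() on a DataSet triggers execution, so no explicit execute() call
            env.fromElements("hello", "remote", "flink").print();
        }
    }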

RemoteEnvironment cannot execute job from local.

2019-10-31 Thread Simon Su
Hi all, I want to test submitting a job from my local IDE, and I have deployed a Flink cluster in my VM. Here is my code, taken from the Flink 1.9 documentation with some of my own parameters added: public static void main(String[] args) throws Exception { ExecutionEnvironment env = ExecutionEnvironment…

Flink 1.9 build failed

2019-08-26 Thread Simon Su
Hi all, I'm trying to build the Flink 1.9 release branch, and it raises an error like: Could not resolve dependencies for project org.apache.flink:flink-s3-fs-hadoop:jar:1.9-SNAPSHOT: Could not find artifact org.apache.flink:flink-fs-hadoop-shaded:jar:tests:1.9-SNAPSHOT in maven-ali…
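
(A likely cause, assuming the snapshot test-jar is simply absent from the configured mirror rather than the checkout being broken: snapshot artifacts such as the tests classifier of flink-fs-hadoop-shaded are not published to third-party mirrors, so they have to be installed into the local repository first, e.g. by running `mvn clean install -DskipTests` once from the repository root before building individual modules.)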

Re: [Discuss] What should the "Data Source" be translated into Chinese

2019-08-13 Thread Simon Su
I would prefer not to translate Data Source and Data Sink; explaining them in Chinese is enough. Thanks, Simon On 08/13/2019 18:07, wrote: How about translating "data sink" as "数据漕"? 漕 is pronounced cáo; the character's basic meaning is transporting grain by waterway, as in 漕运 (canal grain transport) and 漕粮 (canal-shipped grain). ==> https://baike.baidu.com/item/%E6%BC%95?forcehttps=1%3Ffr%3Dkg_hanyu - Original message - From: Kurt Young To: dev, user-zh

Re: Flink cannot recognized catalog set by registerCatalog.

2019-08-13 Thread Simon Su
OK, thanks Jark. Thanks, Simon On 08/13/2019 14:05, Jark Wu wrote: Hi Simon, this is a temporary workaround for the 1.9 release. We will fix the behavior in 1.10, see FLINK-13461. Regards, Jark On Tue, 13 Aug 2019 at 13:57, Simon Su wrote: Hi Jark, thanks for your reply. It's weird…

Re: Flink cannot recognized catalog set by registerCatalog.

2019-08-12 Thread Simon Su
…the default catalog. To create a table in your custom catalog, you could use tableEnv.sqlUpdate("create table ..."). Thanks, Xuefu On Mon, Aug 12, 2019 at 6:17 PM Simon Su wrote: > Hi Xuefu > > Thanks for your reply. > > Actually I have tried it as you advised. I have…

Re: Flink cannot recognized catalog set by registerCatalog.

2019-08-12 Thread Simon Su
…catalog, you could call tableEnv.useCatalog() and .useDatabase(). As an alternative, you could fully qualify your table name with the "catalog.db.table" syntax without switching the current catalog/database. Please try those and let me know if you find new problems. Thanks, Xuefu On Mon, Aug 12, 20…
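
Putting the two suggestions together, a sketch against the 1.9-era Table API; "ca1", "db1" and "t1" are example names from the thread, and myCatalog stands for whatever Catalog implementation is being registered:

    // assumes a TableEnvironment tEnv and a Catalog myCatalog already exist
    tEnv.registerCatalog("ca1", myCatalog);
    tEnv.useCatalog("ca1");                   // switch the current catalog
    tEnv.useDatabase("db1");                  // switch the current database
    tEnv.sqlUpdate("CREATE TABLE t1 (...)");  // "(...)" is a placeholder for real DDL; lands in ca1.db1
    // or, without switching, fully qualify the name:
    tEnv.sqlQuery("SELECT * FROM ca1.db1.t1");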

Flink cannot recognized catalog set by registerCatalog.

2019-08-12 Thread Simon Su
Hi All, I want to use a custom catalog named "ca1" and create a database under this catalog. When I submit the SQL, it raises an error like: Exception in thread "main" org.apache.flink.table.api.ValidationException: SQL validation failed. From line 1, column 98 to…

[jira] [Created] (FLINK-13492) BoundedOutOfOrderTimestamps cause Watermark's timestamp leak

2019-07-30 Thread Simon Su (JIRA)
Simon Su created FLINK-13492: Summary: BoundedOutOfOrderTimestamps cause Watermark's timestamp leak Key: FLINK-13492 URL: https://issues.apache.org/jira/browse/FLINK-13492 Project: Flink Issue…
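
(Background for readers: the bounded-out-of-orderness strategy emits watermarks that trail the largest observed timestamp by a fixed delay, i.e. watermark = maxTimestamp - delay. A sketch of the equivalent DataStream-API extractor, where Event is a hypothetical POJO with an event-time field:)

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
    import org.apache.flink.streaming.api.windowing.time.Time;

    DataStream<Event> withTimestamps = events.assignTimestampsAndWatermarks(
            new BoundedOutOfOrdernessTimestampExtractor<Event>(Time.seconds(5)) {
                @Override
                public long extractTimestamp(Event e) {
                    return e.timestampMillis;  // event-time field of the hypothetical POJO
                }
            });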

Re: [DISCUSS] Removing the flink-mapr-fs module

2019-07-29 Thread Simon Su
+1 to remove it. Thanks, Simon On 07/29/2019 21:00, Till Rohrmann wrote: +1 to remove it. On Mon, Jul 29, 2019 at 1:27 PM Stephan Ewen wrote: +1 to remove it. One should still be able to use MapR in the same way as any other vendor Hadoop distribution. On Mon, Jul 29, 2019 at 12:22 PM…