Re: [DISCUSS] Hive dialect shouldn't fall back to Flink's default dialect

2023-05-29 Thread Rui Li
> Looking forward to your feedback. I'd love to hear the feedback from the
> community to take the next steps.
>
> [1]: https://github.com/apache/flink/blob/678370b18e1b6c4a23e5ce08f8efd05675a0cc17/flink-connectors/flink-connector-hive/src/main/java/org/apache/flink/table/planner/delegation/hive/HiveParser.java#L348
> [2]: https://issues.apache.org/jira/browse/FLINK-26681
> [3]: https://issues.apache.org/jira/browse/FLINK-31413
> [4]: https://issues.apache.org/jira/browse/FLINK-30064
>
> Best regards,
> Yuxia

--
Best,
Benchao Li

--
Best regards!
Rui Li

Re:

2021-10-14 Thread Rui Li
> ... <https://blog.csdn.net/weibokong789/article/details/106427481> and got
> the same result (no real idea what I'm doing here, just trying some things).
>
> Is there something more I have to do to use HiveCatalog with a kerberized
> Hive Metastore? Should Flink support this out of the box?
>
> Thanks!
> - Andrew Otto
>   SRE, Wikimedia Foundation

--
Best regards!
Rui Li
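[Editor's note] For context on the question above: pointing a HiveCatalog at a kerberized Metastore generally needs both Flink's Kerberos login settings and SASL enabled on the Hive client side. A minimal sketch follows; the keytab path and principals are placeholders, not values from this thread.

```yaml
# flink-conf.yaml -- Flink-side Kerberos login (placeholder paths/principals)
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /etc/security/keytabs/flink.keytab
security.kerberos.login.principal: flink/host.example.com@EXAMPLE.COM
```

In addition, the hive-site.xml visible to Flink would typically set `hive.metastore.sasl.enabled` to `true` and `hive.metastore.kerberos.principal` to the Metastore's principal, so that the HiveCatalog client authenticates via SASL.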

Re: [ANNOUNCE] Apache Flink 1.13.0 released

2021-05-06 Thread Rui Li
>> distributed, high-performing, always-available, and accurate data
>> streaming applications.
>>
>> The release is available for download at:
>> https://flink.apache.org/downloads.html
>>
>> Please check out the release blog post for an overview of the
>> improvements for this release:
>> https://flink.apache.org/news/2021/05/03/release-1.13.0.html
>>
>> The full release notes are available in Jira:
>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12349287
>>
>> We would like to thank all contributors of the Apache Flink community
>> who made this release possible!
>>
>> Regards,
>> Guowei & Dawid

--
Best regards!
Rui Li

Re: Using Hive UDFs

2021-04-28 Thread Rui Li
>> ... Function flink_gaia.ST_GeomFromText already exists in Catalog flink-hive.
>>   at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalogFunction(TableEnvironmentImpl.java:1459)
>>   at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironme

Re: Using Hive UDFs

2021-04-27 Thread Rui Li
> ...812)
>   at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
>   at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
>   at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
>   at java.security.AccessController.doPrivileged(Native Method)
>   at javax.security.

Re: Using Hive UDFs

2021-04-27 Thread Rui Li
>>   at javax.security.auth.Subject.doAs(Subject.java:422)
>>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1682)
>>   at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>   at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
>> Caused by: org.apache.flink.table.api.ValidationException: Function
>> flink_gaia.ST_GeomFromText already exists in Catalog flink-hive.
>>   at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalogFunction(TableEnvironmentImpl.java:1459)
>>   at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeOperation(TableEnvironmentImpl.java:1009)
>>   at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:666)
>>   at com.skt.chiron.FlinkApp.main(FlinkApp.java:58)
>>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>   at java.lang.reflect.Method.invoke(Method.java:498)
>>   at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:349)
>>   ... 11 more
>> (snip)
>>
>> I hope to find out why the functions are missing. The Flink (1.12.2) job
>> cluster is running on a Kubernetes cluster via the flink operator, and the
>> standalone metastore runs only for the Flink cluster, without a Hive
>> deployment.
>>
>> Thanks,
>> Youngwoo
>>
>> 1. https://github.com/Esri/spatial-framework-for-hadoop

--
Best regards!
Rui Li
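[Editor's note] The ValidationException above simply means the function was registered twice. A hedged Flink SQL sketch of one way around it; the implementation class name is assumed from the Esri project linked in the message, not confirmed in the thread:

```sql
-- Drop any stale registration first, then register the Hive UDF again.
-- The class below is an assumption based on the Esri project layout.
DROP FUNCTION IF EXISTS flink_gaia.ST_GeomFromText;
CREATE FUNCTION flink_gaia.ST_GeomFromText
  AS 'com.esri.hadoop.hive.ST_GeomFromText';
```

Alternatively, `CREATE FUNCTION IF NOT EXISTS ...` skips registration when the function already exists instead of failing.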

Re: Why is Hive dependency flink-sql-connector-hive not available on Maven Central?

2021-04-06 Thread Rui Li
> I am able to find the jar from Maven Central. See updates in the
> StackOverflow post.
>
> Thank you!
>
> Best,
> Yik San
>
> On Tue, Apr 6, 2021 at 4:05 PM Tzu-Li (Gordon) Tai wrote:
>> Hi,
>>
>> I'm pulling in Rui Li (cc'ed)

Re: Is there a way to avoid submit hive-udf's resources when we submit a job?

2020-09-22 Thread Rui Li
> ...avoid submitting the udf's resources when we submit a job. Is it possible?
>
> --
> Sent from:
> http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/

--
Cheers,
Rui Li

Re: [ANNOUNCE] New PMC member: Dian Fu

2020-08-27 Thread Rui Li
>>> Please join me in congratulating Dian Fu for becoming a Flink PMC Member!
>>>
>>> Best,
>>> Jincheng (on behalf of the Flink PMC)

--
Best regards!
Rui Li

Re: [ANNOUNCE] Apache Flink 1.11.1 released

2020-07-22 Thread Rui Li
>> Apache Flink® is an open-source stream processing framework for
>> distributed, high-performing, always-available, and accurate data
>> streaming applications.
>>
>> The release is available for download at:
>> https://flink.apache.org/downloads.html
>>
>> Please check out the release blog post for an overview of the
>> improvements for this bugfix release:
>> https://flink.apache.org/news/2020/07/21/release-1.11.1.html
>>
>> The full release notes are available in Jira:
>> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12348323
>>
>> We would like to thank all contributors of the Apache Flink community
>> who made this release possible!
>>
>> Regards,
>> Dian

--
Konstantin Knauf
https://twitter.com/snntrable
https://github.com/knaufk

--
Best, Jingsong Lee

--
Best regards!
Rui Li

Re: FileNotFoundException when writting Hive orc tables

2020-07-21 Thread Rui Li
> ...orc table has no complex (list, map, row) types, you can try to set
> `table.exec.hive.fallback-mapred-writer` to false in TableConfig. The Hive
> sink will then use the native ORC writer, which is a workaround.
>
> About this error, I think this is a bug for Hive 1.1 ORC. I will try to
> re-pro
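[Editor's note] The workaround quoted above can be applied as a session option from the SQL client; a sketch (the option name is the one given in the reply, the syntax is the standard SET statement):

```sql
-- Disable the mapred fallback so the Hive sink uses the native ORC writer
SET table.exec.hive.fallback-mapred-writer=false;
```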

Re: FileNotFoundException when writting Hive orc tables

2020-07-21 Thread Rui Li
> ...reaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:755)
>   at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
>   at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
>   at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:201)
>
> Best,
> Paul Lam

--
Best regards!
Rui Li

Re: [ANNOUNCE] Apache Flink 1.11.0 released

2020-07-07 Thread Rui Li
> ...thank all contributors of the Apache Flink community who made
> this release possible!
>
> Cheers,
> Piotr & Zhijiang

--
Best regards!
Rui Li

Re: Not able to implement an usecase

2020-05-12 Thread Rui Li
> AFAIK, yes, you can write streams.
>
> I'm pulling in Jingsong Li and Rui Li as they might know better.
>
> Regards,
> Roman
>
> On Mon, May 11, 2020 at 10:21 PM Jaswin Shah wrote:
>> If I go with table apis, can I write the streams to hive or is it only

Re: How to to in Flink to support below HIVE SQL

2020-04-19 Thread Rui Li
> ...planner also supports this in both streaming mode and batch mode.
> RLIKE => The Blink planner has a built-in REGEXP [1] function which I think
> is similar to Hive's RLIKE.
> LATERAL VIEW => This is called a UDTF in Flink; see how to use a UDTF in the
> docs [3], "Join with Table Funct
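[Editor's note] A rough sketch of the mapping described above, with a made-up table `logs` and a hypothetical UDTF `split_udtf` (neither comes from the thread):

```sql
-- Hive: WHERE msg RLIKE '^foo.*'   ->   Flink (Blink planner):
SELECT * FROM logs WHERE REGEXP(msg, '^foo.*');

-- Hive: LATERAL VIEW explode(...)  ->   Flink: join with a table function:
SELECT t.id, w.word
FROM logs AS t,
     LATERAL TABLE(split_udtf(t.msg)) AS w(word);
```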

Re: [ANNOUNCE] Jingsong Lee becomes a Flink committer

2020-02-20 Thread Rui Li
> ...very active in both the dev and user mailing lists, helped discuss designs
> and answer users' questions, and also helped to verify various releases.
>
> Congratulations Jingsong!
>
> Best, Kurt
> (on behalf of the Flink PMC)

--
Best regards!
Rui Li

Re: 1.9 timestamp type default

2020-02-13 Thread Rui Li
> Per https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/table/types.html#timestamp,
> the default Java bridging type for TIMESTAMP is java.time.LocalDateTime. Is
> there a setting that can change it to use java.sql.Timestamp instead?
>
> Thanks,
> Fanbin

--
Best regards!
Rui Li

Re: [ANNOUNCE] Dian Fu becomes a Flink committer

2020-01-16 Thread Rui Li
> ...the release.
>
> Please join me in congratulating Dian for becoming a Flink committer!
>
> Best,
> Jincheng (on behalf of the Flink PMC)

--
Best regards!
Rui Li

Re: [DISCUSS] Set default planner for SQL Client to Blink planner in 1.10 release

2020-01-05 Thread Rui Li
>> ...to the blink planner manually every time they start the SQL CLI. And it's
>> surprising to see an unsupported-operation exception if they try out the new
>> features but haven't switched planners.
>>
>> The SQL CLI is a very important entrypoint for trying out new features and
>> prototyping for users. In order to give the new planner more exposure, I
>> would like to suggest setting the default planner for the SQL Client to the
>> Blink planner before the 1.10 release.
>>
>> The approach is just changing the default SQL CLI yaml configuration [5]. In
>> this way, the existing environment is still compatible and unaffected.
>>
>> Changing the default planner for the whole Table API & SQL is another topic
>> and is out of scope of this discussion.
>>
>> What do you think?
>>
>> Best,
>> Jark
>>
>> [1]: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/streaming/joins.html#join-with-a-temporal-table
>> [2]: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/sql/queries.html#top-n
>> [3]: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/sql/queries.html#deduplication
>> [4]: https://ci.apache.org/projects/flink/flink-docs-master/dev/table/tuning/streaming_aggregation_optimization.html
>> [5]: https://github.com/apache/flink/blob/master/flink-table/flink-sql-client/conf/sql-client-defaults.yaml#L100

--
Best, Jingsong Lee

--
Best Regards
Jeff Zhang

--
Benoît Paris
Explainable Machine Learning Engineer
Tel: +33 6 60 74 23 00
http://benoit.paris
http://explicable.ml

--
Benchao Li
School of Electronics Engineering and Computer Science, Peking University
Tel: +86-15650713730
Email: libenc...@gmail.com; libenc...@pku.edu.cn

--
Best regards!
Rui Li
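[Editor's note] The change proposed above amounts to one line in sql-client-defaults.yaml [5]; roughly (a sketch, not the exact file contents):

```yaml
execution:
  # switch the SQL Client default from the legacy planner to Blink
  planner: blink
  type: streaming
```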

Re: using thin jar to replace fat jar on yarn cluster mode

2019-12-22 Thread Rui Li
> ...the task jar will be tens of KB.
>
> --
> zjfpla...@hotmail.com

--
Best regards!
Rui Li

Re: Error "Failed to load native Mesos library from" when I run Flink on a compiled version of Apache Mesos

2019-09-17 Thread Rui Li
> Failed to load native Mesos library from
> /usr/java/packages/lib/amd64:/usr/lib/x86_64-linux-gnu/jni:/lib/x86_64-linux-gnu:/usr/lib/x86_64-linux-gnu:/usr/lib/jni:/lib:/usr/lib
>
> Thanks,
> Felipe
>
> --
> -- Felipe Gutierrez
> -- skype: felipe.o.gutierrez
> -- https://felipeogutierrez.blogspot.com

--
Best regards!
Rui Li

Re: Extending Flink's SQL-Parser

2019-09-16 Thread Rui Li
> ...me adding a new SELECT syntax? I already came across the parser extension
> test, but it didn't give me the answers I was looking for.
>
> Thanks for your help!
>
> Regards,
> Dominik Gröninger

--
Best regards!
Rui Li