Hi all,
When I use SQL with a UDTF, calling the tableEnv.sqlQuery() method throws
the following error: "Rowtime attributes must not be in the input rows of a
regular join. As a workaround you can cast the time attributes of input
tables to TIMESTAMP before." I used the to_timestamp function
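The workaround named in the error message itself can be sketched like this (a rough sketch; the table and column names are hypothetical, with `orders.ts` standing in for the rowtime attribute):

```sql
-- Hypothetical tables: `orders` has a rowtime attribute `ts`.
-- Casting it to TIMESTAMP materializes it and strips the time-attribute
-- property, so the regular join below is accepted.
SELECT o.id, o.ts_plain, r.rate
FROM (
  SELECT id, currency, CAST(ts AS TIMESTAMP(3)) AS ts_plain
  FROM orders
) o
JOIN rates r ON o.currency = r.currency;
```

After the cast the column is an ordinary TIMESTAMP, so it can no longer drive event-time operations downstream of the join.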
uh... OK, thanks! 😅
-0xe1a
On Fri, Dec 18, 2020 at 11:20 AM Arvid Heise wrote:
> Hi Alex,
>
> not entirely sure how you reached your conclusion but afaik side output is
> dispatched through the output tag.
>
> There are even tests in the code base [1] that use multiple outputs of the
> same type.
Hi Alex,
not entirely sure how you reached your conclusion but afaik side output is
dispatched through the output tag.
There are even tests in the code base [1] that use multiple outputs of the
same type.
[1]
https://github.com/apache/flink/blob/1a08548a209167cafeeba1ce602fe8d542994be5/flink-str
Hey folks,
I have a program that demultiplexes input records from a shared prefix
stream onto some number of suffix streams, which are allocated on boot
based on configuration.
At the moment I'm just duplicating the input records, and filtering out the
wrong records in each suffix stream, but it'
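Side outputs (as suggested elsewhere in this thread) dispatch each record once, through its tag, instead of copying the whole stream and filtering in every suffix stream. Stripped of Flink's API, the dispatch idea is just the following (plain Java, no Flink dependencies; all names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Tag-based demultiplexer: each record is delivered only to the sink
// registered for its tag, rather than being duplicated onto every
// suffix stream and filtered there.
class Demux<T> {
    private final Map<String, Consumer<T>> sinks = new HashMap<>();

    void register(String tag, Consumer<T> sink) {
        sinks.put(tag, sink);
    }

    void emit(String tag, T record) {
        Consumer<T> sink = sinks.get(tag);
        if (sink != null) {
            sink.accept(record); // records with unknown tags are dropped
        }
    }
}
```

In Flink itself the tag is an `OutputTag<T>`, `emit` corresponds to `ctx.output(tag, record)` inside a `ProcessFunction`, and each suffix stream is obtained with `stream.getSideOutput(tag)`.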
Hello,
Unfortunately, this driver is not currently supported by the Table API [1].
You can implement a dialect for it [2] and construct JdbcTableSource [3]
manually.
Alternatively, you can switch to the DataStream API and use JdbcInputFormat
[4], which doesn't require a dialect.
I'm also pulling in
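A rough sketch of the JdbcInputFormat route mentioned above (the builder methods are from flink-connector-jdbc; the Snowflake driver class name, URL, credentials, and query here are assumptions, not tested values):

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.api.java.typeutils.RowTypeInfo;
import org.apache.flink.connector.jdbc.JdbcInputFormat;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SnowflakeJdbcRead {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // No dialect needed: the input format only needs a JDBC driver,
        // a URL, a query, and the row type of the result.
        JdbcInputFormat input = JdbcInputFormat.buildJdbcInputFormat()
                .setDrivername("net.snowflake.client.jdbc.SnowflakeDriver") // assumed driver class
                .setDBUrl("jdbc:snowflake://<account>.snowflakecomputing.com/") // placeholder URL
                .setUsername("...")
                .setPassword("...")
                .setQuery("SELECT id, name FROM my_table") // hypothetical query
                .setRowTypeInfo(new RowTypeInfo(Types.INT, Types.STRING))
                .finish();

        env.createInput(input).print();
        env.execute("snowflake-jdbc-read");
    }
}
```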
Hello,
I'm trying to create a `StreamTableSource` for Snowflake using
`JdbcTableSourceSinkFactory.createStreamTableSource` (in package
org.apache.flink.connector.jdbc.table) but it fails with the following
error message due to `JdbcDialects` not having a dialect for
Snowflake.
My goal is to fully
Thanks a lot to everyone who has contributed to this release and in
particular to Gordon and Xintong who did a great job.
Cheers,
Till
On Fri, Dec 18, 2020 at 12:34 PM Paul Lam wrote:
> Well done! Thanks to Gordon and Xintong, and everyone that contributed to
> the release.
>
> Best,
> Paul Lam
Well done! Thanks to Gordon and Xintong, and everyone that contributed to the
release.
Best,
Paul Lam
> On Dec 18, 2020, at 19:20, Xintong Song wrote:
>
> The Apache Flink community is very happy to announce the release of Apache
> Flink 1.11.3, which is the third bugfix release for the Apache Flink 1.11
The Apache Flink community is very happy to announce the release of Apache
Flink 1.11.3, which is the third bugfix release for the Apache Flink 1.11
series.
Apache Flink® is an open-source stream processing framework for
distributed, high-performing, always-available, and accurate data streaming
applications.
Just a quick idea for using C. I'd create a generic Java UDF and use JNA
[1] in it to call a function written in C.
UDF could be something like invokeC("libraryName", "functionName", args)
and would get JNA's Function [2] and invoke it accordingly.
[1] https://github.com/java-native-access/jna
[2
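The invokeC idea above could be sketched as follows (a rough sketch, assuming the JNA jar is on the classpath; the wrapper name follows the email's example, and the libc call is purely illustrative):

```java
import com.sun.jna.Function;

// Sketch of a generic "invokeC" helper: resolve a C function by library
// and symbol name through JNA, then invoke it with the given arguments.
// In a real Flink UDF this body would live inside a ScalarFunction's
// eval(...) method. Requires the JNA jar on the classpath.
public class CInvoker {

    static Object invokeC(String libraryName, String functionName,
                          Class<?> returnType, Object... args) {
        Function f = Function.getFunction(libraryName, functionName);
        return f.invoke(returnType, args);
    }

    public static void main(String[] args) {
        // Illustration only: call abs() from the C runtime. "c" resolves
        // to libc on Linux; the library name mapping is platform-dependent.
        System.out.println(invokeC("c", "abs", Integer.class, -42));
    }
}
```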
Hi Jiazhi,
Could you share table definitions and both queries?
Regards,
Roman
On Fri, Dec 18, 2020 at 4:39 AM ゞ野蠻遊戲χ wrote:
> Hi all
> When I use SQL with a UDTF, calling the tableEnv.sqlQuery()
> method throws the following error: Rowtime attributes must not be in the
> input rows
The stacktrace looks similar to
https://issues.apache.org/jira/browse/HIVE-14483
However, it should be fixed in the version used in your setup.
Jingsong Li, can you take a look at this error?
Regards,
Roman
On Thu, Dec 17, 2020 at 3:57 PM house-张浩 <312421...@qq.com> wrote:
> when i use pyflink