Github user twalthr commented on the issue:
https://github.com/apache/flink/pull/2653
Thanks. Merging...
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Thanks @twalthr , I will add the documentation ASAP.
Github user twalthr commented on the issue:
https://github.com/apache/flink/pull/2653
@wuchong thanks for updating the PR! It looks good now. I will rebase it,
quickly scan over the code again and merge it tomorrow.
However, we should fix FLINK-5223 as soon as possible. It wou
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @fhueske @twalthr , I have updated the PR, please review it again when
you are available.
Github user twalthr commented on the issue:
https://github.com/apache/flink/pull/2653
I have already reviewed half of the new code. I will review the rest
tomorrow.
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @fhueske @twalthr , could you have a look at this PR again? I have fixed
the conflicts again and squashed the commits.
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @fhueske @twalthr , I have addressed all the comments and made the
following changes:
1. Forbid a `TableFunction` implemented as a Scala object, since `collect` is
called on a singleton. It
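The singleton hazard can be sketched without Flink on the classpath. `TableFunctionLike` below is a hypothetical stand-in for Flink's `TableFunction` (the real class buffers rows emitted via `collect` until the runtime operator drains them); the only point is that a Scala `object` gives every caller the same buffer:

```scala
import scala.collection.mutable.ArrayBuffer

// Hypothetical stand-in for Flink's TableFunction: emitted rows are
// buffered via collect() and later drained by the runtime operator.
abstract class TableFunctionLike[T] {
  private val buffer = ArrayBuffer.empty[T]
  protected def collect(value: T): Unit = buffer += value
  def drain(): List[T] = { val out = buffer.toList; buffer.clear(); out }
}

// A Scala `object` is a JVM-wide singleton: every query referencing
// Split shares this single instance, and therefore this single buffer.
object Split extends TableFunctionLike[String] {
  def eval(s: String): Unit = s.split("#").foreach(collect)
}

// Two independent "operator instances" interleave on the singleton:
Split.eval("a#b") // meant for operator 1
Split.eval("x#y") // meant for operator 2
val op1Rows = Split.drain()
println(op1Rows) // List(a, b, x, y) -- operator 1 also sees operator 2's rows
```

A class-based `TableFunction` avoids this because each operator gets its own instance, and with it its own buffer.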
Github user fhueske commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @wuchong, I think @twalthr's main concerns are not about the parser, but
rather that expressions and logical nodes are mixed, especially in the Scala
Table API where the `TableFunction` is implici
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Regarding the mixing of parsed Expressions and LogicalNodes, how about
creating a `LogicalParser` that parses a string into a LogicalNode? This
can separate expressions and logical nodes, and k
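The separation being proposed can be sketched as two small, disjoint hierarchies with a dedicated parser for each. All names here (`ExpressionParser`, `LogicalParser`, `FieldRef`, `Scan`, `Project`) are invented for illustration, not the actual flink-table classes:

```scala
// Expressions and logical nodes live in distinct hierarchies.
sealed trait Expression
final case class FieldRef(name: String) extends Expression

sealed trait LogicalNode
final case class Scan(table: String) extends LogicalNode
final case class Project(fields: List[Expression], child: LogicalNode) extends LogicalNode

object ExpressionParser {
  // Parses "a, b" into field references -- never into plan nodes.
  def parse(s: String): List[Expression] =
    s.split(",").map(f => FieldRef(f.trim)).toList
}

object LogicalParser {
  // Parses a plan fragment into a LogicalNode, delegating the field
  // list to ExpressionParser so the two concerns stay separate.
  def parseProject(fields: String, input: LogicalNode): LogicalNode =
    Project(ExpressionParser.parse(fields), input)
}

val plan = LogicalParser.parseProject("a, b", Scan("t"))
println(plan) // Project(List(FieldRef(a), FieldRef(b)),Scan(t))
```

With this split, a string-based API call only ever reaches LogicalNodes through `LogicalParser`, so expression parsing cannot leak plan construction (and vice versa).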
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Thank you @fhueske @twalthr for the review, I will update the PR in this
weekend.
Github user fhueske commented on the issue:
https://github.com/apache/flink/pull/2653
I think we should convert the tests for the Java String-based Table API to
check the logical plans of queries against the logical plan of a corresponding
Scala Expression-based Table API query.
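The idea of comparing plans instead of running full IT cases can be sketched as follows. The toy `Plan` type and the two front ends are invented stand-ins; the real test would compare the flink-table logical plans of the two API variants:

```scala
// A toy logical plan shared by both front ends.
sealed trait Plan
final case class TableScan(table: String) extends Plan
final case class Select(fields: List[String], child: Plan) extends Plan

// "Java-style" String-based API: fields arrive as one string, e.g. "a, b".
def selectJava(fields: String, child: Plan): Plan =
  Select(fields.split(",").map(_.trim).toList, child)

// "Scala-style" Expression-based API: fields arrive as typed values.
def selectScala(fields: List[String], child: Plan): Plan =
  Select(fields, child)

// The unit test compares logical plans instead of executing both queries:
val javaPlan  = selectJava("a, b", TableScan("t"))
val scalaPlan = selectScala(List("a", "b"), TableScan("t"))
assert(javaPlan == scalaPlan)
println("plans are equal")
```

If the Scala Expression-based query is itself covered by an IT case, the Java String-based variant only needs this cheap equality check rather than a second cluster run.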
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @fhueske , you mentioned two ways to reduce IT cases. One is comparing
the logical plans of two tables, which can reduce the Java IT cases. The other
is using the `TableTestBase` tool to write unit tests, th
Github user fhueske commented on the issue:
https://github.com/apache/flink/pull/2653
Maybe one more thing to add. @twalthr recently added some tooling to write
unit tests that check the translation from SQL / Table API to `DataSetRel` and
`DataStreamRel` nodes, i.e., exclude the fina
Github user twalthr commented on the issue:
https://github.com/apache/flink/pull/2653
@wuchong thanks for the PR. I will also review it the next days.
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Sounds great. Thanks, Fabian.
Github user fhueske commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @wuchong, I did a first high-level pass over the PR. From what I've seen
it looks really good and I do not expect that major changes are necessary.
Will do a more thorough review in the next
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @sunjincheng121 , thank you for the review. I will update the PR
according to your comments.
Github user fhueske commented on the issue:
https://github.com/apache/flink/pull/2653
Thanks for the update @wuchong! Will have a look at this PR.
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
@fhueske I have updated this PR with the following changes.
1. Remove CROSS/OUTER APPLY support in SQL
2. Change Java Table API from `.crossApply("split(c)", "s")` to
`.crossApply("split(c
Github user shaoxuan-wang commented on the issue:
https://github.com/apache/flink/pull/2653
Jack,
Giving this a second thought, I feel we can just use LATERAL and
INNER/LEFT JOIN (which are already supported by Calcite) instead of
introducing "CROSS/OUTER APPLY". Let u
Github user fhueske commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @wuchong, thanks for this feature. This looks really interesting :-)
I haven't looked at the PR in detail yet. However, I would like to avoid
customizing the parser if possible. I haven't
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @fhueske @twalthr , do you have any thoughts on this pull request?
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Because SQL doesn't have a standard user-defined table function syntax, we
introduce MS SQL Server's `CROSS/OUTER APPLY` to support UDTFs, see
https://technet.microsoft.com/en-us/library/ms175156(v=sql.105).a
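The intended semantics can be sketched on plain Scala collections, independent of the PR's actual runtime (`split` here is a stand-in table function, not the PR's implementation): CROSS APPLY drops input rows for which the function emits nothing, while OUTER APPLY keeps them and pads the function's columns with null.

```scala
// Stand-in table function: zero or more output rows per input value.
def split(c: String): Seq[String] =
  if (c.isEmpty) Seq.empty else c.split("#").toSeq

val rows = Seq(("r1", "a#b"), ("r2", ""))

// CROSS APPLY: like an inner correlated join -> r2 disappears.
val crossApply = rows.flatMap { case (id, c) => split(c).map(s => (id, s)) }

// OUTER APPLY: like a left correlated join -> r2 kept, padded with null.
val outerApply = rows.flatMap { case (id, c) =>
  val out = split(c)
  if (out.isEmpty) Seq((id, null)) else out.map(s => (id, s))
}

println(crossApply) // List((r1,a), (r1,b))
println(outerApply) // List((r1,a), (r1,b), (r2,null))
```

This is the same inner/left distinction that the LATERAL-join formulation discussed later in the thread expresses without new parser syntax.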
Github user wuchong commented on the issue:
https://github.com/apache/flink/pull/2653
Hi @twalthr @fhueske , it will be great if you can review this.