Timo Walther created FLINK-12924:
Summary: Introduce basic type inference interfaces
Key: FLINK-12924
URL: https://issues.apache.org/jira/browse/FLINK-12924
Project: Flink
Issue Type: Sub
Timo Walther created FLINK-12899:
Summary: Introduce a resolved expression with data type
Key: FLINK-12899
URL: https://issues.apache.org/jira/browse/FLINK-12899
Project: Flink
Issue Type
Timo Walther created FLINK-12874:
Summary: Improve the semantics of zero length character strings
Key: FLINK-12874
URL: https://issues.apache.org/jira/browse/FLINK-12874
Project: Flink
Issue
Timo Walther created FLINK-12769:
Summary: Simplify expression design for symbols
Key: FLINK-12769
URL: https://issues.apache.org/jira/browse/FLINK-12769
Project: Flink
Issue Type: New
Timo Walther created FLINK-12726:
Summary: Fix ANY type serialization
Key: FLINK-12726
URL: https://issues.apache.org/jira/browse/FLINK-12726
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-12711:
Summary: Separate function implementation and definition
Key: FLINK-12711
URL: https://issues.apache.org/jira/browse/FLINK-12711
Project: Flink
Issue Type
Timo Walther created FLINK-12710:
Summary: Unify built-in and user-defined functions in the API
Key: FLINK-12710
URL: https://issues.apache.org/jira/browse/FLINK-12710
Project: Flink
Issue
Thanks for being the release managers, Kurt and Gordon!
From the Table & SQL API side, there are still a lot of open issues
that need to be solved to decouple the API from a planner and enable the
Blink planner. Also we need to make sure that the Blink planner supports
at least everything of
I quickly scanned the changes and could not spot any issues.
+1
On 27.05.19 at 13:36, Chesnay Schepler wrote:
+1
* git tag exists
* no binaries in release
* relocated jackson no longer bundled twice in hadoop jars
* jackson dependency tree exists
* netty-tcnative-static not part of release
*
Timo Walther created FLINK-12566:
Summary: Remove row interval type
Key: FLINK-12566
URL: https://issues.apache.org/jira/browse/FLINK-12566
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-12393:
Summary: Add the user-facing classes of the new type system
Key: FLINK-12393
URL: https://issues.apache.org/jira/browse/FLINK-12393
Project: Flink
Issue
on a way to committership.
@Konstantin: Trivial fixes (typos, docs, javadocs, ...) should still be
possible as "hotfixes".
On Mon, Apr 15, 2019 at 3:14 PM Timo Walther
wrote:
I think this really depends on the contribution.
Sometimes "triviality" means that people just wan
Hi Fabian,
I think the mentioning of FLIP-32 is still valid as it has not yet been
completed. I would like to add more details to it and also add more
description about FLINK-11439. Big ongoing refactorings such as the type
system rework should also be mentioned there. I will prepare a
Timo Walther created FLINK-12254:
Summary: Expose the new type system through the API
Key: FLINK-12254
URL: https://issues.apache.org/jira/browse/FLINK-12254
Project: Flink
Issue Type: Sub
Timo Walther created FLINK-12253:
Summary: Setup a class hierarchy for the new type system
Key: FLINK-12253
URL: https://issues.apache.org/jira/browse/FLINK-12253
Project: Flink
Issue Type
Timo Walther created FLINK-12251:
Summary: Rework the Table API & SQL type system
Key: FLINK-12251
URL: https://issues.apache.org/jira/browse/FLINK-12251
Project: Flink
Issue
?
Looking forward to this change and would love to contribute in anyway I can!
Best,
Rong
On Thu, Mar 28, 2019 at 3:25 AM Timo Walther wrote:
Maybe to give some background about Dawid's latest email:
Kurt raised some good points regarding the conversion of data types at
the boundaries
Hi Artsem,
having a catalog support for Confluent Schema Registry would be a great
addition. Although the implementation of FLIP-30 is still ongoing, we
merged the stable interfaces today [0]. This should unblock people from
contributing new catalog implementations. So you could already start
y contributor
could
just ask the committer (who merged those contributions) about
contributor
permissions.
Best,
Andrey
On Wed, Apr 10, 2019 at 3:58 AM Robert Metzger
wrote:
I'm +1 on option 1.
On Tue, Apr 9, 2019 at 1:58 AM Timo Walther
wrote:
Hi everyone,
I'd like to bring up this d
e wrote:
Hi,
I'm not sure about adding an additional stage.
Who's going to decide when to "promote" a user to a contributor, i.e.,
grant assigning permission?
Best, Fabian
On Thu, 14 Mar 2019 at 13:50, Timo Walther <twal...@apache.org> wrote:
Hi Robert,
I also like t
physical representation, I think we
should aim to introduce that and keep it separated.
Best,
Dawid
On 28/03/2019 08:51, Kurt Young wrote:
Big +1 to this! I left some comments in google doc.
Best,
Kurt
On Wed, Mar 27, 2019 at 11:32 PM Timo Walther wrote:
Hi everyone,
some of you might
Hi everyone,
some of you might have already read FLIP-32 [1] where we've described an
approximate roadmap of how to handle the big Blink SQL contribution and
how we can make the Table & SQL API equally important to the existing
DataStream API.
As mentioned there (Advance the API and Unblock
the string-based API in 1.9 or make the
decision
in 1.10 after some feedbacks ?
On Thu, 21 Mar 2019 at 21:32, Timo Walther wrote:
Thanks for your feedback Rong and Jark.
@Jark: Yes, you are right that the string-based API is used quite a
lot.
On the other side, the potential user base in
Hi everyone,
I also tried to summarize the previous discussion and would add an
additional `Ecosystem` component. I would suggest:
Table SQL / API
Table SQL / Client
Table SQL / Legacy Planner
Table SQL / Planner
Table SQL / Runtime
Table SQL / Ecosystem (such as table connectors, formats,
doc. and also some features that I
think will
be beneficial to the final outcome. Please kindly take a look @Timo.
Many thanks,
Rong
On Mon, Mar 18, 2019 at 7:15 AM Timo Walther <twal...@apache.org> wrote:
> Hi everyone,
>
> some of you might h
Hi everyone,
some of you might have already noticed the JIRA issue that I opened
recently [1] about introducing a proper Java expression DSL for the
Table API. Instead of using string-based expressions, we should aim for
a unified, maintainable, programmatic Java DSL.
Some background: The
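The contrast between string-based expressions and a programmatic DSL can be sketched in plain Java. This is only an illustration of the idea; all names here (`field`, `lit`, `plus`, `cast`) are assumptions for the sketch, not the actual Table API:

```java
import java.util.Arrays;
import java.util.stream.Collectors;

public class ExprDsl {

    // Single-method interface so fields and literals can be tiny lambdas.
    interface Expr {
        String render();

        default Expr plus(Expr other) { return ExprDsl.call("plus", this, other); }
        default Expr cast(String type) { return ExprDsl.call("cast", this, ExprDsl.lit(type)); }
    }

    static Expr field(String name) { return () -> name; }

    static Expr lit(Object value) { return () -> String.valueOf(value); }

    static Expr call(String op, Expr... args) {
        return () -> op + "(" + Arrays.stream(args)
                .map(Expr::render)
                .collect(Collectors.joining(", ")) + ")";
    }

    public static void main(String[] args) {
        // A string-based expression like "a.cast(INT) + 1" is opaque to the
        // compiler and the IDE; the DSL builds a typed expression tree instead.
        Expr e = field("a").cast("INT").plus(lit(1));
        System.out.println(e.render()); // renders the tree for demonstration
    }
}
```

The point is that typos and type errors surface at compile time and the IDE can auto-complete, which string parsing cannot offer.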
Timo Walther created FLINK-11921:
Summary: Upgrade Calcite dependency to 1.19
Key: FLINK-11921
URL: https://issues.apache.org/jira/browse/FLINK-11921
Project: Flink
Issue Type: Improvement
es more effort/participation from the committers' side. From my own side,
it's exciting to see our committers become more active :-)
Best,
tison.
On Wed, Feb 27, 2019 at 5:06 PM, Chesnay Schepler wrote:
We currently cannot change the JIRA permissions. Have you asked
INFRA
whether it is possible to setup a Flin
I just found https://issues.apache.org/jira/browse/FLINK-11901
According to Chesnay, this is a release blocker.
Regards,
Timo
On 13.03.19 at 09:48, Driesprong, Fokko wrote:
-1 (non-binding)
I'd like to see both get into Flink 1.8:
https://github.com/apache/flink/pull/7547
Timo Walther created FLINK-11901:
Summary: Update the year in NOTICE files
Key: FLINK-11901
URL: https://issues.apache.org/jira/browse/FLINK-11901
Project: Flink
Issue Type: Bug
Timo Walther created FLINK-11890:
Summary: Replace Table API string-based expressions by a Java DSL
Key: FLINK-11890
URL: https://issues.apache.org/jira/browse/FLINK-11890
Project: Flink
Hi,
welcome to the Flink community! You should already have contributor
permissions. Please also have a look at the contribution guidelines.
https://flink.apache.org/how-to-contribute.html
Thanks,
Timo
On 09.03.19 at 10:05, hdxg1101300123 wrote:
Hi Guys,
I want to contribute to Apache
Hi,
welcome to the Flink community! I gave you contributor permissions.
Please also have a look at the contribution guidelines.
https://flink.apache.org/how-to-contribute.html
Thanks,
Timo
On 11.03.19 at 07:25, chenkaibit wrote:
Hi:
I want to contribute to Apache Flink.
Would you please
Timo Walther created FLINK-11844:
Summary: Simplify OVER window Table API classes
Key: FLINK-11844
URL: https://issues.apache.org/jira/browse/FLINK-11844
Project: Flink
Issue Type
Hi,
yes I also fully agree that it is time to write down all these implicit
conventions that we've learned throughout the last years. The Flink
community is growing quite rapidly right now and we must ensure that the
same mistakes are not repeated.
Keeping the number of dependencies low is
Timo Walther created FLINK-11785:
Summary: Replace case class Null(type) in Table API
Key: FLINK-11785
URL: https://issues.apache.org/jira/browse/FLINK-11785
Project: Flink
Issue Type
Hi,
welcome to the Flink community! I gave you contributor permissions.
Please also have a look at the contribution guidelines.
https://flink.apache.org/how-to-contribute.html
Thanks,
Timo
On 28.02.19 at 08:27, hdxg1101300123 wrote:
Hi Guys,
I want to contribute to Apache Flink.
Would
Hi everyone,
as some of you might have noticed during the last weeks, the Flink
community grew quite a bit. A lot of people have applied for contributor
permissions and started working on issues, which is great for the growth
of Flink!
However, we've also observed that managing JIRA and
Timo Walther created FLINK-11728:
Summary: Deprecate CalciteConfig temporarily
Key: FLINK-11728
URL: https://issues.apache.org/jira/browse/FLINK-11728
Project: Flink
Issue Type: Improvement
Timo Walther created FLINK-11727:
Summary: JSON row format is not serializable
Key: FLINK-11727
URL: https://issues.apache.org/jira/browse/FLINK-11727
Project: Flink
Issue Type: Bug
Timo Walther created FLINK-11679:
Summary: Create Blink SQL planner and runtime modules
Key: FLINK-11679
URL: https://issues.apache.org/jira/browse/FLINK-11679
Project: Flink
Issue Type: New
+1
Thanks for the proposal.
Timo
On 20.02.19 at 10:53, Robert Metzger wrote:
Hey all,
As discussed in the last days, I'm proposing to clean up and reorganize our
JIRA tickets.
Chesnay proposed to approve the final proposal through a VOTE.
Here is the proposal:
"E2E tests".
Infrastructure is not about fixing failing tests, which is what we
partially used this component for so far.
I don't believe you can get rid of the generic "Tests" component;
consider any changes to the `flink-test-utils-junit` module.
You propose deleting "Core"
Hi,
welcome to the Flink community. I gave you contributor permissions.
Please also have a look at the contribution guidelines.
https://flink.apache.org/how-to-contribute.html
Thanks,
Timo
On 20.02.19 at 03:56, Yang Peng wrote:
Hi,
I want to contribute to Apache Flink.
Would you please
Hi,
welcome to the Flink community. I gave you contributor permissions.
Please also have a look at the contribution guidelines.
https://flink.apache.org/how-to-contribute.html
Thanks,
Timo
On 20.02.19 at 07:30, Tong wrote:
Hi,
I want to contribute to Apache Flink.
Would you please give
Hi Robert,
thanks for starting this discussion. I was also about to suggest
splitting the `Table API & SQL` component because it contains already
more than 1000 issues.
My comments:
- Rename "SQL/Shell" to "SQL/Client" because the long-term goal might
not only be a CLI interface. I would
Timo Walther created FLINK-11543:
Summary: Type mismatch AssertionError in FilterJoinRule
Key: FLINK-11543
URL: https://issues.apache.org/jira/browse/FLINK-11543
Project: Flink
Issue Type
Hi Ramya,
at first glance, your pom.xml looks correct. What are the contents of
META-INF/services/org.apache.flink.table.factories.TableFactory? Is the
JsonXXXFactory listed there?
If not, maybe a Maven service file transformer is missing to collect all
table factories into the service
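As a quick sanity check, the service entry inside a built jar can also be inspected programmatically. This sketch builds a dummy jar containing a hypothetical factory name (`org.example.JsonTableFactory`, an assumption for the demo) just to show the lookup; with a real job jar you would only call `readServiceFile` on it:

```java
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class ServiceFileCheck {

    // The service file Timo refers to; every factory must be listed in it,
    // one fully qualified class name per line.
    static final String SERVICE_PATH =
        "META-INF/services/org.apache.flink.table.factories.TableFactory";

    // Returns the service file's content from the jar, or null if missing
    // (in which case the factory cannot be discovered at runtime).
    static String readServiceFile(Path jar) throws Exception {
        try (ZipFile zip = new ZipFile(jar.toFile())) {
            ZipEntry entry = zip.getEntry(SERVICE_PATH);
            if (entry == null) {
                return null;
            }
            try (InputStream in = zip.getInputStream(entry)) {
                return new String(in.readAllBytes(), StandardCharsets.UTF_8);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a dummy jar with one service entry to demonstrate the check.
        Path jar = Files.createTempFile("job", ".jar");
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(jar))) {
            out.putNextEntry(new ZipEntry(SERVICE_PATH));
            out.write("org.example.JsonTableFactory\n".getBytes(StandardCharsets.UTF_8));
            out.closeEntry();
        }
        String content = readServiceFile(jar);
        System.out.println(content != null && content.contains("JsonTableFactory"));
    }
}
```

If the entry is missing after shading, the usual cause is exactly the missing service-file transformer mentioned above, which would otherwise merge the `META-INF/services` files of all bundled jars.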
Timo Walther created FLINK-11535:
Summary: SQL Client jar does not contain table-api-java
Key: FLINK-11535
URL: https://issues.apache.org/jira/browse/FLINK-11535
Project: Flink
Issue Type
RTRIM is not the only solution. It is also possible to cast all literals
of a CASE WHEN statement to VARCHAR, right?
WHEN aggrcat = '0' AND botcode='r4' THEN CAST('monitoring' AS VARCHAR)
The common datatype of all branches should then be VARCHAR.
Regards,
Timo
On 04.02.19 at 11:06
Timo Walther created FLINK-11493:
Summary: Finalize the Blink SQL merging efforts
Key: FLINK-11493
URL: https://issues.apache.org/jira/browse/FLINK-11493
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11492:
Summary: Support the full Blink SQL runtime
Key: FLINK-11492
URL: https://issues.apache.org/jira/browse/FLINK-11492
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11491:
Summary: Support all TPC-DS queries
Key: FLINK-11491
URL: https://issues.apache.org/jira/browse/FLINK-11491
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11490:
Summary: Add an initial batch runtime
Key: FLINK-11490
URL: https://issues.apache.org/jira/browse/FLINK-11490
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11489:
Summary: Add an initial Blink streaming runtime
Key: FLINK-11489
URL: https://issues.apache.org/jira/browse/FLINK-11489
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11488:
Summary: Merge a basic Blink planner framework
Key: FLINK-11488
URL: https://issues.apache.org/jira/browse/FLINK-11488
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11452:
Summary: Make the table planner pluggable
Key: FLINK-11452
URL: https://issues.apache.org/jira/browse/FLINK-11452
Project: Flink
Issue Type: Sub-task
Timo Walther created FLINK-11451:
Summary: Move *QueryConfig and TableDescriptor to
flink-table-api-java
Key: FLINK-11451
URL: https://issues.apache.org/jira/browse/FLINK-11451
Project: Flink
Timo Walther created FLINK-11450:
Summary: Port and move TableSource and TableSink to
flink-table-common
Key: FLINK-11450
URL: https://issues.apache.org/jira/browse/FLINK-11450
Project: Flink
Timo Walther created FLINK-11449:
Summary: Uncouple the Expression class from RexNodes
Key: FLINK-11449
URL: https://issues.apache.org/jira/browse/FLINK-11449
Project: Flink
Issue Type: New
Timo Walther created FLINK-11448:
Summary: Clean-up and prepare Table API to be uncoupled from table
core
Key: FLINK-11448
URL: https://issues.apache.org/jira/browse/FLINK-11448
Project: Flink
Timo Walther created FLINK-11447:
Summary: Deprecate "new Table(TableEnvironment, String)"
Key: FLINK-11447
URL: https://issues.apache.org/jira/browse/FLINK-11447
Project: Flink
Timo Walther created FLINK-11445:
Summary: Deprecate static methods in TableEnvironments
Key: FLINK-11445
URL: https://issues.apache.org/jira/browse/FLINK-11445
Project: Flink
Issue Type
Timo Walther created FLINK-11444:
Summary: Deprecate methods for uncoupling table API and table core
Key: FLINK-11444
URL: https://issues.apache.org/jira/browse/FLINK-11444
Project: Flink
to the JIRAs which will be opened!
Cheers,
Jincheng
On Thu, Jan 24, 2019 at 9:06 PM, Timo Walther wrote:
Hi everyone,
as Stephan already announced on the mailing list [1], the Flink
community will receive a big code contribution from Alibaba. The
flink-table module is one of the biggest parts that will receive
Timo Walther created FLINK-11439:
Summary: INSERT INTO flink_sql SELECT * FROM blink_sql
Key: FLINK-11439
URL: https://issues.apache.org/jira/browse/FLINK-11439
Project: Flink
Issue Type
Hi everyone,
as Stephan already announced on the mailing list [1], the Flink
community will receive a big code contribution from Alibaba. The
flink-table module is one of the biggest parts that will receive many
new features and major architectural improvements. Instead of waiting
until the
ts whether at this time we could even create a
source release of Blink given that we'd have to vet the code-base
first.
Even without source release we could still distribute jars, but would
not be allowed to advertise them to users as they do not constitute an
official release.
On 23.01.2019 11:41
Timo Walther wrote:
Hi Kurt,
I would not make the Blink's documentation visible to users or search
engines via a website. Otherwise this would communicate that Blink is an
official release. I would suggest to put the Blink docs into `/docs` and
people can build it with `./docs/build.sh -pi
Hi Kurt,
I would not make the Blink's documentation visible to users or search
engines via a website. Otherwise this would communicate that Blink is an
official release. I would suggest to put the Blink docs into `/docs` and
people can build it with `./docs/build.sh -pi` if there are
+1 for Stephan's suggestion. For example, SQL connectors have never been
part of the main distribution and nobody complained about this so far. I
think what is more important than a big dist bundle is a helpful
"Downloads" page where users can easily find available filesystems,
connectors,
Thanks for driving these efforts, Stephan! Great news that the Blink
code base will be available for everyone soon. I already got access to
it, and the added functionality and improved architecture are impressive.
There will be nice additions to Flink.
I guess the Blink code base will be
I totally agree with Chesnay here. A bot just treats the symptoms but
not the cause.
Maybe this needs no immediate action but we as committers should aim for
a more honest communication. A lot of PRs have a reason for being stale
but instead of communicating this reason we just don't touch
Hi,
this is a known problem that occurs if you have big expressions. For
example, a big CASE WHEN clause. Currently, we only split by field, not
within expressions. But this might be fixed soon as there is a PR
available [1].
As a workaround, use a UDF instead.
Regards,
Timo
[1]
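The kind of splitting discussed here can be illustrated in plain Java. The class and branch values below are hypothetical, and the assumption that the underlying limit is the JVM's 64 KB-per-method bytecode cap on generated code is mine, not stated in the thread:

```java
public class SplitEval {

    // Illustration only: one generated method holding a huge CASE WHEN can
    // grow too large to compile. Splitting the evaluation across helper
    // methods keeps each generated method small; a hand-written UDF achieves
    // the same effect as a workaround.

    static String evalBranchGroup1(String code) {
        if ("a".equals(code)) return "first";
        if ("b".equals(code)) return "second";
        return null; // fall through to the next group of branches
    }

    static String evalBranchGroup2(String code) {
        if ("c".equals(code)) return "third";
        return "default";
    }

    static String eval(String code) {
        String r = evalBranchGroup1(code);
        return r != null ? r : evalBranchGroup2(code);
    }

    public static void main(String[] args) {
        System.out.println(eval("a"));
        System.out.println(eval("c"));
        System.out.println(eval("z"));
    }
}
```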
Thanks for bringing up this discussion again. +1 for a bot solution.
However, we should discuss a good process for closing PRs.
In many cases, PRs are closed not because the contributor did not
respond but because no committer prioritizes the PR high enough. Or the
PR has issues that might
ive it now
before we convert it into a FLIP.
Thanks,
Timo
[1]
https://docs.google.com/document/d/1Y9it78yaUvbv4g572ZK_lZnZaAGjqwM_EhjdOv4yJtw/edit#
On 07.01.19 at 13:51, Timo Walther wrote:
Hi Eron,
thank you very much for the contributions. I merged the first little
bug fixes. For the remai
Timo Walther created FLINK-11273:
Summary: Queryable state (rocksdb) end-to-end test fails
Key: FLINK-11273
URL: https://issues.apache.org/jira/browse/FLINK-11273
Project: Flink
Issue Type
document/d/1Y9it78yaUvbv4g572ZK_lZnZaAGjqwM_EhjdOv4yJtw/edit#
On 07.01.19 at 13:51, Timo Walther wrote:
Hi Eron,
thank you very much for the contributions. I merged the first little
bug fixes. For the remaining PRs I think we can review and merge them
soon. As you said, the code is agnostic to the details of the
Exter
Hi Eron,
thank you very much for the contributions. I merged the first little bug
fixes. For the remaining PRs I think we can review and merge them soon.
As you said, the code is agnostic to the details of the ExternalCatalog
interface and I don't expect bigger merge conflicts in the near
and Stephan both mentioned that `common` would fit better in
our current naming scheme.
I will open a PR for FLIP-28 step 1 shortly and am looking forward to feedback.
Thanks,
Timo
On 11.12.18 at 09:10, Timo Walther wrote:
Hi Aljoscha,
thanks for your feedback. I also don't like the fact that an API
+1
- manually checked the commit diff and could not spot any issues
- run mvn clean verify locally with success
- run a couple of e2e tests locally with success
Thanks,
Timo
On 18.12.18 at 11:28, Chesnay Schepler wrote:
FLINK-10874 and FLINK-10987 were fixed for 1.7.0 .
I will remove
+1
- manually checked the commit diff and could not spot any issues
- run mvn clean verify locally with success
- run a couple of e2e tests locally with success
Thanks,
Timo
On 19.12.18 at 18:28, Aljoscha Krettek wrote:
+1
- signatures/hashes are ok
- verified that the log contains no
+1
- manually checked the commit diff and could not spot any issues
- run mvn clean verify locally with success
- run a couple of e2e tests locally with success
Thanks,
Timo
On 19.12.18 at 18:36, Aljoscha Krettek wrote:
+1
- signatures/hashes are ok
- manually checked the logs after
Timo Walther created FLINK-11200:
Summary: Port DataView classes to flink-table-common
Key: FLINK-11200
URL: https://issues.apache.org/jira/browse/FLINK-11200
Project: Flink
Issue Type: Sub
Timo Walther created FLINK-11184:
Summary: Rework TableSource and TableSink interfaces
Key: FLINK-11184
URL: https://issues.apache.org/jira/browse/FLINK-11184
Project: Flink
Issue Type: New
Hi everyone,
I just noticed that FLINK-9555 [1] has been accidentally merged to the
release-1.7 branch. How do we want to deal with that?
The Scala Shell is not a super crucial Flink feature. But this commit
does not only introduce a new feature and adds new dependencies but also
introduces a
Hi Tony,
I gave you contributor permissions. Please also have a look at our
contributor guidelines before you start working on issues:
https://flink.apache.org/contribute-code.html
Regards,
Timo
On 14.12.18 at 03:01, 宋辛童(五藏) wrote:
Hi there,
Could anyone kindly give me the contributor
Hi Shengyang,
I gave you contributor permissions. Please also have a look at our
contributor guidelines:
https://flink.apache.org/contribute-code.html
Regards,
Timo
On 14.12.18 at 11:47, Shengyang Sha wrote:
Hi,
I've been interested in flink for a long time and have worked on a few
hen we can either continue the discussion of the future
improvements
here,
or create separate JIRAs for each item and discuss further in the
JIRA.
What do you guys think?
Shuyi
On Fri, Dec 7, 2018 at 7:54 AM Timo Walther
wrote:
Hi all,
I think we are making good progress. Thanks for a
I suggest we first agree on the MVP feature list and the MVP grammar. And
then we can either continue the discussion of the future improvements
here,
or create separate JIRAs for each item and discuss further in the JIRA.
What do you guys think?
Shuyi
On Fri, Dec 7, 2018 at 7:54 AM Timo Wal
Hi Dian,
I proposed a solution that should be backwards compatible and solves our
Maven dependency problems in the corresponding issue.
I'm happy about feedback.
Regards,
Timo
On 11.12.18 at 11:23, fudian.fd wrote:
Hi Timo,
Thanks a lot for your reply. I think the cause to this problem
also have to achieve this for the streaming API.
Best,
Aljoscha
On 29. Nov 2018, at 16:58, Timo Walther wrote:
Thanks for the feedback, everyone!
I created a FLIP for these efforts:
https://cwiki.apache.org/confluence/display/FLINK/FLIP-28%3A+Long-term+goal+of+making+flink-table+Scala-free
I
Hi Jincheng,
thanks for the proposal. I totally agree with the problem of having 3
StreamTableEnvironments and 3 BatchTableEnvironments. We also identified
this problem when doing Flink trainings and introductions to the Table &
SQL API.
Actually, @Dawid and I were already discussing to
")
From my point of view, this DDL is invalid because the primary key
constraint already references two columns whose types have not been seen.
And Xuefu pointed out an important matching problem, so let's put schema
derivation aside as a follow-up extension?
On Thu, Dec 6, 2018 at 6:05 PM, Timo Walther wrote:
Hi,
welcome to the Flink community. If you give me your JIRA username, I can
give you contributor permissions.
Thanks,
Timo
On 06.12.18 at 12:12, shen lei wrote:
Hi All,
Could you give me permission to work on Flink's JIRA issues? I
am interested in Flink, and I want to find
I like your `contract name` proposal,
e.g., `WITH (format.type = avro)`, the framework can recognize some
`contract names` like `format.type`, `connector.type`, etc.
Also, deriving the table schema from an existing schema file can be handy,
especially for one with too many table columns.
Regards
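The "contract name" recognition could be sketched like this. The key names (`format.type`, `connector.type`) come from the mail; the factory names and the dispatch logic are hypothetical, not the actual Flink descriptor machinery:

```java
import java.util.HashMap;
import java.util.Map;

public class ContractKeys {

    // The framework recognizes well-known "contract" keys from the
    // WITH (...) clause and routes to a matching factory.
    static String pickFormatFactory(Map<String, String> props) {
        String type = props.get("format.type");
        if (type == null) {
            throw new IllegalArgumentException("missing contract key format.type");
        }
        switch (type) {
            case "avro": return "AvroFormatFactory"; // hypothetical names
            case "json": return "JsonFormatFactory";
            default: throw new IllegalArgumentException("unknown format: " + type);
        }
    }

    public static void main(String[] args) {
        // Property map as it would be parsed from WITH (format.type = avro, ...)
        Map<String, String> props = new HashMap<>();
        props.put("connector.type", "kafka");
        props.put("format.type", "avro");
        System.out.println(pickFormatFactory(props));
    }
}
```

The appeal of fixed contract keys is that connectors and formats can be matched independently of each other, each by its own key.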
n DDL
The main differences from two DDL docs (sth maybe missed, welcome to
point
out):
*(1.3) watermark*: this is the main and the most important
difference,
it
would be great if @Timo Walther @Fabian Hueske
give some feedbacks.
(1.1) Type definition:
(a) Should VARCHAR carry a len
Hi everyone,
thanks for starting the discussion. In general, I like the idea of
making Flink SQL queries more concise.
However, I don't like to diverge from standard SQL. So far, we managed
to add a lot of operators and functionality while being standard
compliant. Personally, I don't see a
Timo Walther created FLINK-11068:
Summary: Port Table class to Java
Key: FLINK-11068
URL: https://issues.apache.org/jira/browse/FLINK-11068
Project: Flink
Issue Type: New Feature
Timo Walther created FLINK-11067:
Summary: Port TableEnvironments to Java
Key: FLINK-11067
URL: https://issues.apache.org/jira/browse/FLINK-11067
Project: Flink
Issue Type: Sub-task