Thanks for bringing this up, Dian.
Since Python 2.7 support was added in 1.9.0 and Python 2 will reach EOL near the
planned release time for 1.10, I can see a good reason to take option 1.
Please remember to add an explicit release note; it would also be good to send
a notification to the user ML about the plan.
Hi Jincheng, Dian and Jeff,
Thank you for your replies and comments in the Google doc! I think we have come to
an agreement on the design doc with only minor changes as follows:
- Using the API `set_python_executable` instead of `set_environment_variable`
to set the Python executable path.
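To illustrate the difference between the two API styles mentioned above, here is a hedged sketch; the class, method bodies, and the `python.executable` config key are illustrative stand-ins, not the actual PyFlink signatures:

```python
# Illustrative sketch only: contrasts a generic key/value setter with the
# dedicated setter agreed on in the design doc. Not the real PyFlink API.
class TableConfig:
    def __init__(self):
        self._conf = {}

    # Generic variant considered earlier: the caller must know the key name.
    def set_environment_variable(self, key, value):
        self._conf[key] = value

    # Dedicated variant agreed on: explicit, discoverable, and typo-proof.
    def set_python_executable(self, path):
        self._conf["python.executable"] = path


config = TableConfig()
config.set_python_executable("/usr/bin/python3")
print(config._conf["python.executable"])  # /usr/bin/python3
```

The dedicated setter keeps the configuration key an internal detail, which is the main argument for it in the doc.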
Thanks Jincheng and Till, then let's keep on verifying the RC1.
Best,
Jark
On Wed, 9 Oct 2019 at 11:00, jincheng sun wrote:
> I think we should create a new RC only when we find blocker issues.
> We can look forward to the other check results; we can add the fix for
> FLINK-14315 into 1.9.1
+1
Hequn Cheng wrote on Wed, Oct 9, 2019 at 11:07 AM:
> Hi Dian,
>
> +1 to drop Python 2 directly.
>
> Just as @jincheng said, things would be more complicated if we are going to
> support Python UDFs.
> The Python UDFs will introduce a lot of Python dependencies which are also
> dropping the support of Python
Caizhi Weng created FLINK-14348:
---
Summary: YarnFileStageTestS3ITCase.testRecursiveUploadForYarnS3a
fails to delete files
Key: FLINK-14348
URL: https://issues.apache.org/jira/browse/FLINK-14348
Caizhi Weng created FLINK-14347:
---
Summary: YARNSessionFIFOITCase.checkForProhibitedLogContents found
a log with prohibited string
Key: FLINK-14347
URL: https://issues.apache.org/jira/browse/FLINK-14347
Hi Dian,
+1 to drop Python 2 directly.
Just as @jincheng said, things would be more complicated if we are going to
support Python UDFs.
The Python UDFs will introduce a lot of Python dependencies which are also
dropping the support of Python 2, such as Beam, pandas, pyarrow, etc.
Given this and
I think we should create a new RC only when we find blocker issues.
We can look forward to the other check results; we can add the fix for
FLINK-14315 into 1.9.1 only if we find blockers.
Best,
Jincheng
Till Rohrmann wrote on Tue, Oct 8, 2019 at 8:20 PM:
> FLINK-14315 has been merged into the release-1.9
Thanks Timo for your very nice proposal, big +1 to the FLIP. I left some
minor comments.
One minor concern about the flink-planner: precision handling may not be
supportable there.
Best,
Jingsong Lee
On Tue, Oct 8, 2019 at 5:58 PM zha...@lenovocloud.com <
zha...@lenovocloud.com> wrote:
> unsubscribe
>
>
Hi, Flink Team,
Based on the discussion, I assume we now agree on running a
cron job for ARM at this moment. I have run the POC e2e tests in OpenLab for
some days [1]. It includes:
flink-end-to-end-test-part1
split_checkpoints.sh and split_sticky.sh
flink-end-to-end-test-part2
Hi Dian,
Thanks for bringing this discussion!
In Flink 1.9 we only added the Python Table API mapping to the Java Table API
(without Python UDFs), so there were no special requirements on the Python
version and we added Python 2.7 support. But for Flink 1.10 we are adding
Python UDF support, i.e., users will add more
+1
On Tue, Oct 8, 2019 at 7:00 AM Aljoscha Krettek wrote:
> +1
>
> > On 8. Oct 2019, at 15:35, Timo Walther wrote:
> >
> > +1
> >
> > Thanks for driving these efforts,
> > Timo
> >
> > On 07.10.19 10:10, Dawid Wysakowicz wrote:
> >> +1 for the FLIP.
> >>
> >> Best,
> >>
> >> Dawid
> >>
> >> On
Hi everyone,
I would like to propose dropping Python 2 support (currently Python 2.7, 3.5,
3.6, and 3.7 are all supported in Flink) as it reaches end of life on Jan 1, 2020
[1]. Many projects [2][3][4] have already dropped, or are planning to drop,
Python 2 support.
The benefits of dropping Python 2
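One concrete benefit of such a drop is being able to fail fast on an unsupported interpreter. A minimal sketch of the kind of runtime guard projects typically add when removing Python 2 support (illustrative only, not Flink's actual code):

```python
import sys

# Fail fast on an unsupported interpreter with a clear message,
# instead of producing obscure syntax errors deeper in the code base.
if sys.version_info < (3, 5):
    raise RuntimeError(
        "Python 2 is no longer supported; please use Python 3.5 or newer")

print("running on Python %d.%d" % sys.version_info[:2])
```

Packaging metadata (e.g. `python_requires`) can enforce the same constraint at install time.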
+1
> On 8. Oct 2019, at 15:35, Timo Walther wrote:
>
> +1
>
> Thanks for driving these efforts,
> Timo
>
> On 07.10.19 10:10, Dawid Wysakowicz wrote:
>> +1 for the FLIP.
>>
>> Best,
>>
>> Dawid
>>
>> On 07/10/2019 08:45, Bowen Li wrote:
>>> Hi all,
>>>
>>> I'd like to start a new voting
+1
Thanks for driving these efforts,
Timo
On 07.10.19 10:10, Dawid Wysakowicz wrote:
+1 for the FLIP.
Best,
Dawid
On 07/10/2019 08:45, Bowen Li wrote:
Hi all,
I'd like to start a new voting thread for FLIP-57 [1] on its latest status
despite [2], and we've reached consensus in [2] and
FLINK-14315 has been merged into the release-1.9 branch. I've marked the
fix version of this ticket as 1.9.2. If we should create a new RC, then we
could include this fix. If this happens, then we need to update the fix
version to 1.9.1.
Cheers,
Till
On Tue, Oct 8, 2019 at 1:51 PM Till Rohrmann
Roman Grebennikov created FLINK-14346:
-
Summary: Performance issue with StringSerializer
Key: FLINK-14346
URL: https://issues.apache.org/jira/browse/FLINK-14346
Project: Flink
Issue
If people have already spent time verifying the current RC, I would also be
fine with releasing the fix for FLINK-14315 with Flink 1.9.2.
I will try to merge the PR as soon as possible. When I close the ticket, I
will update the fix version field to 1.9.2.
Cheers,
Till
On Tue, Oct 8, 2019 at 4:43 AM
unsubscribe
zha...@lenovocloud.com
From: Jark Wu
Date: 2019-10-08 17:29
To: dev
Subject: Re: [DISCUSS] FLIP-65: New type inference for Table API UDFs
Hi Timo,
Thanks for the proposal, a big +1 to the FLIP, especially this enables the
unified `TableEnvironment.registerFunction()`.
I think the design documentation is good enough; I only left some
minor comments there.
Best,
Jark
On Fri, 4 Oct 2019 at 23:54, Timo Walther wrote:
> Hi
Chesnay Schepler created FLINK-14345:
Summary: Snapshot deployments may fail due to MapR HTTPS issue
Key: FLINK-14345
URL: https://issues.apache.org/jira/browse/FLINK-14345
Project: Flink
Biao Liu created FLINK-14344:
Summary: Snapshot master hook state asynchronously
Key: FLINK-14344
URL: https://issues.apache.org/jira/browse/FLINK-14344
Project: Flink
Issue Type: Sub-task
Zili Chen created FLINK-14343:
-
Summary: Remove uncompleted YARNHighAvailabilityService
Key: FLINK-14343
URL: https://issues.apache.org/jira/browse/FLINK-14343
Project: Flink
Issue Type: Task