Re: [DISCUSS] Releasing Flink 1.7.2

2019-02-08 Thread Ufuk Celebi
+1

@Gordon: There has been no further review of the PR that Shuyi linked
(https://github.com/apache/flink/pull/7356). Do you plan to block
1.7.2 on this, or not?

– Ufuk

On Fri, Feb 8, 2019 at 10:35 PM Thomas Weise  wrote:
>
> +1
>
>
> On Fri, Feb 8, 2019 at 6:14 AM jincheng sun 
> wrote:
>
> > +1 for starting Flink 1.7.2 release next week.
> >
> > Cheers,
> > Jincheng
> >
> > On Tue, Feb 5, 2019 at 11:32 PM, Tzu-Li (Gordon) Tai wrote:
> >
> > > Hi Flink devs,
> > >
> > > What do you think about releasing Flink 1.7.2 soon?
> > >
> > > We already have some critical fixes in the release-1.7 branch:
> > > - FLINK-11207: security vulnerability with currently used Apache
> > > commons-compress version
> > > - FLINK-11419: restore issue with StreamingFileSink
> > > - FLINK-11436: restore issue with Flink's AvroSerializer
> > > - FLINK-10761: potential deadlock with metrics system
> > > - FLINK-10774: connection leak in FlinkKafkaConsumer
> > > - FLINK-10848: problem with resource allocation in YARN mode
> > >
> > > Please let me know what you think. Ideally, we can kick off the release
> > > vote for the first RC early next week.
> > > If there are some other critical fixes for 1.7.2 that are almost
> > completed
> > > (already have a PR opened and review is in progress), please let me know
> > > here by the end of the week to account for it for the 1.7.2 release.
> > >
> > > Cheers,
> > > Gordon
> > >
> >


[jira] [Created] (FLINK-11569) Row type does not serialize into a readable format when invoking "toString" method

2019-02-08 Thread Rong Rong (JIRA)
Rong Rong created FLINK-11569:
-

 Summary: Row type does not serialize into a readable format when 
invoking "toString" method
 Key: FLINK-11569
 URL: https://issues.apache.org/jira/browse/FLINK-11569
 Project: Flink
  Issue Type: Bug
  Components: Type Serialization System
Reporter: Rong Rong
Assignee: Rong Rong


It seems the "toString" method of the Row type only concatenates all fields 
using COMMA ",". It does not wrap the entire Row in any kind of enclosure, 
for example "()". As a result, nested Rows are serialized as if all their 
fields were on one level.

For example: {{Row.of("a", 1, Row.of("b", 2))}} is printed out as 
{{"a",1,"b",2}}. 
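The flattening can be reproduced with a minimal, self-contained sketch (plain Java, no Flink dependency; "FlatRow" and "WrappedRow" are hypothetical stand-ins for the current behavior and one possible fixed behavior of Row#toString, not the actual Flink classes):

```java
import java.util.StringJoiner;

/** Stand-in for the current Row#toString: fields joined by "," with no wrapping. */
class FlatRow {
    final Object[] fields;

    FlatRow(Object... fields) { this.fields = fields; }

    @Override
    public String toString() {
        StringJoiner sj = new StringJoiner(",");
        for (Object f : fields) {
            sj.add(String.valueOf(f)); // a nested row flattens into the same level
        }
        return sj.toString();
    }
}

/** Stand-in for the proposed fix: wrap each row in "(...)" so nesting stays visible. */
class WrappedRow {
    final Object[] fields;

    WrappedRow(Object... fields) { this.fields = fields; }

    @Override
    public String toString() {
        StringJoiner sj = new StringJoiner(",", "(", ")");
        for (Object f : fields) {
            sj.add(String.valueOf(f));
        }
        return sj.toString();
    }
}

class RowToStringDemo {
    public static void main(String[] args) {
        // Current behavior: nesting is lost.
        System.out.println(new FlatRow("a", 1, new FlatRow("b", 2)));       // a,1,b,2
        // Wrapped behavior: nesting is preserved.
        System.out.println(new WrappedRow("a", 1, new WrappedRow("b", 2))); // (a,1,(b,2))
    }
}
```

Note the JIRA example shows quotes around the string fields for readability; a plain String.valueOf-based join produces the unquoted form above.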



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: [DISCUSS] Releasing Flink 1.7.2

2019-02-08 Thread Thomas Weise
+1


On Fri, Feb 8, 2019 at 6:14 AM jincheng sun 
wrote:

> +1 for starting Flink 1.7.2 release next week.
>
> Cheers,
> Jincheng
>
> On Tue, Feb 5, 2019 at 11:32 PM, Tzu-Li (Gordon) Tai wrote:
>
> > Hi Flink devs,
> >
> > What do you think about releasing Flink 1.7.2 soon?
> >
> > We already have some critical fixes in the release-1.7 branch:
> > - FLINK-11207: security vulnerability with currently used Apache
> > commons-compress version
> > - FLINK-11419: restore issue with StreamingFileSink
> > - FLINK-11436: restore issue with Flink's AvroSerializer
> > - FLINK-10761: potential deadlock with metrics system
> > - FLINK-10774: connection leak in FlinkKafkaConsumer
> > - FLINK-10848: problem with resource allocation in YARN mode
> >
> > Please let me know what you think. Ideally, we can kick off the release
> > vote for the first RC early next week.
> > If there are some other critical fixes for 1.7.2 that are almost
> completed
> > (already have a PR opened and review is in progress), please let me know
> > here by the end of the week to account for it for the 1.7.2 release.
> >
> > Cheers,
> > Gordon
> >
>


[jira] [Created] (FLINK-11568) Exception in Kinesis ShardConsumer hidden by InterruptedException

2019-02-08 Thread Shannon Carey (JIRA)
Shannon Carey created FLINK-11568:
-

 Summary: Exception in Kinesis ShardConsumer hidden by 
InterruptedException 
 Key: FLINK-11568
 URL: https://issues.apache.org/jira/browse/FLINK-11568
 Project: Flink
  Issue Type: Improvement
  Components: Kinesis Connector
Affects Versions: 1.6.2
Reporter: Shannon Carey
Assignee: Shannon Carey


When the Kinesis ShardConsumer encounters an exception, for example due to a 
problem in the Deserializer, the root cause exception is often hidden by a 
non-informative InterruptedException caused by the FlinkKinesisConsumer thread 
being interrupted.

Ideally, the root cause exception would be preserved and thrown so that the 
logs contain enough information to diagnose the issue.

This probably affects all versions.

Here's an example of a log message with the unhelpful InterruptedException:
{code:java}
2019-02-05 13:29:31:383 thread=Source: Custom Source -> Filter -> Map -> Sink: 
Unnamed (1/8), level=WARN, 
logger=org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer, 
message="Error while closing Kinesis data fetcher"
java.lang.InterruptedException: sleep interrupted
at java.lang.Thread.sleep(Native Method)
at 
org.apache.flink.streaming.connectors.kinesis.internals.KinesisDataFetcher.awaitTermination(KinesisDataFetcher.java:450)
at 
org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.cancel(FlinkKinesisConsumer.java:314)
at 
org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer.close(FlinkKinesisConsumer.java:323)
at 
org.apache.flink.api.common.functions.util.FunctionUtils.closeFunction(FunctionUtils.java:43)
at 
org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.dispose(AbstractUdfStreamOperator.java:117)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.disposeAllOperators(StreamTask.java:477)
at 
org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:378)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711)
at java.lang.Thread.run(Thread.java:745)
{code}
And here's an example of the real exception that we're actually interested in, 
which is stored inside KinesisDataFetcher#error, but is not thrown or logged:
{code:java}
org.apache.avro.io.parsing.Symbol$Alternative.getSymbol(Symbol.java:416)
org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:290)
org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:267)
org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:178)
org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152)
org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:240)
org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:230)
org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:174)
org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152)
org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:144)
org.apache.flink.formats.avro.AvroDeserializationSchema.deserialize(AvroDeserializationSchema.java:135)
org.apache.flink.streaming.connectors.kinesis.serialization.KinesisDeserializationSchemaWrapper.deserialize(KinesisDeserializationSchemaWrapper.java:44)
org.apache.flink.streaming.connectors.kinesis.internals.ShardConsumer.deserializeRecordForCollectionAndUpdateState(ShardConsumer.java:332)
org.apache.flink.streaming.connectors.kinesis.internals.ShardConsumer.run(ShardConsumer.java:231)
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
java.util.concurrent.FutureTask.run(FutureTask.java)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
{code}
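One hedged sketch of how the root cause could be kept visible (plain Java, no Flink dependency; the "error" field stands in for KinesisDataFetcher#error and "awaitTermination" for the real await loop — the actual fix in Flink may look different):

```java
import java.util.concurrent.atomic.AtomicReference;

/** Sketch: surface the stored consumer failure instead of a bare InterruptedException. */
class FetcherShutdownSketch {
    // Stands in for KinesisDataFetcher#error: first failure reported by a shard consumer.
    private final AtomicReference<Throwable> error = new AtomicReference<>();

    void reportError(Throwable t) {
        error.compareAndSet(null, t); // remember only the first root cause
    }

    void awaitTermination() throws Exception {
        try {
            Thread.sleep(10); // stands in for the real termination wait
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt(); // preserve the interrupt status
            Throwable rootCause = error.get();
            if (rootCause != null) {
                // Keep the interrupt on record, but let the real failure lead.
                rootCause.addSuppressed(ie);
                throw new RuntimeException("Shard consumer failed", rootCause);
            }
            throw ie; // no stored failure: the interrupt really is the whole story
        }
    }

    public static void main(String[] args) {
        FetcherShutdownSketch f = new FetcherShutdownSketch();
        f.reportError(new IllegalStateException("bad Avro record"));
        Thread.currentThread().interrupt(); // simulate cancel()/close() interrupting us
        try {
            f.awaitTermination();
        } catch (Exception e) {
            // The log now carries the deserialization failure as the cause,
            // with the InterruptedException demoted to a suppressed entry.
            System.out.println(e + " caused by " + e.getCause());
        }
        Thread.interrupted(); // clear the flag before exiting
    }
}
```

With this pattern, the warning shown in the first log excerpt would include the Avro trace from the second excerpt as its cause rather than dropping it.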



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11564) Translate "How To Contribute" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11564:
---

 Summary: Translate "How To Contribute" page into Chinese
 Key: FLINK-11564
 URL: https://issues.apache.org/jira/browse/FLINK-11564
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu



Translate "How To Contribute" page into Chinese.

The markdown file is located in: flink-web/how-to-contribute.zh.md
The url link is: https://flink.apache.org/zh/how-to-contribute.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11567) Translate "How to Review a Pull Request" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11567:
---

 Summary: Translate "How to Review a Pull Request" page into Chinese
 Key: FLINK-11567
 URL: https://issues.apache.org/jira/browse/FLINK-11567
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "How to Review a Pull Request" page into Chinese.

The markdown file is located in: flink-web/reviewing-prs.zh.md
The url link is: https://flink.apache.org/zh/reviewing-prs.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11565) Translate "Improving the Website" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11565:
---

 Summary: Translate "Improving the Website" page into Chinese
 Key: FLINK-11565
 URL: https://issues.apache.org/jira/browse/FLINK-11565
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Improving the Website" page into Chinese.

The markdown file is located in: flink-web/improve-website.zh.md
The url link is: https://flink.apache.org/zh/improve-website.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11566) Translate "Powered by Flink" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11566:
---

 Summary: Translate "Powered by Flink" page into Chinese
 Key: FLINK-11566
 URL: https://issues.apache.org/jira/browse/FLINK-11566
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Powered by Flink" page into Chinese.

The markdown file is located in: flink-web/poweredby.zh.md
The url link is: https://flink.apache.org/zh/poweredby.html

Please adjust the links in the page to Chinese pages when translating. 









--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11563) Translate "Getting Help" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11563:
---

 Summary: Translate "Getting Help" page into Chinese
 Key: FLINK-11563
 URL: https://issues.apache.org/jira/browse/FLINK-11563
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Getting Help" page into Chinese.

The markdown file is located in: flink-web/gettinghelp.zh.md
The url link is: https://flink.apache.org/zh/gettinghelp.html

Please adjust the links in the page to Chinese pages when translating. 









--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11562) Translate "Flink Operations" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11562:
---

 Summary: Translate "Flink Operations" page into Chinese
 Key: FLINK-11562
 URL: https://issues.apache.org/jira/browse/FLINK-11562
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Flink Operations" page into Chinese.

The markdown file is located in: flink-web/flink-operations.zh.md
The url link is: https://flink.apache.org/zh/flink-operations.html

Please adjust the links in the page to Chinese pages when translating. 









--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11561) Translate "Flink Architecture" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11561:
---

 Summary: Translate "Flink Architecture" page into Chinese
 Key: FLINK-11561
 URL: https://issues.apache.org/jira/browse/FLINK-11561
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Flink Architecture" page into Chinese.

The markdown file is located in: flink-web/flink-architecture.zh.md
The url link is: https://flink.apache.org/zh/flink-architecture.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11557) Translate "Downloads" page into Chinese.

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11557:
---

 Summary: Translate "Downloads" page into Chinese.
 Key: FLINK-11557
 URL: https://issues.apache.org/jira/browse/FLINK-11557
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Downloads" page into Chinese.

The markdown file is located in: flink-web/downloads.zh.md
The url link is: https://flink.apache.org/zh/downloads.html

Please adjust the links in the page to Chinese pages when translating. 







--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11560) Translate "Flink Applications" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11560:
---

 Summary: Translate "Flink Applications" page into Chinese
 Key: FLINK-11560
 URL: https://issues.apache.org/jira/browse/FLINK-11560
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Flink Applications" page into Chinese.

The markdown file is located in: flink-web/flink-applications.zh.md
The url link is: https://flink.apache.org/zh/flink-applications.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11559) Translate "FAQ" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11559:
---

 Summary: Translate "FAQ" page into Chinese
 Key: FLINK-11559
 URL: https://issues.apache.org/jira/browse/FLINK-11559
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "FAQ" page into Chinese

The markdown file is located in: flink-web/faq.zh.md
The url link is: https://flink.apache.org/zh/faq.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11554) Translate the "Community & Project Info" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11554:
---

 Summary: Translate the "Community & Project Info" page into Chinese
 Key: FLINK-11554
 URL: https://issues.apache.org/jira/browse/FLINK-11554
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Community & Project Info" page into Chinese.

The markdown file is located in: flink-web/community.zh.md
The url link is: https://flink.apache.org/zh/community.html

Please adjust the links in the page to Chinese pages when translating. 







--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11558) Translate "Ecosystem" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11558:
---

 Summary: Translate "Ecosystem" page into Chinese
 Key: FLINK-11558
 URL: https://issues.apache.org/jira/browse/FLINK-11558
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Ecosystem" page into Chinese.

The markdown file is located in: flink-web/ecosystem.zh.md
The url link is: https://flink.apache.org/zh/ecosystem.html

Please adjust the links in the page to Chinese pages when translating. 









--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11556) Translate "Contribute Documentation" page into Chinese.

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11556:
---

 Summary: Translate "Contribute Documentation" page into Chinese.
 Key: FLINK-11556
 URL: https://issues.apache.org/jira/browse/FLINK-11556
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Contribute Documentation" page into Chinese.

The markdown file is located in: flink-web/contribute-documentation.zh.md
The url link is: https://flink.apache.org/zh/contribute-documentation.html

Please adjust the links in the page to Chinese pages when translating. 









--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


[jira] [Created] (FLINK-11555) Translate "Contributing Code" page into Chinese

2019-02-08 Thread Jark Wu (JIRA)
Jark Wu created FLINK-11555:
---

 Summary: Translate "Contributing Code" page into Chinese
 Key: FLINK-11555
 URL: https://issues.apache.org/jira/browse/FLINK-11555
 Project: Flink
  Issue Type: Sub-task
  Components: chinese-translation, Project Website
Reporter: Jark Wu


Translate "Contributing Code" page into Chinese.

The markdown file is located in: flink-web/contribute-code.zh.md
The url link is: https://flink.apache.org/zh/contribute-code.html

Please adjust the links in the page to Chinese pages when translating. 








--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: [VOTE] Release flink-shaded 6.0, release candidate 1

2019-02-08 Thread jincheng sun
+1 (non-binding)

-  Build from source archive succeeded.

One minor question: why was the `release-` prefix added to the tag this time?
(Not blocking the release.) :)

Cheers,
Jincheng

On Fri, Feb 8, 2019 at 10:13 PM, Till Rohrmann wrote:

> +1 (binding)
>
> - built from source archive
> - verified pom changes between 5.0 and 6.0
>
> One minor note, the release tag seems to be different compared to the
> previous tags.
>
> Cheers,
> Till
>
> On Fri, Feb 8, 2019 at 2:30 PM Fabian Hueske  wrote:
>
> > Thank you Chesnay!
> >
> > * Checked the changes since the 5.0 release and didn't find anything
> > surprising.
> > * Signatures and hash are correct
> > * build from source archive passed
> >
> > +1 (binding)
> >
> > Best, Fabian
> >
> > On Thu, Feb 7, 2019 at 9:48 AM, Nico Kruber <n...@da-platform.com> wrote:
> >
> > > +1,
> > >
> > > - Checked signature and hashes
> > > - Built locally from RC tgz
> > > - verified new Netty version working using the maven repo
> > > -> https://travis-ci.org/apache/flink/builds/489628017
> > >
> > > On 06/02/2019 13:59, Chesnay Schepler wrote:
> > > > Hi everyone,
> > > > Please review and vote on the release candidate #1 for the version
> 6.0,
> > > > as follows:
> > > > [ ] +1, Approve the release
> > > > [ ] -1, Do not approve the release (please provide specific comments)
> > > >
> > > >
> > > > The complete staging area is available for your review, which
> includes:
> > > > * JIRA release notes [1],
> > > > * the official Apache source release to be deployed to
> dist.apache.org
> > > > [2], which are signed with the key with fingerprint 11D464BA [3],
> > > > * all artifacts to be deployed to the Maven Central Repository [4],
> > > > * source code tag "release-6.0-rc1" [5].
> > > > * website pull request listing the new release  [6].
> > > >
> > > > The vote will be open for at least 72 hours. It is adopted by
> majority
> > > > approval, with at least 3 PMC affirmative votes.
> > > >
> > > > Thanks,
> > > > Chesnay
> > > >
> > > > [1]
> > > >
> > >
> >
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12344544
> > > >
> > > > [2]
> https://dist.apache.org/repos/dist/dev/flink/flink-shaded-6.0-rc1/
> > > > [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> > > > [4]
> > > https://repository.apache.org/content/repositories/orgapacheflink-1204
> > > > [5]
> > > >
> > >
> >
> https://gitbox.apache.org/repos/asf?p=flink-shaded.git;a=tag;h=refs/tags/release-6.0-rc1
> > > >
> > > > [6] https://github.com/apache/flink-web/pull/154
> > > >
> > >
> > > --
> > > Nico Kruber | Solutions Architect
> > > --
> > > Join Flink Forward - The Apache Flink Conference
> > > Stream Processing | Event Driven | Real Time
> > >
> > > --
> > > Data Artisans GmbH | Stresemannstr. 121A,10963 Berlin, Germany
> > > data Artisans, Inc. | 1161 Mission Street, San Francisco, CA-94103, USA
> > > --
> > > Data Artisans GmbH
> > > Registered at Amtsgericht Charlottenburg: HRB 158244 B
> > > Managing Directors: Dr. Kostas Tzoumas, Dr. Stephan Ewen
> > >
> > >
> >
>
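The "checked signature and hashes" step mentioned in the votes above usually boils down to `gpg --verify` plus a checksum comparison. Below is a minimal sketch of the checksum half, demonstrated on a local stand-in file (for a real RC you would first download the artifact and its `.sha512`/`.asc` companions from the staging directory linked in the vote mail, and `gpg --import` the KEYS file before verifying the signature):

```shell
#!/bin/sh
set -eu

# Stand-in for a downloaded release artifact; a real check would fetch it from
# https://dist.apache.org/repos/dist/dev/flink/flink-shaded-6.0-rc1/
printf 'release payload\n' > artifact.tgz

# Release candidates ship a .sha512 file next to each artifact; recreate that layout.
sha512sum artifact.tgz > artifact.tgz.sha512

# The actual verification: -c recomputes the hash and compares it to the file.
sha512sum -c artifact.tgz.sha512

# Signature check (requires the downloaded .asc and the imported KEYS file):
#   gpg --import KEYS
#   gpg --verify artifact.tgz.asc artifact.tgz
```

`sha512sum` is GNU coreutils; on other platforms `shasum -a 512` is the usual equivalent.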


Re: Flink release schedule

2019-02-08 Thread Ufuk Celebi
I like the idea of having a roadmap, but I think that's a separate
discussion. I'd be happy to chime in on a separate discuss thread if you
want to drive this, Fabian/Robert. Now that I'm re-reading my initial
email, I actually think it sounded a lot like a proposal to add more
information to the website. Sorry about that. My main question was
actually the following:

- What is the planned feature freeze date (and planned release date)
for the 1.8 release?

Best,

Ufuk

On Fri, Feb 8, 2019 at 2:38 PM Fabian Hueske  wrote:
>
> I like the idea of a roadmap that is kept up-to-date.
> We've tried that in the past a couple of times but they became outdated 
> rather quickly.
> We've also had some threads on the dev mailing list to discuss the scope & 
> features of next releases.
>
> We could discuss and update the roadmap after every release (or even feature 
> freeze).
> IMO, it would be good to have two things in the document.
>
> * a long-term vision (~1 year) of where the project is heading
> * a roadmap for the next release (~3 months)
>
> Cheers, Fabian
>
> On Tue, Feb 5, 2019 at 4:55 PM, Robert Metzger wrote:
>>
>> Hey Ufuk,
>>
>> I agree! We should have regular discussions here on the list for scoping
>> releases, and then put this to the website.
>> The Beam project has a pretty nice roadmap [1] available on their site.
>> I actually don't really know if there's an agreed upon release schedule in
>> the Flink community (I've not been following the dev@ list closely
>> recently).
>> I'll try to find somebody who knows a bit more about the recent Flink
>> community developments to put together a first draft for such a roadmap.
>>
>> [1] https://beam.apache.org/roadmap/
>>
>> On Mon, Feb 4, 2019 at 9:29 AM Ufuk Celebi  wrote:
>>
>> > Hey devs,
>> >
>> > I've been only following Flink dev loosely in the last couple of
>> > months and could not find the following information:
>> >
>> > What's the schedule for the next release? In particular, when is the
>> > planned feature freeze and when is the planned release date?
>> >
>> > I think this can be valuable information for both devs and users. What
>> > do you think about adding this information to the Flink website? It
>> > does not have to be exact dates, but general information such as "we
>> > do time based releases every X months", "we support the last two minor
>> > releases with patches", etc. I think this page [1] would be a good
>> > target for this information.
>> >
>> > Best,
>> >
>> > Ufuk
>> >
>> >
>> > [1] https://flink.apache.org/community.html
>> >


Re: [DISCUSS] Releasing Flink 1.7.2

2019-02-08 Thread jincheng sun
+1 for starting Flink 1.7.2 release next week.

Cheers,
Jincheng

On Tue, Feb 5, 2019 at 11:32 PM, Tzu-Li (Gordon) Tai wrote:

> Hi Flink devs,
>
> What do you think about releasing Flink 1.7.2 soon?
>
> We already have some critical fixes in the release-1.7 branch:
> - FLINK-11207: security vulnerability with currently used Apache
> commons-compress version
> - FLINK-11419: restore issue with StreamingFileSink
> - FLINK-11436: restore issue with Flink's AvroSerializer
> - FLINK-10761: potential deadlock with metrics system
> - FLINK-10774: connection leak in FlinkKafkaConsumer
> - FLINK-10848: problem with resource allocation in YARN mode
>
> Please let me know what you think. Ideally, we can kick off the release
> vote for the first RC early next week.
> If there are some other critical fixes for 1.7.2 that are almost completed
> (already have a PR opened and review is in progress), please let me know
> here by the end of the week to account for it for the 1.7.2 release.
>
> Cheers,
> Gordon
>


Re: [VOTE] Release flink-shaded 6.0, release candidate 1

2019-02-08 Thread Till Rohrmann
+1 (binding)

- built from source archive
- verified pom changes between 5.0 and 6.0

One minor note, the release tag seems to be different compared to the
previous tags.

Cheers,
Till

On Fri, Feb 8, 2019 at 2:30 PM Fabian Hueske  wrote:

> Thank you Chesnay!
>
> * Checked the changes since the 5.0 release and didn't find anything
> surprising.
> * Signatures and hash are correct
> * build from source archive passed
>
> +1 (binding)
>
> Best, Fabian
>
> On Thu, Feb 7, 2019 at 9:48 AM, Nico Kruber <n...@da-platform.com> wrote:
>
> > +1,
> >
> > - Checked signature and hashes
> > - Built locally from RC tgz
> > - verified new Netty version working using the maven repo
> > -> https://travis-ci.org/apache/flink/builds/489628017
> >
> > On 06/02/2019 13:59, Chesnay Schepler wrote:
> > > Hi everyone,
> > > Please review and vote on the release candidate #1 for the version 6.0,
> > > as follows:
> > > [ ] +1, Approve the release
> > > [ ] -1, Do not approve the release (please provide specific comments)
> > >
> > >
> > > The complete staging area is available for your review, which includes:
> > > * JIRA release notes [1],
> > > * the official Apache source release to be deployed to dist.apache.org
> > > [2], which are signed with the key with fingerprint 11D464BA [3],
> > > * all artifacts to be deployed to the Maven Central Repository [4],
> > > * source code tag "release-6.0-rc1" [5].
> > > * website pull request listing the new release  [6].
> > >
> > > The vote will be open for at least 72 hours. It is adopted by majority
> > > approval, with at least 3 PMC affirmative votes.
> > >
> > > Thanks,
> > > Chesnay
> > >
> > > [1]
> > >
> >
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12344544
> > >
> > > [2] https://dist.apache.org/repos/dist/dev/flink/flink-shaded-6.0-rc1/
> > > [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> > > [4]
> > https://repository.apache.org/content/repositories/orgapacheflink-1204
> > > [5]
> > >
> >
> https://gitbox.apache.org/repos/asf?p=flink-shaded.git;a=tag;h=refs/tags/release-6.0-rc1
> > >
> > > [6] https://github.com/apache/flink-web/pull/154
> > >
> >
> > --
> > Nico Kruber | Solutions Architect
> > --
> > Join Flink Forward - The Apache Flink Conference
> > Stream Processing | Event Driven | Real Time
> >
> > --
> > Data Artisans GmbH | Stresemannstr. 121A,10963 Berlin, Germany
> > data Artisans, Inc. | 1161 Mission Street, San Francisco, CA-94103, USA
> > --
> > Data Artisans GmbH
> > Registered at Amtsgericht Charlottenburg: HRB 158244 B
> > Managing Directors: Dr. Kostas Tzoumas, Dr. Stephan Ewen
> >
> >
>


Re: [DISCUSS] Releasing Flink 1.7.2

2019-02-08 Thread Till Rohrmann
+1 for starting this release next week.

On Thu, Feb 7, 2019 at 11:50 AM Chesnay Schepler  wrote:

> +1 to start the release next week, we've accumulated a nice set of
> changes and it's been more than a month since 1.7.1 came out.
>
> On 05.02.2019 16:32, Tzu-Li (Gordon) Tai wrote:
> > Hi Flink devs,
> >
> > What do you think about releasing Flink 1.7.2 soon?
> >
> > We already have some critical fixes in the release-1.7 branch:
> > - FLINK-11207: security vulnerability with currently used Apache
> > commons-compress version
> > - FLINK-11419: restore issue with StreamingFileSink
> > - FLINK-11436: restore issue with Flink's AvroSerializer
> > - FLINK-10761: potential deadlock with metrics system
> > - FLINK-10774: connection leak in FlinkKafkaConsumer
> > - FLINK-10848: problem with resource allocation in YARN mode
> >
> > Please let me know what you think. Ideally, we can kick off the release
> > vote for the first RC early next week.
> > If there are some other critical fixes for 1.7.2 that are almost
> completed
> > (already have a PR opened and review is in progress), please let me know
> > here by the end of the week to account for it for the 1.7.2 release.
> >
> > Cheers,
> > Gordon
> >
>
>


Re: [DISCUSS] Clean up and reorganize the JIRA components

2019-02-08 Thread Chesnay Schepler

Some concerns:

Travis and the build system / release system are entirely different things. I 
would even keep the release system away from the build system, as it is more 
about the release scripts and documentation, while the latter is about 
Maven. Actually, I'd just rename build-system to maven.


Control Plane is a term I've never heard before in this context; I'd 
replace it with Coordination.


The "Documentation" description refers to it as a "Fallback component". 
In other words, if I make a change to the metrics documentation I 
shouldn't use this component any more?


I don't see the benefit of a `Misc` major category. I'd attribute 
everything that doesn't have a major category implicitly to "Misc".


Not a fan of a generalized "Legacy components" category; this seems 
unnecessary. It's also a bit weird going forward as we'd have to touch 
every JIRA for a component if we drop it.


How come gelly/CEP don't have a Major category (libraries?)

"End to end infrastructure" is not equivalent to "E2E tests". 
Infrastructure is not about fixing failing tests, which is what we 
partially used this component for so far.


I don't believe you can get rid of the generic "Tests" component; 
consider any changes to the `flink-test-utils-junit` module.


You propose deleting "Core" and "Configuration" but haven't listed any 
migration paths.


If there's an API / Python category there should also be an API / Scala 
category. This could also include the scala-shell. Note that the 
existing Scala API category is not mentioned anywhere in the document.


How do you actually want to do the migration?

On 08.02.2019 13:13, Timo Walther wrote:

Hi Robert,

thanks for starting this discussion. I was also about to suggest 
splitting the `Table API & SQL` component because it already contains 
more than 1,000 issues.


My comments:

- Rename "SQL/Shell" to "SQL/Client" because the long-term goal might 
not only be a CLI interface. I would keep the generic name "SQL 
Client" for now. This is also what is written in FLIPs, presentations, 
and documentation.
- Rename "SQL/Query Planner" to "SQL/Planner": a query is a read-only 
operation, but we support things like INSERT INTO etc. "Planner" is more 
generic.
- Rename "Gelly" to "Graph Processing". New users don't know what 
Gelly means. This is the only component that has a "feature name". I 
don't know if we want to stick with that in the future.
- Not sure about this one: introduce a "SQL/Connectors" component? SQL 
connectors are tightly bound to SQL internals, but also to the 
connectors themselves.
- Rename "Connectors/HCatalog" to "Connectors/Hive". This name is more 
generic and reflects the efforts around Hive Metastore and catalog 
integration that are currently taking place.


Thanks,
Timo


On Feb 8, 2019, at 12:39, Robert Metzger wrote:

Hi all,

I am currently trying to improve how the Flink community is handling
incoming pull requests and JIRA tickets.

I've looked at how other big communities are handling such a high number of
contributions, and I found that many are using GitHub labels extensively.
An integral part of the label use is to tag PRs with the component / area
they belong to. I think the most obvious and logical way of tagging the PRs
is by using the JIRA components. This will force us to keep the JIRA
tickets well-organized, if we want the PRs to be organized :)
I will soon start a separate discussion for the GitHub labels.

Let's first discuss the JIRA components.

I've created the following Wiki page with my proposal of the new 
component,

and how to migrate from the existing components:
https://cwiki.apache.org/confluence/display/FLINK/Proposal+for+new+JIRA+Components 



Please comment here or directly in the Wiki to let me know what you 
think.


Best,
Robert








Re: Flink release schedule

2019-02-08 Thread Fabian Hueske
I like the idea of a roadmap that is kept up-to-date.
We've tried that in the past a couple of times but they became outdated
rather quickly.
We've also had some threads on the dev mailing list to discuss the scope
& features of next releases.

We could discuss and update the roadmap after every release (or even
feature freeze).
IMO, it would be good to have two things in the document.

* a long-term vision (~1 year) of where the project is heading
* a roadmap for the next release (~3 months)

Cheers, Fabian

Am Di., 5. Feb. 2019 um 16:55 Uhr schrieb Robert Metzger <
rmetz...@apache.org>:

> Hey Ufuk,
>
> I agree! We should have regular discussions here on the list for scoping
> releases, and then put this to the website.
> The Beam project has a pretty nice roadmap [1] available on their site.
> I actually don't really know if there's an agreed upon release schedule in
> the Flink community (I've not been following the dev@ list closely
> recently).
> I'll try to find somebody who knows a bit more about the recent Flink
> community developments to put together a first draft for such a roadmap.
>
> [1] https://beam.apache.org/roadmap/
>
> On Mon, Feb 4, 2019 at 9:29 AM Ufuk Celebi  wrote:
>
> > Hey devs,
> >
> > I've been only following Flink dev loosely in the last couple of
> > months and could not find the following information:
> >
> > What's the schedule for the next release? In particular, when is the
> > planned feature freeze and when is the planned release date?
> >
> > I think this can be valuable information for both devs and users. What
> > do you think about adding this information to the Flink website? It
> > does not have to be exact dates, but general information such as "we
> > do time based releases every X months", "we support the last two minor
> > releases with patches", etc. I think this page [1] would be a good
> > target for this information.
> >
> > Best,
> >
> > Ufuk
> >
> >
> > [1] https://flink.apache.org/community.html
> >
>


Re: [VOTE] Release flink-shaded 6.0, release candidate 1

2019-02-08 Thread Fabian Hueske
Thank you Chesnay!

* Checked the changes since the 5.0 release and didn't find anything
surprising.
* Signatures and hash are correct
* build from source archive passed

+1 (binding)

Best, Fabian

Am Do., 7. Feb. 2019 um 09:48 Uhr schrieb Nico Kruber :

> +1,
>
> - Checked signature and hashes
> - Built locally from RC tgz
> - verified new Netty version working using the maven repo
> -> https://travis-ci.org/apache/flink/builds/489628017
>
> On 06/02/2019 13:59, Chesnay Schepler wrote:
> > Hi everyone,
> > Please review and vote on the release candidate #1 for the version 6.0,
> > as follows:
> > [ ] +1, Approve the release
> > [ ] -1, Do not approve the release (please provide specific comments)
> >
> >
> > The complete staging area is available for your review, which includes:
> > * JIRA release notes [1],
> > * the official Apache source release to be deployed to dist.apache.org
> > [2], which are signed with the key with fingerprint 11D464BA [3],
> > * all artifacts to be deployed to the Maven Central Repository [4],
> > * source code tag "release-6.0-rc1" [5].
> > * website pull request listing the new release  [6].
> >
> > The vote will be open for at least 72 hours. It is adopted by majority
> > approval, with at least 3 PMC affirmative votes.
> >
> > Thanks,
> > Chesnay
> >
> > [1]
> >
> https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12344544
> >
> > [2] https://dist.apache.org/repos/dist/dev/flink/flink-shaded-6.0-rc1/
> > [3] https://dist.apache.org/repos/dist/release/flink/KEYS
> > [4]
> https://repository.apache.org/content/repositories/orgapacheflink-1204
> > [5]
> >
> https://gitbox.apache.org/repos/asf?p=flink-shaded.git;a=tag;h=refs/tags/release-6.0-rc1
> >
> > [6] https://github.com/apache/flink-web/pull/154
> >
>
> --
> Nico Kruber | Solutions Architect
> --
> Join Flink Forward - The Apache Flink Conference
> Stream Processing | Event Driven | Real Time
>
> --
> Data Artisans GmbH | Stresemannstr. 121A,10963 Berlin, Germany
> data Artisans, Inc. | 1161 Mission Street, San Francisco, CA-94103, USA
> --
> Data Artisans GmbH
> Registered at Amtsgericht Charlottenburg: HRB 158244 B
> Managing Directors: Dr. Kostas Tzoumas, Dr. Stephan Ewen
>
>


Re: [DISCUSS] Clean up and reorganize the JIRA components

2019-02-08 Thread Timo Walther

Hi Robert,

thanks for starting this discussion. I was also about to suggest 
splitting the `Table API & SQL` component because it already contains 
more than 1000 issues.


My comments:

- Rename "SQL/Shell" to "SQL/Client" because the long-term goal might 
not only be a CLI interface. I would keep the generic name "SQL Client" 
for now. This is also what is written in FLIPs, presentations, and 
documentation.
- Rename "SQL/Query Planner" to "SQL/Planner": a query is a read-only 
operation, but we support things like INSERT INTO etc. Planner is more 
generic.
- Rename "Gelly" to "Graph Processing". New users don't know what Gelly 
means. This is the only component that has a "feature name". I don't 
know if we want to stick with that in the future.
- Not sure about this: Introduce a "SQL/Connectors"? Because SQL 
connectors are tightly bound to SQL internals but also to the connector 
itself.
- Rename "Connectors/HCatalog" to "Connectors/Hive". This name is more 
generic and reflects the efforts around Hive Metastore and catalog 
integration that are currently taking place.


Thanks,
Timo


Am 08.02.19 um 12:39 schrieb Robert Metzger:

Hi all,

I am currently trying to improve how the Flink community is handling
incoming pull requests and JIRA tickets.

I've looked at how other big communities are handling such a high number of
contributions, and I found that many are using GitHub labels extensively.
An integral part of the label use is to tag PRs with the component / area
they belong to. I think the most obvious and logical way of tagging the PRs
is by using the JIRA components. This will force us to keep the JIRA
tickets well-organized, if we want the PRs to be organized :)
I will soon start a separate discussion for the GitHub labels.

Let's first discuss the JIRA components.

I've created the following Wiki page with my proposal of the new component,
and how to migrate from the existing components:
https://cwiki.apache.org/confluence/display/FLINK/Proposal+for+new+JIRA+Components

Please comment here or directly in the Wiki to let me know what you think.

Best,
Robert





Re: [DISCUSS] Enhance Operator API to Support Dynamically Selective Reading and EndOfInput Event

2019-02-08 Thread Stephan Ewen
Nice design proposal, and +1 to the general idea.

A few thoughts / suggestions:

*binary vs. n-ary*

I would plan ahead for N-ary operators. Not because we necessarily need
n-ary inputs (one can probably build that purely in the API) but because of
future side inputs. The proposal should be able to handle that as well.

*enum vs. integer*

The above might be easier to realize when going directly with an integer
and having ANY, FIRST, SECOND, etc. as pre-defined constants.
Performance-wise, there is probably no difference between using int or enum.

*generic selectable interface*

From the proposal, I don't quite understand what that interface is for. My
understanding is that the input processor or task that calls the
operator's functions would anyway work on the TwoInputStreamOperator
interface, for efficiency.

*end-input*

I think we should not make storing the end-input the operator's
responsibility.
There is a simple way to handle this, which is also consistent with other
aspects of handling finished tasks:

  - If a task is finished, that should be stored in the checkpoint.
  - Upon restoring a finished task, if it still has running successors, we
deploy a "finished input channel", which immediately sends the "end of
input" when the task is started.
  - The operator will hence set the end of input immediately again upon
restore.
*early-out*

Letting nextSelection() return “NONE” or “FINISHED" may be relevant for
early-out cases, but I would remove this from the scope of this proposal.
There are most likely other big changes involved, like communicating this
to the upstream operators.

*distributed stream deadlocks*

We had this issue in the DataSet API. Earlier versions of the DataSet API
made an analysis of the flow detecting dams and whether the pipeline
breaking behavior in the flow would cause deadlocks, and introduce
artificial pipeline breakers in response.

The logic was really complicated and it took a while to become stable. We
had several issues where certain user functions (like mapPartition) could
either be pipelined or have a full dam (impossible for the system to know),
so we had to insert artificial pipeline breakers in all paths.

In the end we simply decided that in the case of a diamond-style flow, we
make the point where the flow first forks a blocking shuffle. That was
super simple, solved all issues, and has the additional nice property that
it is a great point to materialize data for recovery, because it helps both
paths of the diamond upon failure.

My suggestion:
==> For streaming, no problem so far, nothing to do
==> For batch, would suggest to go with the simple solution described above
first, and improve when we see cases where this impacts performance
significantly

*empty input / selection timeout*

I can see that being relevant in future streaming cases, for example with
side inputs. You want to wait for the side input data, but with a timeout,
so the program can still proceed with non-perfect context data in case that
context data is very late.

Because we do not support side inputs at the moment, we may want to defer
this for now. Let's not over-design for problems that are not well
understood at this point.

*timers*

I don't understand the problem with timers. Timers are bound to the
operator, not the input, so they should still work if an input ends.
There are cases where some state in the operator that is only relevant as
long as an input still has data (like in symmetric joins) and the timers
are relevant to that state.
When the state is dropped, the timers should also be dropped, but that is
the operator's logic on "endInput()". So there is no inherent issue between
input and timers.

Best,
Stephan


On Sat, Feb 2, 2019 at 3:55 AM Guowei Ma  wrote:

> Hi, guys:
> I propose a design to enhance Stream Operator API for Batch’s requirements.
> This is also the Flink’s goal that Batch is a special case of Streaming.
> This
> proposal mainly contains two changes to operator api:
>
> 1. Allow "StreamOperator" can choose which input to read;
> 2. Notify "StreamOperator" that an input has ended.
>
>
> This proposal was discussed offline with Piotr Nowojski, Kostas Kloudas,
> and Haibo Sun.
> It will be great to hear feedback and suggestions from the community.
> Please kindly share your comments and suggestions.
>
> Best
> GuoWei Ma.
>
>  Enhance Operator API to Support Dynamically Sel...
> <
> https://docs.google.com/document/d/10k5pQm3SkMiK5Zn1iFDqhQnzjQTLF0Vtcbc8poB4_c8/edit?usp=drive_web
> >
>
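The two operator-API changes quoted above can be illustrated with a small, self-contained sketch. This is NOT Flink's actual operator API; all names below (`TwoInputOperator`, `BuildThenProbe`, the `ANY`/`FIRST`/`SECOND` constants, the `run` driver) are hypothetical and only demonstrate the two proposed ideas — an operator choosing which input to read next, and being notified when an input has ended — using the pre-defined integer constants suggested earlier in the thread:

```java
// Toy sketch (not Flink's real API) of the two proposed operator hooks:
// nextSelection() picks which input the runtime reads next, and
// endInput() notifies the operator that an input has finished.
import java.util.ArrayDeque;
import java.util.Queue;

public class TwoInputDemo {

    // Pre-defined integer constants instead of an enum, as discussed.
    static final int ANY = -1;
    static final int FIRST = 0;
    static final int SECOND = 1;

    interface TwoInputOperator {
        void processElement1(String value);
        void processElement2(String value);
        int nextSelection();        // which input the runtime should read next
        void endInput(int inputId); // notification that an input has ended
    }

    // Example operator: consume the SECOND ("build") input fully before
    // reading the FIRST ("probe") input.
    static class BuildThenProbe implements TwoInputOperator {
        final StringBuilder log = new StringBuilder();
        private boolean buildDone = false;

        public void processElement1(String v) { log.append("probe:").append(v).append(' '); }
        public void processElement2(String v) { log.append("build:").append(v).append(' '); }
        public int nextSelection() { return buildDone ? FIRST : SECOND; }
        public void endInput(int inputId) { if (inputId == SECOND) buildDone = true; }
    }

    // Minimal driver standing in for the task's input processor: it honors
    // the operator's selection and signals endInput when a queue drains.
    static String run(BuildThenProbe op, Queue<String> in1, Queue<String> in2) {
        boolean end1 = false, end2 = false;
        while (!end1 || !end2) {
            if (op.nextSelection() == SECOND && !end2) {
                op.processElement2(in2.poll());
                if (in2.isEmpty()) { end2 = true; op.endInput(SECOND); }
            } else if (!end1) {
                op.processElement1(in1.poll());
                if (in1.isEmpty()) { end1 = true; op.endInput(FIRST); }
            }
        }
        return op.log.toString().trim();
    }

    public static void main(String[] args) {
        Queue<String> in1 = new ArrayDeque<>(java.util.List.of("a1", "a2"));
        Queue<String> in2 = new ArrayDeque<>(java.util.List.of("b1", "b2"));
        System.out.println(run(new BuildThenProbe(), in1, in2));
        // prints: build:b1 build:b2 probe:a1 probe:a2
    }
}
```

A build-then-probe selection like this is the classic hash-join pattern: the operator must see all of the build side — and be told via endInput() that it is finished — before it can start reading the probe side, which is exactly the kind of batch requirement the proposal targets.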


[DISCUSS] Clean up and reorganize the JIRA components

2019-02-08 Thread Robert Metzger
Hi all,

I am currently trying to improve how the Flink community is handling
incoming pull requests and JIRA tickets.

I've looked at how other big communities are handling such a high number of
contributions, and I found that many are using GitHub labels extensively.
An integral part of the label use is to tag PRs with the component / area
they belong to. I think the most obvious and logical way of tagging the PRs
is by using the JIRA components. This will force us to keep the JIRA
tickets well-organized, if we want the PRs to be organized :)
I will soon start a separate discussion for the GitHub labels.

Let's first discuss the JIRA components.

I've created the following Wiki page with my proposal of the new component,
and how to migrate from the existing components:
https://cwiki.apache.org/confluence/display/FLINK/Proposal+for+new+JIRA+Components

Please comment here or directly in the Wiki to let me know what you think.

Best,
Robert


[jira] [Created] (FLINK-11553) DispatcherHATest.testFailingRecoveryIsAFatalError fails locally with @Nonnull check

2019-02-08 Thread Andrey Zagrebin (JIRA)
Andrey Zagrebin created FLINK-11553:
---

 Summary: DispatcherHATest.testFailingRecoveryIsAFatalError fails 
locally with @Nonnull check
 Key: FLINK-11553
 URL: https://issues.apache.org/jira/browse/FLINK-11553
 Project: Flink
  Issue Type: Bug
  Components: Distributed Coordination, Tests
Affects Versions: 1.8.0
Reporter: Andrey Zagrebin
Assignee: Andrey Zagrebin
 Fix For: 1.8.0


DispatcherHATest.testFailingRecoveryIsAFatalError fails because it tries to 
instantiate HATestingDispatcher with fencingTokens = null, which is annotated 
as @Nonnull.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)


Re: [DISCUSS] Contributing Chinese website and docs to Apache Flink

2019-02-08 Thread Jark Wu
Hi Rong,

You raised a good question. The terminology translation is a difficult
problem because different people may prefer different translations.
So I would like to start a terminology translation specification discussion
(maybe it should be another thread) to reach a consensus in the community.

Here is the draft:
https://docs.google.com/document/d/1zhGyPU4bVJ7mCIdUTsQr9lHLn4vltejhngG_SOV6g1w/edit?usp=sharing
Maybe we can put it on the Flink website when we reach an agreement.

Best,
Jark


On Thu, 7 Feb 2019 at 02:14, Rong Rong  wrote:

> +1. Thanks for the proposal @Jark.
>
> I think utilizing review-bot will definitely be a good plus.
> I remember @Robert mentioned that there was an auto checklist
> functionality; maybe we can utilize that to flag/tag that a specific PR is
> ready for documentation parity review?
>
> I would also like to follow up on a good question @Yun Tang brought up:
> sometimes an accurate translation from state-of-the-art English terminology
> is not available yet in Chinese. I was wondering what would be the best
> practices for reviewers or committers: creating a follow up JIRA is
> definitely one good solution but in the long run the differences might pile
> up.
>
> Looking forward to contributing to this effort!
>
> Thanks,
> Rong
>
>
>
> On Tue, Feb 5, 2019 at 4:10 AM Jark Wu  wrote:
>
> > Hi all,
> >
> > Thanks for the positive feedback we received so far.
> >
> > I have created two umbrella JIRA for tracking the efforts:
> >
> > For flink-web: https://issues.apache.org/jira/browse/FLINK-11526
> > For flink-docs: https://issues.apache.org/jira/browse/FLINK-11529
> >
> > I'm working on the framework changes for flink-web first.
> >
> > Please feel free to further join the contribution and translation.
> >
> > Best,
> > Jark
> >
> > On Tue, 5 Feb 2019 at 18:27, Stephan Ewen  wrote:
> >
> > > Nice proposal
> > >
> > > +1
> > >
> > > On Tue, Feb 5, 2019 at 8:16 AM Hequn Cheng 
> wrote:
> > >
> > > > Hi,
> > > >
> > > > big +1 for this!
> > > > Thank you Jark for writing the great document. I totally agree with
> > > > the proposal. It not only keeps the English documentation the same as
> > > > before but also proposes a way to integrate with the Chinese one.
> > > >
> > > > Best, Hequn
> > > >
> > > >
> > > >
> > > > On Mon, Feb 4, 2019 at 6:18 PM Fabian Hueske 
> > wrote:
> > > >
> > > > > Happy Chinese New Year!
> > > > >
> > > > > Thank you Jark for working on a prototype and writing up the this
> > great
> > > > > proposal.
> > > > > +1 for the suggested approach
> > > > >
> > > > > Best,
> > > > > Fabian
> > > > >
> > > > > Am Mo., 4. Feb. 2019 um 11:05 Uhr schrieb jincheng sun <
> > > > > sunjincheng...@gmail.com>:
> > > > >
> > > > > > +1 , Happy Chinese New Year! :)
> > > > > >
> > > > > > Best,Jincheng
> > > > > > From Singapore
> > > > > >
> > > > > > Jark Wu 于2019年2月4日 周一13:23写道:
> > > > > >
> > > > > >> Hi all,
> > > > > >>
> > > > > >> I have drafted the proposal:
> > > > > >>
> > > > > >>
> > > > >
> > > >
> > >
> >
> https://docs.google.com/document/d/1R1-uDq-KawLB8afQYrczfcoQHjjIhq6tvUksxrfhBl0/edit#
> > > > > >> Please feel free to give any feedbacks!
> > > > > >>
> > > > > >> I have also created a prototype to integrate Chinese for
> > flink-web,
> > > > here
> > > > > >> is
> > > > > >> the branch:
> > > https://github.com/wuchong/flink-web/tree/multi-language
> > > > > >>
> > > > > >> @Fabian Hueske   +1 to start contribute
> > > flink-web
> > > > > >> first.
> > > > > >>
> > > > > >> @Gordon That's a good idea to add the document checklist to the
> > > review
> > > > > >> process. I have included this in the proposal. What do you think
> > > > @Robert
> > > > > >> Metzger   @Fabian Hueske <
> fhue...@gmail.com>
> > > ?
> > > > > >>
> > > > > >> Btw, Happy Chinese New Year! :)
> > > > > >>
> > > > > >> Cheers, Jark
> > > > > >>
> > > > > >>
> > > > > >> On Fri, 1 Feb 2019 at 18:22, Tzu-Li (Gordon) Tai <
> > > tzuli...@apache.org
> > > > >
> > > > > >> wrote:
> > > > > >>
> > > > > >> > Hi,
> > > > > >> >
> > > > > >> > Great seeing efforts in pushing this forward, Jark!
> > > > > >> >
> > > > > >> > A bit late in the party here, but I just want to also point
> out
> > > > that:
> > > > > >> >
> > > > > >> > - +1 to https://flink.apache.org/zh/ as the website address
> > > > > >> > - Add to the contribution checklist in PRs that both English /
> > > > Chinese
> > > > > >> docs
> > > > > >> > have been updated accordingly.
> > > > > >> >
> > > > > >> > I want to point out that, if a contribution touches either the
> > > > English
> > > > > >> or
> > > > > >> > Chinese documentation, then both sides should be updated
> before
> > > the
> > > > > >> > contribution can be merged.
> > > > > >> > Otherwise, in the long run, it'll be very hard to keep track of
> > > > > >> > which parts of the documentation require updates.
> > > > > >> >
> > > > > >> > For this to work, we might need a list of known
> Chinese-speaking
> > > > > >> > contributors who would be will