Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-21 Thread Jungtaek Lim
Got it. I just removed 3.0.0 wherever multiple versions were listed, except
for SPARK-25431, which keeps the (2.4.1, 3.0.0) pair since the other version
targets a bugfix release.

https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%203.0.0
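For reference, the JQL behind that link is just `project = SPARK AND fixVersion = 3.0.0`. A minimal sketch of building such a filter URL with the Python standard library (the helper name is mine, not part of any Spark tooling):

```python
from urllib.parse import quote

def jira_filter_url(project: str, fix_version: str) -> str:
    """Build an issues.apache.org filter URL for a project/fixVersion JQL pair."""
    jql = f"project = {project} AND fixVersion = {fix_version}"
    # quote() percent-encodes the spaces and '=' signs, as in the link above
    return "https://issues.apache.org/jira/issues/?jql=" + quote(jql)

print(jira_filter_url("SPARK", "3.0.0"))
# → https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%203.0.0
```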



Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-21 Thread Wenchen Fan
Thanks! If both versions are specified, yes, we can just remove 3.0.0.


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-20 Thread Jungtaek Lim
OK got it. Thanks for clarifying.

I can help check and modify the versions, but I'm not sure about the case
where both versions are specified, like "2.4.0/3.0.0". Would removing 3.0.0
work in that case?


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-20 Thread Wenchen Fan
There is an issue in the merge script: when resolving a ticket, the default
fix version is 3.0.0. I guess someone forgot to type the fix version, which
led to this mistake.
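As a sketch of the failure mode (a simplified, hypothetical stand-in for the prompt logic in dev/merge_spark_pr.py, not the actual script): when the committer's answer is left empty, the default silently wins.

```python
DEFAULT_FIX_VERSION = "3.0.0"  # assumed default, per the discussion above

def resolve_fix_version(answer: str, default: str = DEFAULT_FIX_VERSION) -> str:
    """Return the committer's typed fix version, or the default if left blank."""
    answer = answer.strip()
    return answer if answer else default

print(resolve_fix_version("2.4.0"))  # → 2.4.0 (explicit answer wins)
print(resolve_fix_version(""))       # → 3.0.0 (forgotten answer becomes the default)
```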


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-20 Thread Jungtaek Lim
Ah, these issues were resolved before branch-2.4 was cut, like SPARK-24441:

https://github.com/apache/spark/blob/v2.4.0-rc1/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/HDFSBackedStateStoreProvider.scala

SPARK-24441 is included in Spark 2.4.0 RC1 but is set to 3.0.0. I heard
there's a step where issue versions are aligned with the new release when the
branch/RC is cut, but it doesn't look like that happened for some issues.


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-20 Thread Holden Karau
So normally during the release process, if a fix is in branch-2.4 but not
part of the current RC, we set the resolved version to 2.4.1; then, if we
roll a new RC, we switch the 2.4.1 issues to 2.4.0.
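The bookkeeping described above can be sketched as a single pass over the resolved issues (the issue keys and data shape here are hypothetical, for illustration only):

```python
def retarget_fix_versions(fix_versions: dict, old: str = "2.4.1", new: str = "2.4.0") -> dict:
    """When a new RC is rolled, switch issues parked on the next patch version
    (2.4.1) back to the version the RC will actually ship as (2.4.0)."""
    return {key: (new if version == old else version)
            for key, version in fix_versions.items()}

# Hypothetical issue keys:
resolved = {"SPARK-00001": "2.4.1", "SPARK-00002": "2.5.0"}
print(retarget_fix_versions(resolved))
# → {'SPARK-00001': '2.4.0', 'SPARK-00002': '2.5.0'}
```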


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-20 Thread Jungtaek Lim
I also noticed there are some fixed issues which are included in branch-2.4
but whose fix versions are still 3.0.0. Would we want to update the versions
to 2.4.0? If we are not planning to run some automation to correct it, I'm
happy to fix them.
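If that automation were written against the standard JIRA REST v2 API, it would boil down to one issue-edit request per ticket; a hedged sketch that only builds the request, without sending it (endpoint path per the public JIRA REST API docs, issue key hypothetical):

```python
import json

def fix_version_edit(issue_key: str, version: str) -> tuple:
    """(path, JSON body) for a JIRA 'edit issue' call replacing fixVersions.

    The real update would be: PUT {path} with this body, plus auth headers.
    """
    path = f"/rest/api/2/issue/{issue_key}"
    body = json.dumps({"fields": {"fixVersions": [{"name": version}]}})
    return path, body

path, body = fix_version_edit("SPARK-00001", "2.4.0")  # hypothetical issue key
print(path)
print(body)
```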


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-20 Thread Weichen Xu
We need to merge this:
https://github.com/apache/spark/pull/22492
Otherwise MLeap cannot build against Spark 2.4.0.
Thanks!


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-18 Thread Yinan Li
FYI: SPARK-23200 has been resolved.

On Tue, Sep 18, 2018 at 8:49 AM Felix Cheung 
wrote:

> If we could work on this quickly - it might get on to future RCs.
>
>
>
> --
> *From:* Stavros Kontopoulos 
> *Sent:* Monday, September 17, 2018 2:35 PM
> *To:* Yinan Li
> *Cc:* Xiao Li; eerla...@redhat.com; van...@cloudera.com.invalid; Sean
> Owen; Wenchen Fan; dev
> *Subject:* Re: [VOTE] SPARK 2.4.0 (RC1)
>
> Hi Xiao,
>
> I just tested it; it seems OK. There are some questions about which
> properties we should keep when restoring the config. Otherwise it looks OK
> to me.
> The reason this should go into 2.4 is that streaming on K8s is something
> people want to try on day one (or at least it is cool to try), and since 2.4
> comes with the K8s support heavily refactored,
> it would be disappointing not to have it in... IMHO.
>
> Best,
> Stavros
>
> On Mon, Sep 17, 2018 at 11:13 PM, Yinan Li  wrote:
>
>> We can merge the PR and get SPARK-23200 resolved if the whole point is to
>> make streaming on k8s work first. But given that this is not a blocker for
>> 2.4, I think we can take a bit more time here and get it right. With that
>> being said, I would expect it to be resolved soon.
>>
>> On Mon, Sep 17, 2018 at 11:47 AM Xiao Li  wrote:
>>
>>> Hi, Erik and Stavros,
>>>
>>> This bug fix, SPARK-23200, is not a blocker for the 2.4 release, but it
>>> sounds important for streaming on K8s. Could the K8s-oriented committers
>>> speed up the reviews?
>>>
>>> Thanks,
>>>
>>> Xiao
>>>
>>> On Mon, Sep 17, 2018 at 11:04 AM, Erik Erlandson wrote:
>>>
>>>>
>>>> I have no binding vote but I second Stavros’ recommendation for
>>>> spark-23200
>>>>
>>>> Per parallel threads on Py2 support I would also like to propose
>>>> deprecating Py2 starting with this 2.4 release
>>>>
>>>> On Mon, Sep 17, 2018 at 10:38 AM Marcelo Vanzin
>>>>  wrote:
>>>>
>>>>> You can log in to https://repository.apache.org and see what's wrong.
>>>>> Just find that staging repo and look at the messages. In your case it
>>>>> seems related to your signature.
>>>>>
>>>>> failureMessage: No public key: Key with id: () was not able to be
>>>>> located on http://gpg-keyserver.de/. Upload your public key and try
>>>>> the operation again.
>>>>> On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan 
>>>>> wrote:
>>>>> >
>>>>> > I confirmed that
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1285
>>>>> is not accessible. I did it via ./dev/create-release/do-release-docker.sh
>>>>> -d /my/work/dir -s publish, not sure what's going wrong. I didn't see any
>>>>> error message during it.
>>>>> >
>>>>> > Any insights are appreciated! So that I can fix it in the next RC.
>>>>> Thanks!
>>>>> >
>>>>> > On Mon, Sep 17, 2018 at 11:31 AM Sean Owen 
>>>>> wrote:
>>>>> >>
>>>>> >> I think one build is enough, but haven't thought it through. The
>>>>> >> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>>>>> >> best advertised as a 'beta'. So maybe publish a no-hadoop build of
>>>>> it?
>>>>> >> Really, whatever's the easy thing to do.
>>>>> >> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
>>>>> wrote:
>>>>> >> >
>>>>> >> > Ah I missed the Scala 2.12 build. Do you mean we should publish a
>>>>> Scala 2.12 build this time? Current for Scala 2.11 we have 3 builds: with
>>>>> hadoop 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing 
>>>>> for
>>>>> Scala 2.12?
>>>>> >> >
>>>>> >> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen 
>>>>> wrote:
>>>>> >> >>
>>>>> >> >> A few preliminary notes:
>>>>> >> >>
>>>>> >> >> Wenchen for some weird reason when I hit your key in gpg
>>>>> --import, it
>>>>> >> >> asks for a passphrase. When I skip it, it's fine, gpg can still
>>>>> verify
>>>>> >> >> the signature. No issue there really.
>>>

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-18 Thread Felix Cheung
If we could work on this quickly, it might get into future RCs.




From: Stavros Kontopoulos 
Sent: Monday, September 17, 2018 2:35 PM
To: Yinan Li
Cc: Xiao Li; eerla...@redhat.com; van...@cloudera.com.invalid; Sean Owen; 
Wenchen Fan; dev
Subject: Re: [VOTE] SPARK 2.4.0 (RC1)

Hi Xiao,

I just tested it, it seems ok. There are some questions about which properties 
we should keep when restoring the config. Otherwise it looks ok to me.
The reason this should go in 2.4 is that streaming on k8s is something people 
want to try day one (or at least it is cool to try) and since 2.4 comes with 
k8s support being refactored a lot,
it would be disappointing not to have it in...IMHO.

Best,
Stavros

On Mon, Sep 17, 2018 at 11:13 PM, Yinan Li 
mailto:liyinan...@gmail.com>> wrote:
We can merge the PR and get SPARK-23200 resolved if the whole point is to make 
streaming on k8s work first. But given that this is not a blocker for 2.4, I 
think we can take a bit more time here and get it right. With that being said, 
I would expect it to be resolved soon.

On Mon, Sep 17, 2018 at 11:47 AM Xiao Li 
mailto:gatorsm...@gmail.com>> wrote:
Hi, Erik and Stavros,

This bug fix SPARK-23200 is not a blocker of the 2.4 release. It sounds 
important for the Streaming on K8S. Could the K8S oriented committers speed up 
the reviews?

Thanks,

Xiao

Erik Erlandson mailto:eerla...@redhat.com>> 于2018年9月17日周一 
上午11:04写道:

I have no binding vote but I second Stavros’ recommendation for spark-23200

Per parallel threads on Py2 support I would also like to propose deprecating 
Py2 starting with this 2.4 release

On Mon, Sep 17, 2018 at 10:38 AM Marcelo Vanzin  
wrote:
You can log in to https://repository.apache.org and see what's wrong.
Just find that staging repo and look at the messages. In your case it
seems related to your signature.

failureMessage: No public key: Key with id: () was not able to be
located on http://gpg-keyserver.de/. Upload your public key and try
the operation again.
On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan 
mailto:cloud0...@gmail.com>> wrote:
>
> I confirmed that 
> https://repository.apache.org/content/repositories/orgapachespark-1285 is not 
> accessible. I did it via ./dev/create-release/do-release-docker.sh -d 
> /my/work/dir -s publish , not sure what's going wrong. I didn't see any error 
> message during it.
>
> Any insights are appreciated! So that I can fix it in the next RC. Thanks!
>
> On Mon, Sep 17, 2018 at 11:31 AM Sean Owen 
> mailto:sro...@apache.org>> wrote:
>>
>> I think one build is enough, but haven't thought it through. The
>> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>> Really, whatever's the easy thing to do.
>> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
>> mailto:cloud0...@gmail.com>> wrote:
>> >
>> > Ah I missed the Scala 2.12 build. Do you mean we should publish a Scala 
>> > 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with hadoop 
>> > 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for Scala 
>> > 2.12?
>> >
>> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen 
>> > mailto:sro...@apache.org>> wrote:
>> >>
>> >> A few preliminary notes:
>> >>
>> >> Wenchen for some weird reason when I hit your key in gpg --import, it
>> >> asks for a passphrase. When I skip it, it's fine, gpg can still verify
>> >> the signature. No issue there really.
>> >>
>> >> The staging repo gives a 404:
>> >> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>> >> [id=orgapachespark-1285] exists but is not exposed.
>> >>
>> >> The (revamped) licenses are OK, though there are some minor glitches
>> >> in the final release tarballs (my fault) : there's an extra directory,
>> >> and the source release has both binary and source licenses. I'll fix
>> >> that. Not strictly necessary to reject the release over those.
>> >>
>> >> Last, when I check the staging repo I'll get my answer, but, were you
>> >> able to build 2.12 artifacts as well?
>> >>
>> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
>> >> mailto:cloud0...@gmail.com>> wrote:
>> >> >
>> >> > Please vote on releasing the following candidate as Apache Spark 
>> >> > version 2.4.0.
>> >> >
>> >> > The vote is open until September 20 PST and passes if a majority +1 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-18 Thread Wenchen Fan
Thanks Marcelo for pointing out my gpg key issue! I've re-generated the key
and uploaded it to the ASF Spark repo. Let's see if it works in the next RC.
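
For anyone hitting the same staging failure, the fix amounts to making the
signing key publicly resolvable before publishing. A rough sketch, where the
key id and keyserver are placeholders rather than the actual values used for
this release:

```shell
# Sketch only: publish the RC signing key so the Nexus staging check can
# verify signatures. KEYID and the keyserver below are placeholders.
KEYID="<LONG_KEY_ID>"
command -v gpg >/dev/null || echo "gpg is not installed on this machine"
# Find the long id of your signing key:
#   gpg --list-secret-keys --keyid-format long
# Publish the public key to a keyserver the staging check can reach:
#   gpg --keyserver hkp://keyserver.ubuntu.com --send-keys "$KEYID"
# And export it (armored) for the ASF KEYS file at
# https://dist.apache.org/repos/dist/dev/spark/KEYS:
#   gpg --armor --export "$KEYID"
echo "key to publish: $KEYID"
```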

Thanks Saisai for pointing out the Python doc issue; I'll fix it in the next RC.

This RC fails because:
1. it doesn't include a Scala 2.12 build
2. the gpg key issue
3. the Python doc issue
4. some other potential blocker issues.

I'll start RC2 once these blocker issues are either resolved or we decide
to mark them as non-blocker.

Thanks,
Wenchen

On Tue, Sep 18, 2018 at 9:48 PM Marco Gaido  wrote:

> Sorry but I am -1 because of what was reported here:
> https://issues.apache.org/jira/browse/SPARK-22036?focusedCommentId=16618104&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16618104
> It is a regression unfortunately. Although the impact is not huge and there
> are workarounds, I think we should include the fix in 2.4.0. I created
> SPARK-25454 and submitted a PR for it.
> Sorry for the trouble.

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-18 Thread Marco Gaido
Sorry but I am -1 because of what was reported here:
https://issues.apache.org/jira/browse/SPARK-22036?focusedCommentId=16618104&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-16618104
It is a regression unfortunately. Although the impact is not huge and there
are workarounds, I think we should include the fix in 2.4.0. I created
SPARK-25454 and submitted a PR for it.
Sorry for the trouble.

Il giorno mar 18 set 2018 alle ore 05:23 Holden Karau 
ha scritto:

> Deprecating Py 2 in the 2.4 release probably doesn't belong in the RC vote
> thread. Personally I think we might be a little too late in the game to
> deprecate it in 2.4, but I think calling it out as "soon to be deprecated"
> in the release docs would be sensible to give folks extra time to prepare.

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Holden Karau
Deprecating Py 2 in the 2.4 release probably doesn't belong in the RC vote
thread. Personally I think we might be a little too late in the game to
deprecate it in 2.4, but I think calling it out as "soon to be deprecated"
in the release docs would be sensible to give folks extra time to prepare.

On Mon, Sep 17, 2018 at 2:04 PM Erik Erlandson  wrote:

>
> I have no binding vote but I second Stavros’ recommendation for spark-23200
>
> Per parallel threads on Py2 support I would also like to propose
> deprecating Py2 starting with this 2.4 release

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Saisai Shao
Hi Wenchen,

I think you need to set SPHINXPYTHON to python3 before building the docs,
to work around the doc issue (
https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/_site/api/python/pyspark.ml.html#pyspark.ml.classification.LogisticRegression
).

Here are the notes from the release page:


>    - Ensure you have Python 3 with Sphinx installed, and the SPHINXPYTHON
>      environment variable set to indicate your Python 3 executable (see
>      SPARK-24530).
>
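
A minimal sketch of that workaround, assuming a python3 (with Sphinx
installed) is on the PATH; the rebuild command at the end is only indicative:

```shell
# Point the Spark doc build's Sphinx at Python 3 via SPHINXPYTHON (SPARK-24530).
SPHINXPYTHON="$(command -v python3)"
export SPHINXPYTHON
echo "Sphinx will run under: $SPHINXPYTHON"
# Then rebuild the docs, e.g. by re-running the release script's doc step
# (step name assumed):
#   ./dev/create-release/do-release-docker.sh -d /my/work/dir -s docs
```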
>
Wenchen Fan  于2018年9月17日周一 上午10:48写道:

> Please vote on releasing the following candidate as Apache Spark version
> 2.4.0.
>
> The vote is open until September 20 PST and passes if a majority +1 PMC
> votes are cast, with
> a minimum of 3 +1 votes.
>
> [ ] +1 Release this package as Apache Spark 2.4.0
> [ ] -1 Do not release this package because ...
>
> To learn more about Apache Spark, please see http://spark.apache.org/
>
> The tag to be voted on is v2.4.0-rc1 (commit
> 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
> https://github.com/apache/spark/tree/v2.4.0-rc1
>
> The release files, including signatures, digests, etc. can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
>
> Signatures used for Spark RCs can be found in this file:
> https://dist.apache.org/repos/dist/dev/spark/KEYS
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1285/
>
> The documentation corresponding to this release can be found at:
> https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>
> The list of bug fixes going into 2.4.0 can be found at the following URL:
> https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>
> FAQ
>
> =
> How can I help test this release?
> =
>
> If you are a Spark user, you can help us test this release by taking
> an existing Spark workload and running it on this release candidate, then
> reporting any regressions.
>
> If you're working in PySpark you can set up a virtual env and install
> the current RC and see if anything important breaks; in Java/Scala
> you can add the staging repository to your project's resolvers and test
> with the RC (make sure to clean up the artifact cache before/after so
> you don't end up building with an out-of-date RC going forward).
>
> ===
> What should happen to JIRA tickets still targeting 2.4.0?
> ===
>
> The current list of open tickets targeted at 2.4.0 can be found at:
> https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 2.4.0
>
> Committers should look at those and triage. Extremely important bug
> fixes, documentation, and API tweaks that impact compatibility should
> be worked on immediately. Everything else please retarget to an
> appropriate release.
>
> ==
> But my bug isn't fixed?
> ==
>
> In order to make timely releases, we will typically not hold the
> release unless the bug in question is a regression from the previous
> release. That being said, if there is something which is a regression
> that has not been correctly targeted please ping me or a committer to
> help target the issue.
>


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Stavros Kontopoulos
Hi Xiao,

I just tested it and it seems OK. There are some open questions about which
properties we should keep when restoring the config, but otherwise it looks
fine to me.
The reason this should go into 2.4 is that streaming on k8s is something
people want to try on day one (or at least it is cool to try), and since 2.4
comes with the k8s support having been refactored a lot, it would be
disappointing not to have it in... IMHO.

Best,
Stavros

On Mon, Sep 17, 2018 at 11:13 PM, Yinan Li  wrote:

> We can merge the PR and get SPARK-23200 resolved if the whole point is to
> make streaming on k8s work first. But given that this is not a blocker for
> 2.4, I think we can take a bit more time here and get it right. With that
> being said, I would expect it to be resolved soon.

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Yinan Li
We can merge the PR and get SPARK-23200 resolved if the whole point is to
make streaming on k8s work first. But given that this is not a blocker for
2.4, I think we can take a bit more time here and get it right. With that
being said, I would expect it to be resolved soon.

On Mon, Sep 17, 2018 at 11:47 AM Xiao Li  wrote:

> Hi, Erik and Stavros,
>
> This bug fix SPARK-23200 is not a blocker of the 2.4 release. It sounds
> important for the Streaming on K8S. Could the K8S oriented committers speed
> up the reviews?
>
> Thanks,
>
> Xiao

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Xiao Li
Hi, Erik and Stavros,

This bug fix, SPARK-23200, is not a blocker for the 2.4 release, but it
sounds important for streaming on K8s. Could the K8s-oriented committers
speed up the reviews?

Thanks,

Xiao

Erik Erlandson  于2018年9月17日周一 上午11:04写道:

>
> I have no binding vote, but I second Stavros’ recommendation for SPARK-23200.
>
> Per the parallel threads on Py2 support, I would also like to propose
> deprecating Py2 starting with this 2.4 release.
>
> On Mon, Sep 17, 2018 at 10:38 AM Marcelo Vanzin
>  wrote:
>
>> You can log in to https://repository.apache.org and see what's wrong.
>> Just find that staging repo and look at the messages. In your case it
>> seems related to your signature.
>>
>>> >> failureMessage: No public key: Key with id: () was not able to be
>> located on http://gpg-keyserver.de/. Upload your public key and try
>> the operation again.
>> On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan  wrote:
>> >
>> > I confirmed that
>> https://repository.apache.org/content/repositories/orgapachespark-1285
>> is not accessible. I did it via ./dev/create-release/do-release-docker.sh
>> -d /my/work/dir -s publish , not sure what's going wrong. I didn't see any
>> error message during it.
>> >
>> > Any insights are appreciated! So that I can fix it in the next RC.
>> Thanks!
>> >
>> > On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:
>> >>
>> >> I think one build is enough, but haven't thought it through. The
>> >> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>> >> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>> >> Really, whatever's the easy thing to do.
>> >> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
>> wrote:
>> >> >
>> >> > Ah I missed the Scala 2.12 build. Do you mean we should publish a
>> Scala 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with
>> hadoop 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for
>> Scala 2.12?
>> >> >
>> >> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen 
>> wrote:
>> >> >>
>> >> >> A few preliminary notes:
>> >> >>
>> >> >> Wenchen for some weird reason when I hit your key in gpg --import,
>> it
>> >> >> asks for a passphrase. When I skip it, it's fine, gpg can still
>> verify
>> >> >> the signature. No issue there really.
>> >> >>
>> >> >> The staging repo gives a 404:
>> >> >>
>> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>> >> >> [id=orgapachespark-1285] exists but is not exposed.
>> >> >>
>> >> >> The (revamped) licenses are OK, though there are some minor glitches
>> >> >> in the final release tarballs (my fault) : there's an extra
>> directory,
>> >> >> and the source release has both binary and source licenses. I'll fix
>> >> >> that. Not strictly necessary to reject the release over those.
>> >> >>
>> >> >> Last, when I check the staging repo I'll get my answer, but, were
>> you
>> >> >> able to build 2.12 artifacts as well?
>> >> >>
>> >> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
>> wrote:
>> >> >> >
>> >> >> > Please vote on releasing the following candidate as Apache Spark
>> version 2.4.0.
>> >> >> >
>> >> >> > The vote is open until September 20 PST and passes if a majority
>> +1 PMC votes are cast, with
>> >> >> > a minimum of 3 +1 votes.
>> >> >> >
>> >> >> > [ ] +1 Release this package as Apache Spark 2.4.0
>> >> >> > [ ] -1 Do not release this package because ...
>> >> >> >
>> >> >> > To learn more about Apache Spark, please see
>> http://spark.apache.org/
>> >> >> >
>> >> >> > The tag to be voted on is v2.4.0-rc1 (commit
>> 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
>> >> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
>> >> >> >
>> >> >> > The release files, including signatures, digests, etc. can be
>> found at:
>> >> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
>> >> >> >
>> >> >> > Signatures used for Spark RCs can be found in this file:
>> >> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >> >> >
>> >> >> > The staging repository for this release can be found at:
>> >> >> >
>> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> >> >
>> >> >> > The documentation corresponding to this release can be found at:
>> >> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>> >> >> >
>> >> >> > The list of bug fixes going into 2.4.0 can be found at the
>> following URL:
>> >> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>> >> >> >
>> >> >> > FAQ
>> >> >> >
>> >> >> > =
>> >> >> > How can I help test this release?
>> >> >> > =
>> >> >> >
>> >> >> > If you are a Spark user, you can help us test this release by
>> taking
>> >> >> > an existing Spark workload and running on this release candidate,
>> then
>> >> >> > reporting any regressions.
>> >> >> >
>> >> >> > If you're working in PySpark you can set up a virtual env and
>> install
>> >> >> 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Erik Erlandson
I have no binding vote, but I second Stavros’ recommendation for SPARK-23200.

Per the parallel threads on Py2 support, I would also like to propose
deprecating Py2 starting with this 2.4 release.

On Mon, Sep 17, 2018 at 10:38 AM Marcelo Vanzin 
wrote:

> You can log in to https://repository.apache.org and see what's wrong.
> Just find that staging repo and look at the messages. In your case it
> seems related to your signature.
>
> failureMessage: No public key: Key with id: () was not able to be
> located on http://gpg-keyserver.de/. Upload your public key and try
> the operation again.
> On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan  wrote:
> >
> > I confirmed that
> https://repository.apache.org/content/repositories/orgapachespark-1285 is
> not accessible. I did it via ./dev/create-release/do-release-docker.sh -d
> /my/work/dir -s publish , not sure what's going wrong. I didn't see any
> error message during it.
> >
> > Any insights are appreciated! So that I can fix it in the next RC.
> Thanks!
> >
> > On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:
> >>
> >> I think one build is enough, but haven't thought it through. The
> >> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
> >> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
> >> Really, whatever's the easy thing to do.
> >> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
> wrote:
> >> >
> >> > Ah I missed the Scala 2.12 build. Do you mean we should publish a
> Scala 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with
> hadoop 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for
> Scala 2.12?
> >> >
> >> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
> >> >>
> >> >> A few preliminary notes:
> >> >>
> >> >> Wenchen for some weird reason when I hit your key in gpg --import, it
> >> >> asks for a passphrase. When I skip it, it's fine, gpg can still
> verify
> >> >> the signature. No issue there really.
> >> >>
> >> >> The staging repo gives a 404:
> >> >>
> https://repository.apache.org/content/repositories/orgapachespark-1285/
> >> >> 404 - Repository "orgapachespark-1285 (staging: open)"
> >> >> [id=orgapachespark-1285] exists but is not exposed.
> >> >>
> >> >> The (revamped) licenses are OK, though there are some minor glitches
> >> >> in the final release tarballs (my fault) : there's an extra
> directory,
> >> >> and the source release has both binary and source licenses. I'll fix
> >> >> that. Not strictly necessary to reject the release over those.
> >> >>
> >> >> Last, when I check the staging repo I'll get my answer, but, were you
> >> >> able to build 2.12 artifacts as well?
> >> >>
> >> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
> wrote:
> >> >> >
> >> >> > Please vote on releasing the following candidate as Apache Spark
> version 2.4.0.
> >> >> >
> >> >> > The vote is open until September 20 PST and passes if a majority
> +1 PMC votes are cast, with
> >> >> > a minimum of 3 +1 votes.
> >> >> >
> >> >> > [ ] +1 Release this package as Apache Spark 2.4.0
> >> >> > [ ] -1 Do not release this package because ...
> >> >> >
> >> >> > To learn more about Apache Spark, please see
> http://spark.apache.org/
> >> >> >
> >> >> > The tag to be voted on is v2.4.0-rc1 (commit
> 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
> >> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
> >> >> >
> >> >> > The release files, including signatures, digests, etc. can be
> found at:
> >> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
> >> >> >
> >> >> > Signatures used for Spark RCs can be found in this file:
> >> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> >> >
> >> >> > The staging repository for this release can be found at:
> >> >> >
> https://repository.apache.org/content/repositories/orgapachespark-1285/
> >> >> >
> >> >> > The documentation corresponding to this release can be found at:
> >> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
> >> >> >
> >> >> > The list of bug fixes going into 2.4.0 can be found at the
> following URL:
> >> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
> >> >> >
> >> >> > FAQ
> >> >> >
> >> >> > =
> >> >> > How can I help test this release?
> >> >> > =
> >> >> >
> >> >> > If you are a Spark user, you can help us test this release by
> taking
> >> >> > an existing Spark workload and running on this release candidate,
> then
> >> >> > reporting any regressions.
> >> >> >
> >> >> > If you're working in PySpark you can set up a virtual env and
> install
> >> >> > the current RC and see if anything important breaks; in
> Java/Scala
> >> >> > you can add the staging repository to your project's resolvers and
> test
> >> >> > with the RC (make sure to clean up the artifact cache before/after
> so
> >> >> > you don't end up building with an out-of-date RC going forward).
> >> >> >
> >> >> > ===

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Marcelo Vanzin
You can log in to https://repository.apache.org and see what's wrong.
Just find that staging repo and look at the messages. In your case it
seems related to your signature.

failureMessage: No public key: Key with id: () was not able to be
located on http://gpg-keyserver.de/. Upload your public key and try
the operation again.
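
The fix that failure message points at can be sketched as below. The key ID is a placeholder (the real ID is elided in the message above), and the gpg invocations are echoed rather than executed so the sketch stays side-effect-free; drop the `echo` to actually run them.

```shell
# Placeholder key ID -- substitute the release manager's real signing key.
KEY_ID="0xDEADBEEF"
KEYSERVER="hkp://gpg-keyserver.de"

# Publish the public key so the Nexus staging rule can locate it,
# then fetch it back to confirm the upload took effect.
echo "gpg --keyserver $KEYSERVER --send-keys $KEY_ID"
echo "gpg --keyserver $KEYSERVER --recv-keys $KEY_ID"
```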
On Sun, Sep 16, 2018 at 10:00 PM Wenchen Fan  wrote:
>
> I confirmed that 
> https://repository.apache.org/content/repositories/orgapachespark-1285 is not 
> accessible. I did it via ./dev/create-release/do-release-docker.sh -d 
> /my/work/dir -s publish , not sure what's going wrong. I didn't see any error 
> message during it.
>
> Any insights are appreciated! So that I can fix it in the next RC. Thanks!
>
> On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:
>>
>> I think one build is enough, but haven't thought it through. The
>> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>> Really, whatever's the easy thing to do.
>> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan  wrote:
>> >
>> > Ah I missed the Scala 2.12 build. Do you mean we should publish a Scala 
>> > 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with hadoop
>> > 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for Scala 
>> > 2.12?
>> >
>> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
>> >>
>> >> A few preliminary notes:
>> >>
>> >> Wenchen for some weird reason when I hit your key in gpg --import, it
>> >> asks for a passphrase. When I skip it, it's fine, gpg can still verify
>> >> the signature. No issue there really.
>> >>
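
The signature check described above can be sketched as follows. The tarball name is an assumption about what the `v2.4.0-rc1-bin/` directory holds, and the commands are echoed so the sketch is side-effect-free; drop the `echo` to run them against the dist area.

```shell
# Assumed artifact name -- check the v2.4.0-rc1-bin/ listing for real names.
DIST="https://dist.apache.org/repos/dist/dev/spark"
TARBALL="spark-2.4.0-bin-hadoop2.7.tgz"

# Fetch the KEYS file, the artifact, and its detached signature, then verify.
echo "curl -sLO $DIST/KEYS"
echo "curl -sLO $DIST/v2.4.0-rc1-bin/$TARBALL"
echo "curl -sLO $DIST/v2.4.0-rc1-bin/$TARBALL.asc"
echo "gpg --import KEYS"
echo "gpg --verify $TARBALL.asc $TARBALL"
```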
>> >> The staging repo gives a 404:
>> >> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>> >> [id=orgapachespark-1285] exists but is not exposed.
>> >>
>> >> The (revamped) licenses are OK, though there are some minor glitches
>> >> in the final release tarballs (my fault) : there's an extra directory,
>> >> and the source release has both binary and source licenses. I'll fix
>> >> that. Not strictly necessary to reject the release over those.
>> >>
>> >> Last, when I check the staging repo I'll get my answer, but, were you
>> >> able to build 2.12 artifacts as well?
>> >>
>> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan  wrote:
>> >> >
>> >> > Please vote on releasing the following candidate as Apache Spark 
>> >> > version 2.4.0.
>> >> >
>> >> > The vote is open until September 20 PST and passes if a majority +1 PMC 
>> >> > votes are cast, with
>> >> > a minimum of 3 +1 votes.
>> >> >
>> >> > [ ] +1 Release this package as Apache Spark 2.4.0
>> >> > [ ] -1 Do not release this package because ...
>> >> >
>> >> > To learn more about Apache Spark, please see http://spark.apache.org/
>> >> >
>> >> > The tag to be voted on is v2.4.0-rc1 (commit 
>> >> > 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
>> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
>> >> >
>> >> > The release files, including signatures, digests, etc. can be found at:
>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
>> >> >
>> >> > Signatures used for Spark RCs can be found in this file:
>> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >> >
>> >> > The staging repository for this release can be found at:
>> >> > https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> >
>> >> > The documentation corresponding to this release can be found at:
>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>> >> >
>> >> > The list of bug fixes going into 2.4.0 can be found at the following 
>> >> > URL:
>> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>> >> >
>> >> > FAQ
>> >> >
>> >> > =
>> >> > How can I help test this release?
>> >> > =
>> >> >
>> >> > If you are a Spark user, you can help us test this release by taking
>> >> > an existing Spark workload and running on this release candidate, then
>> >> > reporting any regressions.
>> >> >
>> >> > If you're working in PySpark you can set up a virtual env and install
>> >> > the current RC and see if anything important breaks; in Java/Scala
>> >> > you can add the staging repository to your project's resolvers and test
>> >> > with the RC (make sure to clean up the artifact cache before/after so
>> >> > you don't end up building with an out-of-date RC going forward).
>> >> >
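
The PySpark check above can be sketched as follows. The pyspark artifact name is an assumption about the RC1 dist area, and the commands are echoed so the sketch is side-effect-free; drop the `echo` to run them.

```shell
# Assumed artifact name -- confirm against the v2.4.0-rc1-bin/ listing.
RC_BIN="https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin"
PYSPARK_TGZ="pyspark-2.4.0.tar.gz"

# Create an isolated env, install the RC, and smoke-test the import.
echo "python -m venv rc-test && . rc-test/bin/activate"
echo "pip install $RC_BIN/$PYSPARK_TGZ"
echo "python -c 'import pyspark; print(pyspark.__version__)'"
```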
>> >> > ===
>> >> > What should happen to JIRA tickets still targeting 2.4.0?
>> >> > ===
>> >> >
>> >> > The current list of open tickets targeted at 2.4.0 can be found at:
>> >> > https://issues.apache.org/jira/projects/SPARK and search for "Target 
>> >> > Version/s" = 2.4.0
>> >> >
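
The triage search above can also be expressed as a JQL query against JIRA's REST API. The status filter is an assumption about which tickets count as open, and the request is echoed rather than sent.

```shell
# JQL form of the "Target Version/s" = 2.4.0 search; the statusCategory
# filter is an assumption about which tickets count as open.
JQL='project = SPARK AND "Target Version/s" = 2.4.0 AND statusCategory != Done'
CURL_CMD="curl -G https://issues.apache.org/jira/rest/api/2/search --data-urlencode jql='$JQL'"
echo "$CURL_CMD"
```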
>> >> > Committers should look at those and triage. Extremely important bug
>> >> > fixes, 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Stavros Kontopoulos
I am just following Wenchen Fan's comment (of course it is not merged yet, but
I wanted to bring this to the attention of the dev list):

"We should definitely merge it to branch 2.4, but I won't block the release
since it's not that critical and it's still in progress. After it's merged,
feel free to vote -1 on the RC voting email to include this change, if
necessary."


So if the vote is not valid, we can ignore it. But IMHO this should have
been in before 2.4 was cut, anyway.


Stavros


On Mon, Sep 17, 2018 at 4:53 PM, Nicholas Chammas <
nicholas.cham...@gmail.com> wrote:

> I believe -1 votes are merited only for correctness bugs and regressions
> since the previous release.
>
> Does SPARK-23200 count as either?
>
> On Mon, Sep 17, 2018 at 9:40 AM, Stavros Kontopoulos  wrote:
>
>> -1
>>
>> I would like to see: https://github.com/apache/spark/pull/22392 in, as
>> discussed here: https://issues.apache.org/jira/browse/SPARK-23200. It is
>> important IMHO for streaming on K8s.
>> I just started testing it btw.
>>
>> Also, Scala 2.12.7 (https://contributors.scala-lang.org/t/2-12-7-release/2301,
>> https://github.com/scala/scala/milestone/73) is coming out (it will be
>> staged this week); do we want to build the beta 2.12 build against it?
>>
>> Stavros
>>
>> On Mon, Sep 17, 2018 at 8:00 AM, Wenchen Fan  wrote:
>>
>>> I confirmed that
>>> https://repository.apache.org/content/repositories/orgapachespark-1285
>>> is not accessible. I did it via
>>> ./dev/create-release/do-release-docker.sh -d /my/work/dir -s publish ,
>>> not sure what's going wrong. I didn't see any error message during it.
>>>
>>> Any insights are appreciated! So that I can fix it in the next RC.
>>> Thanks!
>>>
>>> On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:
>>>
 I think one build is enough, but haven't thought it through. The
 Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
 best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
 Really, whatever's the easy thing to do.
 On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
 wrote:
 >
 > Ah I missed the Scala 2.12 build. Do you mean we should publish a
 Scala 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with
 hadoop 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for
 Scala 2.12?
 >
 > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
 >>
 >> A few preliminary notes:
 >>
 >> Wenchen for some weird reason when I hit your key in gpg --import, it
 >> asks for a passphrase. When I skip it, it's fine, gpg can still
 verify
 >> the signature. No issue there really.
 >>
 >> The staging repo gives a 404:
 >> https://repository.apache.org/content/repositories/orgapachespark-1285/
 >> 404 - Repository "orgapachespark-1285 (staging: open)"
 >> [id=orgapachespark-1285] exists but is not exposed.
 >>
 >> The (revamped) licenses are OK, though there are some minor glitches
 >> in the final release tarballs (my fault) : there's an extra
 directory,
 >> and the source release has both binary and source licenses. I'll fix
 >> that. Not strictly necessary to reject the release over those.
 >>
 >> Last, when I check the staging repo I'll get my answer, but, were you
 >> able to build 2.12 artifacts as well?
 >>
 >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
 wrote:
 >> >
 >> > Please vote on releasing the following candidate as Apache Spark
 version 2.4.0.
 >> >
 >> > The vote is open until September 20 PST and passes if a majority
 +1 PMC votes are cast, with
 >> > a minimum of 3 +1 votes.
 >> >
 >> > [ ] +1 Release this package as Apache Spark 2.4.0
 >> > [ ] -1 Do not release this package because ...
 >> >
 >> > To learn more about Apache Spark, please see
 http://spark.apache.org/
 >> >
 >> > The tag to be voted on is v2.4.0-rc1 (commit
 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
 >> > https://github.com/apache/spark/tree/v2.4.0-rc1
 >> >
 >> > The release files, including signatures, digests, etc. can be
 found at:
 >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
 >> >
 >> > Signatures used for Spark RCs can be found in this file:
 >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
 >> >
 >> > The staging repository for this release can be found at:
 >> > https://repository.apache.org/content/repositories/orgapachespark-1285/
 >> >
 >> > The documentation corresponding to this release can be found at:
 >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
 >> >
 >> > The list of bug fixes going into 2.4.0 can be found at the
 following URL:
 >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
 >> >
 >> > FAQ
 >> >
 >> > =
 >> > How can I help test this 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Nicholas Chammas
I believe -1 votes are merited only for correctness bugs and regressions
since the previous release.

Does SPARK-23200 count as either?

On Mon, Sep 17, 2018 at 9:40 AM, Stavros Kontopoulos <
stavros.kontopou...@lightbend.com> wrote:

> -1
>
> I would like to see: https://github.com/apache/spark/pull/22392 in, as
> discussed here: https://issues.apache.org/jira/browse/SPARK-23200. It is
> important IMHO for streaming on K8s.
> I just started testing it btw.
>
> Also, Scala 2.12.7 (https://contributors.scala-lang.org/t/2-12-7-release/2301,
> https://github.com/scala/scala/milestone/73) is coming out (it will be staged
> this week); do we want to build the beta 2.12 build against it?
>
> Stavros
>
> On Mon, Sep 17, 2018 at 8:00 AM, Wenchen Fan  wrote:
>
>> I confirmed that
>> https://repository.apache.org/content/repositories/orgapachespark-1285
>> is not accessible. I did it via ./dev/create-release/do-release-docker.sh
>> -d /my/work/dir -s publish , not sure what's going wrong. I didn't see
>> any error message during it.
>>
>> Any insights are appreciated! So that I can fix it in the next RC. Thanks!
>>
>> On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:
>>
>>> I think one build is enough, but haven't thought it through. The
>>> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>>> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>>> Really, whatever's the easy thing to do.
>>> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan 
>>> wrote:
>>> >
>>> > Ah I missed the Scala 2.12 build. Do you mean we should publish a
>>> Scala 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with
>>> hadoop 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for
>>> Scala 2.12?
>>> >
>>> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
>>> >>
>>> >> A few preliminary notes:
>>> >>
>>> >> Wenchen for some weird reason when I hit your key in gpg --import, it
>>> >> asks for a passphrase. When I skip it, it's fine, gpg can still verify
>>> >> the signature. No issue there really.
>>> >>
>>> >> The staging repo gives a 404:
>>> >>
>>> https://repository.apache.org/content/repositories/orgapachespark-1285/
>>> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>>> >> [id=orgapachespark-1285] exists but is not exposed.
>>> >>
>>> >> The (revamped) licenses are OK, though there are some minor glitches
>>> >> in the final release tarballs (my fault) : there's an extra directory,
>>> >> and the source release has both binary and source licenses. I'll fix
>>> >> that. Not strictly necessary to reject the release over those.
>>> >>
>>> >> Last, when I check the staging repo I'll get my answer, but, were you
>>> >> able to build 2.12 artifacts as well?
>>> >>
>>> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
>>> wrote:
>>> >> >
>>> >> > Please vote on releasing the following candidate as Apache Spark
>>> version 2.4.0.
>>> >> >
>>> >> > The vote is open until September 20 PST and passes if a majority +1
>>> PMC votes are cast, with
>>> >> > a minimum of 3 +1 votes.
>>> >> >
>>> >> > [ ] +1 Release this package as Apache Spark 2.4.0
>>> >> > [ ] -1 Do not release this package because ...
>>> >> >
>>> >> > To learn more about Apache Spark, please see
>>> http://spark.apache.org/
>>> >> >
>>> >> > The tag to be voted on is v2.4.0-rc1 (commit
>>> 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
>>> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
>>> >> >
>>> >> > The release files, including signatures, digests, etc. can be found
>>> at:
>>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
>>> >> >
>>> >> > Signatures used for Spark RCs can be found in this file:
>>> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>>> >> >
>>> >> > The staging repository for this release can be found at:
>>> >> >
>>> https://repository.apache.org/content/repositories/orgapachespark-1285/
>>> >> >
>>> >> > The documentation corresponding to this release can be found at:
>>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>>> >> >
>>> >> > The list of bug fixes going into 2.4.0 can be found at the
>>> following URL:
>>> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>>> >> >
>>> >> > FAQ
>>> >> >
>>> >> > =
>>> >> > How can I help test this release?
>>> >> > =
>>> >> >
>>> >> > If you are a Spark user, you can help us test this release by taking
>>> >> > an existing Spark workload and running on this release candidate,
>>> then
>>> >> > reporting any regressions.
>>> >> >
>>> >> > If you're working in PySpark you can set up a virtual env and
>>> install
>>> >> > the current RC and see if anything important breaks; in
>>> Java/Scala
>>> >> > you can add the staging repository to your project's resolvers and
>>> test
>>> >> > with the RC (make sure to clean up the artifact cache before/after
>>> so
>>> >> > you don't end up building with an out-of-date RC going forward).
>>> >> >
>>> >> > 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-17 Thread Stavros Kontopoulos
-1

I would like to see: https://github.com/apache/spark/pull/22392 in, as
discussed here: https://issues.apache.org/jira/browse/SPARK-23200. It is
important IMHO for streaming on K8s.
I just started testing it btw.

Also, Scala 2.12.7 (https://contributors.scala-lang.org/t/2-12-7-release/2301,
https://github.com/scala/scala/milestone/73) is coming out (it will be staged
this week); do we want to build the beta 2.12 build against it?

Stavros

On Mon, Sep 17, 2018 at 8:00 AM, Wenchen Fan  wrote:

> I confirmed that
> https://repository.apache.org/content/repositories/orgapachespark-1285
> is not accessible. I did it via
> ./dev/create-release/do-release-docker.sh -d /my/work/dir -s publish ,
> not sure what's going wrong. I didn't see any error message during it.
>
> Any insights are appreciated! So that I can fix it in the next RC. Thanks!
>
> On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:
>
>> I think one build is enough, but haven't thought it through. The
>> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
>> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
>> Really, whatever's the easy thing to do.
>> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan  wrote:
>> >
>> > Ah I missed the Scala 2.12 build. Do you mean we should publish a Scala
>> 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with hadoop
>> 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for Scala
>> 2.12?
>> >
>> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
>> >>
>> >> A few preliminary notes:
>> >>
>> >> Wenchen for some weird reason when I hit your key in gpg --import, it
>> >> asks for a passphrase. When I skip it, it's fine, gpg can still verify
>> >> the signature. No issue there really.
>> >>
>> >> The staging repo gives a 404:
>> >> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> 404 - Repository "orgapachespark-1285 (staging: open)"
>> >> [id=orgapachespark-1285] exists but is not exposed.
>> >>
>> >> The (revamped) licenses are OK, though there are some minor glitches
>> >> in the final release tarballs (my fault) : there's an extra directory,
>> >> and the source release has both binary and source licenses. I'll fix
>> >> that. Not strictly necessary to reject the release over those.
>> >>
>> >> Last, when I check the staging repo I'll get my answer, but, were you
>> >> able to build 2.12 artifacts as well?
>> >>
>> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
>> wrote:
>> >> >
>> >> > Please vote on releasing the following candidate as Apache Spark
>> version 2.4.0.
>> >> >
>> >> > The vote is open until September 20 PST and passes if a majority +1
>> PMC votes are cast, with
>> >> > a minimum of 3 +1 votes.
>> >> >
>> >> > [ ] +1 Release this package as Apache Spark 2.4.0
>> >> > [ ] -1 Do not release this package because ...
>> >> >
>> >> > To learn more about Apache Spark, please see
>> http://spark.apache.org/
>> >> >
>> >> > The tag to be voted on is v2.4.0-rc1 (commit
>> 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
>> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
>> >> >
>> >> > The release files, including signatures, digests, etc. can be found
>> at:
>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
>> >> >
>> >> > Signatures used for Spark RCs can be found in this file:
>> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >> >
>> >> > The staging repository for this release can be found at:
>> >> > https://repository.apache.org/content/repositories/orgapachespark-1285/
>> >> >
>> >> > The documentation corresponding to this release can be found at:
>> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
>> >> >
>> >> > The list of bug fixes going into 2.4.0 can be found at the following
>> URL:
>> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
>> >> >
>> >> > FAQ
>> >> >
>> >> > =
>> >> > How can I help test this release?
>> >> > =
>> >> >
>> >> > If you are a Spark user, you can help us test this release by taking
>> >> > an existing Spark workload and running on this release candidate,
>> then
>> >> > reporting any regressions.
>> >> >
>> >> > If you're working in PySpark you can set up a virtual env and install
>> >> > the current RC and see if anything important breaks; in
>> Java/Scala
>> >> > you can add the staging repository to your project's resolvers and
>> test
>> >> > with the RC (make sure to clean up the artifact cache before/after so
>> >> > you don't end up building with an out-of-date RC going forward).
>> >> >
>> >> > ===
>> >> > What should happen to JIRA tickets still targeting 2.4.0?
>> >> > ===
>> >> >
>> >> > The current list of open tickets targeted at 2.4.0 can be found at:
>> >> > https://issues.apache.org/jira/projects/SPARK and search for
>> "Target Version/s" = 2.4.0
>> >> >
>> >> > 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-16 Thread Wenchen Fan
I confirmed that
https://repository.apache.org/content/repositories/orgapachespark-1285 is
not accessible. I published it via ./dev/create-release/do-release-docker.sh
-d /my/work/dir -s publish, but I'm not sure what's going wrong; I didn't
see any error message during the process.

Any insights are appreciated, so that I can fix it in the next RC. Thanks!
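
For what it's worth, a quick probe of the repo's HTTP status can be sketched as below. The request is echoed so the sketch is side-effect-free; note that, to an anonymous client, a missing repo and an existing-but-unexposed repo both answer 404, so the failing staging rule is only visible after logging in to repository.apache.org.

```shell
# The staging repo that is returning 404 in this thread.
REPO_URL="https://repository.apache.org/content/repositories/orgapachespark-1285/"

# Echoed so the sketch is side-effect-free; drop the echo to send the request.
echo "curl -s -o /dev/null -w '%{http_code}\n' $REPO_URL"
```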

On Mon, Sep 17, 2018 at 11:31 AM Sean Owen  wrote:

> I think one build is enough, but haven't thought it through. The
> Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
> best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
> Really, whatever's the easy thing to do.
> On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan  wrote:
> >
> > Ah I missed the Scala 2.12 build. Do you mean we should publish a Scala
> 2.12 build this time? Currently for Scala 2.11 we have 3 builds: with hadoop
> 2.7, with hadoop 2.6, without hadoop. Shall we do the same thing for Scala
> 2.12?
> >
> > On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
> >>
> >> A few preliminary notes:
> >>
> >> Wenchen for some weird reason when I hit your key in gpg --import, it
> >> asks for a passphrase. When I skip it, it's fine, gpg can still verify
> >> the signature. No issue there really.
> >>
> >> The staging repo gives a 404:
> >> https://repository.apache.org/content/repositories/orgapachespark-1285/
> >> 404 - Repository "orgapachespark-1285 (staging: open)"
> >> [id=orgapachespark-1285] exists but is not exposed.
> >>
> >> The (revamped) licenses are OK, though there are some minor glitches
> >> in the final release tarballs (my fault) : there's an extra directory,
> >> and the source release has both binary and source licenses. I'll fix
> >> that. Not strictly necessary to reject the release over those.
> >>
> >> Last, when I check the staging repo I'll get my answer, but, were you
> >> able to build 2.12 artifacts as well?
> >>
> >> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan 
> wrote:
> >> >
> >> > Please vote on releasing the following candidate as Apache Spark
> version 2.4.0.
> >> >
> >> > The vote is open until September 20 PST and passes if a majority +1
> PMC votes are cast, with
> >> > a minimum of 3 +1 votes.
> >> >
> >> > [ ] +1 Release this package as Apache Spark 2.4.0
> >> > [ ] -1 Do not release this package because ...
> >> >
> >> > To learn more about Apache Spark, please see http://spark.apache.org/
> >> >
> >> > The tag to be voted on is v2.4.0-rc1 (commit
> 1220ab8a0738b5f67dc522df5e3e77ffc83d207a):
> >> > https://github.com/apache/spark/tree/v2.4.0-rc1
> >> >
> >> > The release files, including signatures, digests, etc. can be found
> at:
> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-bin/
> >> >
> >> > Signatures used for Spark RCs can be found in this file:
> >> > https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> >
> >> > The staging repository for this release can be found at:
> >> >
> https://repository.apache.org/content/repositories/orgapachespark-1285/
> >> >
> >> > The documentation corresponding to this release can be found at:
> >> > https://dist.apache.org/repos/dist/dev/spark/v2.4.0-rc1-docs/
> >> >
> >> > The list of bug fixes going into 2.4.0 can be found at the following
> URL:
> >> > https://issues.apache.org/jira/projects/SPARK/versions/2.4.0
> >> >
> >> > FAQ
> >> >
> >> > =
> >> > How can I help test this release?
> >> > =
> >> >
> >> > If you are a Spark user, you can help us test this release by taking
> >> > an existing Spark workload and running it on this release candidate, then
> >> > reporting any regressions.
> >> >
> >> > If you're working in PySpark you can set up a virtual env and install
> >> > the current RC and see if anything important breaks; in Java/Scala
> >> > you can add the staging repository to your project's resolvers and test
> >> > with the RC (make sure to clean up the artifact cache before/after so
> >> > you don't end up building with an out-of-date RC going forward).
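
For the Java/Scala path above, a minimal sketch of what adding the staging repository to an sbt build could look like. The repository URL is the one posted in this thread; the `spark-sql` dependency is only an illustration, not part of the original instructions:

```scala
// build.sbt sketch: resolve the 2.4.0 RC artifacts from the Apache staging repo.
// The orgapachespark-1285 id is the one from this vote thread; a new RC would
// get a new staging repository id.
resolvers += "Apache Spark RC staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1285/"

// Example dependency; substitute whichever Spark modules your project uses.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
```

Remember to clear (or use a throwaway) Ivy/Maven cache afterwards so the RC artifacts don't linger once the final release is out.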
> >> >
> >> > ===
> >> > What should happen to JIRA tickets still targeting 2.4.0?
> >> > ===
> >> >
> >> > The current list of open tickets targeted at 2.4.0 can be found at:
> >> > https://issues.apache.org/jira/projects/SPARK and search for "Target
> Version/s" = 2.4.0
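
As a side note, the ticket search described above can be expressed as a JQL query URL; a small sketch follows (the status filter is an assumption added to limit results to open tickets, not part of the original search):

```python
from urllib.parse import urlencode

# Build the JIRA search URL for tickets still targeting 2.4.0.
jql = ('project = SPARK AND "Target Version/s" = 2.4.0 '
       'AND status in (Open, Reopened, "In Progress")')
url = "https://issues.apache.org/jira/issues/?" + urlencode({"jql": jql})
print(url)
```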
> >> >
> >> > Committers should look at those and triage. Extremely important bug
> >> > fixes, documentation, and API tweaks that impact compatibility should
> >> > be worked on immediately. Everything else please retarget to an
> >> > appropriate release.
> >> >
> >> > ==
> >> > But my bug isn't fixed?
> >> > ==
> >> >
> >> > In order to make timely releases, we will typically not hold the
> >> > release unless the bug in question is a regression from the previous
> >> > release. That being said, if there is something which is a regression
> >> > that has not been correctly targeted please ping 

Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-16 Thread Sean Owen
I think one build is enough, but haven't thought it through. The
Hadoop 2.6/2.7 builds are already nearly redundant. 2.12 is probably
best advertised as a 'beta'. So maybe publish a no-hadoop build of it?
Really, whatever's the easy thing to do.
On Sun, Sep 16, 2018 at 10:28 PM Wenchen Fan  wrote:
>
> Ah I missed the Scala 2.12 build. Do you mean we should publish a Scala 2.12 
> build this time? Current for Scala 2.11 we have 3 builds: with hadoop 2.7, 
> with hadoop 2.6, without hadoop. Shall we do the same thing for Scala 2.12?
>
> On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:
>>
>> A few preliminary notes:
>>
>> Wenchen for some weird reason when I hit your key in gpg --import, it
>> asks for a passphrase. When I skip it, it's fine, gpg can still verify
>> the signature. No issue there really.
>>
>> The staging repo gives a 404:
>> https://repository.apache.org/content/repositories/orgapachespark-1285/
>> 404 - Repository "orgapachespark-1285 (staging: open)"
>> [id=orgapachespark-1285] exists but is not exposed.
>>
>> The (revamped) licenses are OK, though there are some minor glitches
>> in the final release tarballs (my fault) : there's an extra directory,
>> and the source release has both binary and source licenses. I'll fix
>> that. Not strictly necessary to reject the release over those.
>>
>> Last, when I check the staging repo I'll get my answer, but, were you
>> able to build 2.12 artifacts as well?
>>
>> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan  wrote:
>> >
>> > Please vote on releasing the following candidate as Apache Spark version 
>> > 2.4.0.

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-16 Thread Wenchen Fan
Ah, I missed the Scala 2.12 build. Do you mean we should publish a Scala
2.12 build this time? Currently, for Scala 2.11 we have 3 builds: with hadoop
2.7, with hadoop 2.6, and without hadoop. Shall we do the same thing for Scala
2.12?

On Mon, Sep 17, 2018 at 11:14 AM Sean Owen  wrote:

> A few preliminary notes:
>
> Wenchen for some weird reason when I hit your key in gpg --import, it
> asks for a passphrase. When I skip it, it's fine, gpg can still verify
> the signature. No issue there really.
>
> The staging repo gives a 404:
> https://repository.apache.org/content/repositories/orgapachespark-1285/
> 404 - Repository "orgapachespark-1285 (staging: open)"
> [id=orgapachespark-1285] exists but is not exposed.
>
> The (revamped) licenses are OK, though there are some minor glitches
> in the final release tarballs (my fault) : there's an extra directory,
> and the source release has both binary and source licenses. I'll fix
> that. Not strictly necessary to reject the release over those.
>
> Last, when I check the staging repo I'll get my answer, but, were you
> able to build 2.12 artifacts as well?
>
> On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan  wrote:
> >
> > Please vote on releasing the following candidate as Apache Spark version
> 2.4.0.
>


Re: [VOTE] SPARK 2.4.0 (RC1)

2018-09-16 Thread Sean Owen
A few preliminary notes:

Wenchen, for some weird reason, when I import your key with gpg --import, it
asks for a passphrase. When I skip it, it's fine; gpg can still verify
the signature. No real issue there.

The staging repo gives a 404:
https://repository.apache.org/content/repositories/orgapachespark-1285/
404 - Repository "orgapachespark-1285 (staging: open)"
[id=orgapachespark-1285] exists but is not exposed.

The (revamped) licenses are OK, though there are some minor glitches
in the final release tarballs (my fault): there's an extra directory,
and the source release has both binary and source licenses. I'll fix
that. It's not strictly necessary to reject the release over those.

Last, when I check the staging repo I'll get my answer, but, were you
able to build 2.12 artifacts as well?

On Sun, Sep 16, 2018 at 9:48 PM Wenchen Fan  wrote:
>
> Please vote on releasing the following candidate as Apache Spark version 
> 2.4.0.

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org