Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-21 Thread Marcelo Vanzin
While you're at it, one thing that needs to be done is to create a 2.1.3
version in JIRA. I'm not sure if you have enough permissions to do that.

Fixes after an RC should use the new version, and if you create a new
RC, you'll need to go and backdate the patches that went into the new
RC.


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-18 Thread Holden Karau
As per the conversation happening around the signing of releases, I'm
cancelling this vote. If folks agree with the temporary solution there, I'll
try to get a new RC out shortly, but if we end up blocking on migrating the
Jenkins jobs it could take a bit longer.


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-17 Thread yuming wang
Yes, it doesn't work in 2.1.0 or 2.1.1. I created a PR for this:
https://github.com/apache/spark/pull/19259.



Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-17 Thread Sean Owen
So, didn't work in 2.1.0 or 2.1.1? If it's not a regression and not
critical, it shouldn't block a release. It seems like this can only affect
Docker and/or Oracle JDBC? Well, if we need to roll another release anyway,
seems OK.


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-16 Thread Xiao Li
This is a bug introduced in 2.1. It works fine in 2.0.


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-16 Thread Holden Karau
Ok :) Was this working in 2.1.1?


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-16 Thread Xiao Li
Still -1

The tests fail in my local environment. I opened a JIRA:
https://issues.apache.org/jira/browse/SPARK-22041

- SPARK-16625: General data types to be mapped to Oracle *** FAILED ***

  types.apply(9).equals(org.apache.spark.sql.types.DateType) was false
(OracleIntegrationSuite.scala:158)

Xiao
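
The failing assertion checks that entry 9 of the schema read back from Oracle
is `DateType`. As a rough, hypothetical illustration of the kind of
JDBC-to-Catalyst type mapping the suite exercises (the real logic lives in
Spark's Scala Oracle dialect; the names below are illustrative, not Spark's
API):

```python
# Hypothetical sketch of a JDBC-to-Catalyst type mapping; not Spark's actual
# implementation. java.sql.Types codes: DATE = 91, TIMESTAMP = 93.
JDBC_DATE, JDBC_TIMESTAMP = 91, 93

def catalyst_type(jdbc_code: int) -> str:
    """Map a JDBC type code to a Catalyst type name (illustrative only)."""
    mapping = {JDBC_DATE: "DateType", JDBC_TIMESTAMP: "TimestampType"}
    return mapping.get(jdbc_code, "StringType")

# The suite's expectation, restated: an Oracle DATE column should come back
# as DateType; getting TimestampType instead is the sort of mismatch that
# makes `types.apply(9).equals(DateType)` come out false.
assert catalyst_type(JDBC_DATE) == "DateType"
```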


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Ryan Blue
-1 (with my Apache member hat on, non-binding)

I'll continue discussion in the other thread, but I don't think we should
share signing keys.


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Holden Karau
Indeed, it's limited to people with login permissions on the Jenkins host
(and perhaps further limited, I'm not certain). Shane probably knows more
about the ACLs, so I'll ask him in the other thread for specifics.

This is maybe branching a bit from the question of the current RC though,
so I'd suggest we continue this discussion on the thread Sean Owen made.

On Fri, Sep 15, 2017 at 4:04 PM Ryan Blue <rb...@netflix.com> wrote:

> I'm not familiar with the release procedure, can you send a link to this
> Jenkins job? Can anyone run this job, or is it limited to committers?
>
> rb
>
> On Fri, Sep 15, 2017 at 12:28 PM, Holden Karau <hol...@pigscanfly.ca>
> wrote:
>
>> That's a good question. I built the release candidate, but the Jenkins
>> scripts don't take a parameter for configuring who signs them; rather, they
>> always sign the artifacts with Patrick's key. You can see this from previous
>> releases, which were managed by other folks but still signed by Patrick.
>>
>> On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com> wrote:
>>
>>> The signature is valid, but why was the release signed with Patrick
>>> Wendell's private key? Did Patrick build the release candidate?
>>>
>>> rb
>>>
>>> On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com>
>>> wrote:
>>>
>>>> +1 (non-binding)
>>>>
>>>> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <
>>>> felixcheun...@hotmail.com> wrote:
>>>>
>>>>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>>>>
>>>>> _
>>>>> From: Sean Owen <so...@cloudera.com>
>>>>> Sent: Thursday, September 14, 2017 3:12 PM
>>>>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>>>>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>>>>
>>>>>
>>>>>
>>>>> +1
>>>>> Very nice. The sigs and hashes look fine; it builds fine for me on
>>>>> Debian Stretch with Java 8 and the yarn/hive/hadoop-2.7 profiles, and
>>>>> passes tests.
>>>>>
>>>>> Yes, as you say, there are no outstanding issues except for this one,
>>>>> which doesn't look critical, as it's not a regression.
>>>>>
>>>>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>>>>>
>>>>>
>>>>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca>
>>>>> wrote:
>>>>>
>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>> version 2.1.2. The vote is open until Friday September 22nd at 18:00
>>>>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>>>>
>>>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>>>> [ ] -1 Do not release this package because ...
>>>>>>
>>>>>>
>>>>>> To learn more about Apache Spark, please see
>>>>>> https://spark.apache.org/
>>>>>>
>>>>>> The tag to be voted on is v2.1.2-rc1
>>>>>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (
>>>>>> 6f470323a0363656999dd36cb33f528afe627c12)
>>>>>>
>>>>>> List of JIRA tickets resolved in this release can be found with this
>>>>>> filter.
>>>>>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>>>>>
>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>> at:
>>>>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>>>>
>>>>>> Release artifacts are signed with the following key:
>>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>>
>>>>>> The staging repository for this release can be found at:
>>>>>>
>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>>>>
>>>>>> The documentation corresponding to this release can be found at:
>>>>>>
>>>>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>>>>
>>>>>>
>>>>>> *FAQ*
>>>>>>

Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Ryan Blue
I'm not familiar with the release procedure; can you send a link to this
Jenkins job? Can anyone run it, or is it limited to committers?

rb

On Fri, Sep 15, 2017 at 12:28 PM, Holden Karau <hol...@pigscanfly.ca> wrote:

> That's a good question. I built the release candidate; however, the
> Jenkins scripts don't take a parameter for configuring who signs the
> artifacts and always sign them with Patrick's key. You can see this in
> previous releases, which were managed by other folks but still signed by
> Patrick.
>
> On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com> wrote:
>
>> The signature is valid, but why was the release signed with Patrick
>> Wendell's private key? Did Patrick build the release candidate?
>>
>> rb
>>
>> On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com>
>>> wrote:
>>>
>>>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>>>
>>>> _
>>>> From: Sean Owen <so...@cloudera.com>
>>>> Sent: Thursday, September 14, 2017 3:12 PM
>>>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>>>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>>>
>>>>
>>>>
>>>> +1
>>>> Very nice. The sigs and hashes look fine, it builds fine for me on
>>>> Debian Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes
>>>> tests.
>>>>
>>>> Yes as you say, no outstanding issues except for this which doesn't
>>>> look critical, as it's not a regression.
>>>>
>>>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>>>>
>>>>
>>>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca>
>>>> wrote:
>>>>
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 2.1.2. The vote is open until Friday September 22nd at 18:00
>>>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>>
>>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>>
>>>>> The tag to be voted on is v2.1.2-rc1
>>>>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (6f470323a0363656999dd36cb33f528afe627c12)
>>>>>
>>>>> List of JIRA tickets resolved in this release can be found with this
>>>>> filter.
>>>>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>>>
>>>>> Release artifacts are signed with the following key:
>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>>>
>>>>>
>>>>> *FAQ*
>>>>>
>>>>> *How can I help test this release?*
>>>>>
>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>> reporting any regressions.
>>>>>>
>>>>>> If you're working in PySpark you can set up a virtual env, install the
>>>>>> current RC, and see if anything important breaks; in Java/Scala you can
>>>>>> add the staging repository to your project's resolvers and test with the
>>>>>> RC (make sure to clean up the artifact cache before/after so you don't
>>>>>> end up building with an out-of-date RC going forward).
>>>>>
>>>

Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Felix Cheung
Yes ;)


From: Xiao Li <gatorsm...@gmail.com>
Sent: Friday, September 15, 2017 2:22:03 PM
To: Holden Karau
Cc: Ryan Blue; Denny Lee; Felix Cheung; Sean Owen; dev@spark.apache.org
Subject: Re: [VOTE] Spark 2.1.2 (RC1)

Sorry, this release candidate is 2.1.2. The issue is in 2.2.1.

2017-09-15 14:21 GMT-07:00 Xiao Li <gatorsm...@gmail.com>:
-1

See the discussion in https://github.com/apache/spark/pull/19074

Xiao



2017-09-15 12:28 GMT-07:00 Holden Karau <hol...@pigscanfly.ca>:
That's a good question. I built the release candidate; however, the Jenkins
scripts don't take a parameter for configuring who signs the artifacts and
always sign them with Patrick's key. You can see this in previous releases,
which were managed by other folks but still signed by Patrick.

On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com> wrote:
The signature is valid, but why was the release signed with Patrick Wendell's 
private key? Did Patrick build the release candidate?

rb

On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com> wrote:
+1 (non-binding)

On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com> wrote:
+1 tested SparkR package on Windows, r-hub, Ubuntu.

_
From: Sean Owen <so...@cloudera.com>
Sent: Thursday, September 14, 2017 3:12 PM
Subject: Re: [VOTE] Spark 2.1.2 (RC1)
To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>



+1
Very nice. The sigs and hashes look fine, it builds fine for me on Debian 
Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes tests.

Yes as you say, no outstanding issues except for this which doesn't look 
critical, as it's not a regression.

SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs


On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca> wrote:
Please vote on releasing the following candidate as Apache Spark version 2.1.2. 
The vote is open until Friday September 22nd at 18:00 PST and passes if a 
majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.1.2
[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is 
v2.1.2-rc1<https://github.com/apache/spark/tree/v2.1.2-rc1> 
(6f470323a0363656999dd36cb33f528afe627c12)

List of JIRA tickets resolved in this release can be found with this 
filter.<https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>

The release files, including signatures, digests, etc. can be found at:
https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc
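For voters checking the sigs and hashes, the mechanics are roughly as follows. This is a sketch, not the official verification procedure: the download and GPG steps are shown as comments (they need network access, and the exact artifact name under the rc1-bin directory is an assumption), while the checksum demo below runs offline against a stand-in file.

```shell
# Real-world steps (network required; the artifact name is an assumption):
#   curl -O https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/spark-2.1.2-bin-hadoop2.7.tgz
#   curl -O https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/spark-2.1.2-bin-hadoop2.7.tgz.asc
#   curl https://people.apache.org/keys/committer/pwendell.asc | gpg --import
#   gpg --verify spark-2.1.2-bin-hadoop2.7.tgz.asc spark-2.1.2-bin-hadoop2.7.tgz
#
# Offline demo of the checksum mechanics with a stand-in file:
printf 'stand-in artifact' > demo-artifact.tgz
sha512sum demo-artifact.tgz > demo-artifact.tgz.sha512
# On GNU coreutils this prints "demo-artifact.tgz: OK" when the hash matches:
sha512sum -c demo-artifact.tgz.sha512
rm -f demo-artifact.tgz demo-artifact.tgz.sha512
```

The same pattern applies to every file in the release directory; a mismatch from `sha512sum -c` or `gpg --verify` is grounds for a -1 vote.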

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1248/

The documentation corresponding to this release can be found at:
https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/


FAQ

How can I help test this release?

If you are a Spark user, you can help us test this release by taking an
existing Spark workload, running it on this release candidate, and reporting
any regressions.

If you're working in PySpark you can set up a virtual env, install the current
RC, and see if anything important breaks; in Java/Scala you can add the staging
repository to your project's resolvers and test with the RC (make sure to clean
up the artifact cache before/after so you don't end up building with an
out-of-date RC going forward).
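A minimal sketch of the virtual-env check described in the FAQ, assuming Python 3 with the `venv` module. The `pip install` line is an assumption (the exact location of the staged PySpark tarball isn't given here), so it is left as a comment.

```shell
# Create a throwaway virtual env so the RC doesn't touch your system Python.
python3 -m venv rc-test-env
. rc-test-env/bin/activate

# Installing the staged RC would look something like this (hypothetical path;
# substitute wherever the PySpark artifact for the RC is actually staged):
#   pip install https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/pyspark-2.1.2.tar.gz

# Sanity-check that the env is isolated before installing anything into it:
python -c "import sys; print(sys.prefix)"

deactivate
rm -rf rc-test-env
```

After installing the RC into the env, rerun your existing PySpark workload there and report any regressions on this thread.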

What should happen to JIRA tickets still targeting 2.1.2?

Committers should look at those and triage. Extremely important bug fixes, 
documentation, and API tweaks that impact compatibility should be worked on 
immediately. Everything else please retarget to 2.1.3.

But my bug isn't fixed!??!

In order to make timely releases, we will typically not hold the release unless
the bug in question is a regression from 2.1.1. That being said, if there is
something which is a regression from 2.1.1 that has not been correctly targeted,
please ping a committer to help target the issue (you can see the open issues
listed as impacting Spark 2.1.1 &
2.1.2<https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>)

What are the unresolved issues targeted for
2.1.2<https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.1.2>?

Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Xiao Li
Sorry, this release candidate is 2.1.2. The issue is in 2.2.1.

2017-09-15 14:21 GMT-07:00 Xiao Li <gatorsm...@gmail.com>:

> -1
>
> See the discussion in https://github.com/apache/spark/pull/19074
>
> Xiao
>
>
>
> 2017-09-15 12:28 GMT-07:00 Holden Karau <hol...@pigscanfly.ca>:
>
>> That's a good question. I built the release candidate; however, the
>> Jenkins scripts don't take a parameter for configuring who signs the
>> artifacts and always sign them with Patrick's key. You can see this in
>> previous releases, which were managed by other folks but still signed by
>> Patrick.
>>
>> On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com> wrote:
>>
>>> The signature is valid, but why was the release signed with Patrick
>>> Wendell's private key? Did Patrick build the release candidate?
>>>
>>> rb
>>>
>>> On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com>
>>> wrote:
>>>
>>>> +1 (non-binding)
>>>>
>>>> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <
>>>> felixcheun...@hotmail.com> wrote:
>>>>
>>>>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>>>>
>>>>> _
>>>>> From: Sean Owen <so...@cloudera.com>
>>>>> Sent: Thursday, September 14, 2017 3:12 PM
>>>>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>>>>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>>>>
>>>>>
>>>>>
>>>>> +1
>>>>> Very nice. The sigs and hashes look fine, it builds fine for me on
>>>>> Debian Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes
>>>>> tests.
>>>>>
>>>>> Yes as you say, no outstanding issues except for this which doesn't
>>>>> look critical, as it's not a regression.
>>>>>
>>>>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>>>>>
>>>>>
>>>>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca>
>>>>> wrote:
>>>>>
>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>> version 2.1.2. The vote is open until Friday September 22nd at 18:00
>>>>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>>>>
>>>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>>>> [ ] -1 Do not release this package because ...
>>>>>>
>>>>>>
>>>>>> To learn more about Apache Spark, please see
>>>>>> https://spark.apache.org/
>>>>>>
>>>>>> The tag to be voted on is v2.1.2-rc1
>>>>>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (6f470323a0363656999dd36cb33f528afe627c12)
>>>>>>
>>>>>> List of JIRA tickets resolved in this release can be found with this
>>>>>> filter.
>>>>>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>>>>>
>>>>>> The release files, including signatures, digests, etc. can be found
>>>>>> at:
>>>>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>>>>
>>>>>> Release artifacts are signed with the following key:
>>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>>
>>>>>> The staging repository for this release can be found at:
>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>>>>
>>>>>> The documentation corresponding to this release can be found at:
>>>>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>>>>
>>>>>>
>>>>>> *FAQ*
>>>>>>
>>>>>> *How can I help test this release?*
>>>>>>
>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>> reporting any regressions.
>>>>>>
>>>>>> If you're working in PySpark you can set up a virtual env, install the
>>>>>> current RC, and see if anything important breaks; in Java/Scala you can
>>>>>> add the staging repository to your project's resolvers and test with the
>>>>>> RC (make sure to clean up the artifact cache before/after so you don't
>>>>>> end up building with an out-of-date RC going forward).

Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Xiao Li
-1

See the discussion in https://github.com/apache/spark/pull/19074

Xiao



2017-09-15 12:28 GMT-07:00 Holden Karau <hol...@pigscanfly.ca>:

> That's a good question. I built the release candidate; however, the
> Jenkins scripts don't take a parameter for configuring who signs the
> artifacts and always sign them with Patrick's key. You can see this in
> previous releases, which were managed by other folks but still signed by
> Patrick.
>
> On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com> wrote:
>
>> The signature is valid, but why was the release signed with Patrick
>> Wendell's private key? Did Patrick build the release candidate?
>>
>> rb
>>
>> On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com>
>>> wrote:
>>>
>>>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>>>
>>>> _
>>>> From: Sean Owen <so...@cloudera.com>
>>>> Sent: Thursday, September 14, 2017 3:12 PM
>>>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>>>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>>>
>>>>
>>>>
>>>> +1
>>>> Very nice. The sigs and hashes look fine, it builds fine for me on
>>>> Debian Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes
>>>> tests.
>>>>
>>>> Yes as you say, no outstanding issues except for this which doesn't
>>>> look critical, as it's not a regression.
>>>>
>>>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>>>>
>>>>
>>>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca>
>>>> wrote:
>>>>
>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>> version 2.1.2. The vote is open until Friday September 22nd at 18:00
>>>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>>>
>>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>>> [ ] -1 Do not release this package because ...
>>>>>
>>>>>
>>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>>
>>>>> The tag to be voted on is v2.1.2-rc1
>>>>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (6f470323a0363656999dd36cb33f528afe627c12)
>>>>>
>>>>> List of JIRA tickets resolved in this release can be found with this
>>>>> filter.
>>>>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>>>>
>>>>> The release files, including signatures, digests, etc. can be found at:
>>>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>>>
>>>>> Release artifacts are signed with the following key:
>>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>>
>>>>> The staging repository for this release can be found at:
>>>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>>>
>>>>> The documentation corresponding to this release can be found at:
>>>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>>>
>>>>>
>>>>> *FAQ*
>>>>>
>>>>> *How can I help test this release?*
>>>>>
>>>>> If you are a Spark user, you can help us test this release by taking
>>>>> an existing Spark workload, running it on this release candidate, and
>>>>> reporting any regressions.
>>>>>
>>>>> If you're working in PySpark you can set up a virtual env, install the
>>>>> current RC, and see if anything important breaks; in Java/Scala you can
>>>>> add the staging repository to your project's resolvers and test with the
>>>>> RC (make sure to clean up the artifact cache before/after so you don't
>>>>> end up building with an out-of-date RC going forward).
>>>>>
>>>>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>>>>
>>>>>

Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Holden Karau
That's a good question. I built the release candidate; however, the Jenkins
scripts don't take a parameter for configuring who signs the artifacts and
always sign them with Patrick's key. You can see this in previous releases,
which were managed by other folks but still signed by Patrick.

On Fri, Sep 15, 2017 at 12:16 PM, Ryan Blue <rb...@netflix.com> wrote:

> The signature is valid, but why was the release signed with Patrick
> Wendell's private key? Did Patrick build the release candidate?
>
> rb
>
> On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com> wrote:
>
>> +1 (non-binding)
>>
>> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com>
>> wrote:
>>
>>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>>
>>> _
>>> From: Sean Owen <so...@cloudera.com>
>>> Sent: Thursday, September 14, 2017 3:12 PM
>>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>>
>>>
>>>
>>> +1
>>> Very nice. The sigs and hashes look fine, it builds fine for me on
>>> Debian Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes
>>> tests.
>>>
>>> Yes as you say, no outstanding issues except for this which doesn't look
>>> critical, as it's not a regression.
>>>
>>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>>>
>>>
>>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca>
>>> wrote:
>>>
>>>> Please vote on releasing the following candidate as Apache Spark
>>>> version 2.1.2. The vote is open until Friday September 22nd at 18:00
>>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>>
>>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>>> [ ] -1 Do not release this package because ...
>>>>
>>>>
>>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>>
>>>> The tag to be voted on is v2.1.2-rc1
>>>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (6f470323a0363656999dd36cb33f528afe627c12)
>>>>
>>>> List of JIRA tickets resolved in this release can be found with this
>>>> filter.
>>>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>>>
>>>> The release files, including signatures, digests, etc. can be found at:
>>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>>
>>>> Release artifacts are signed with the following key:
>>>> https://people.apache.org/keys/committer/pwendell.asc
>>>>
>>>> The staging repository for this release can be found at:
>>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>>
>>>> The documentation corresponding to this release can be found at:
>>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>>
>>>>
>>>> *FAQ*
>>>>
>>>> *How can I help test this release?*
>>>>
>>>> If you are a Spark user, you can help us test this release by taking an
>>>> existing Spark workload, running it on this release candidate, and
>>>> reporting any regressions.
>>>>
>>>> If you're working in PySpark you can set up a virtual env, install the
>>>> current RC, and see if anything important breaks; in Java/Scala you can
>>>> add the staging repository to your project's resolvers and test with the
>>>> RC (make sure to clean up the artifact cache before/after so you don't
>>>> end up building with an out-of-date RC going forward).
>>>>
>>>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>>>
>>>> Committers should look at those and triage. Extremely important bug
>>>> fixes, documentation, and API tweaks that impact compatibility should be
>>>> worked on immediately. Everything else please retarget to 2.1.3.
>>>>
>>>> *But my bug isn't fixed!??!*
>>>>
>>>> In order to make timely releases, we will typically not hold the
>>>> release unless the bug in question is a regression from 2.1.1. That being
>>>> said, if there is something which is a regression from 2.1.1 that has not
>>>> been correctly targeted, please ping a committer to help target the issue.

Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Ryan Blue
The signature is valid, but why was the release signed with Patrick
Wendell's private key? Did Patrick build the release candidate?

rb

On Fri, Sep 15, 2017 at 6:36 AM, Denny Lee <denny.g@gmail.com> wrote:

> +1 (non-binding)
>
> On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com>
> wrote:
>
>> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>>
>> _
>> From: Sean Owen <so...@cloudera.com>
>> Sent: Thursday, September 14, 2017 3:12 PM
>> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
>> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>>
>>
>>
>> +1
>> Very nice. The sigs and hashes look fine, it builds fine for me on Debian
>> Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes tests.
>>
>> Yes as you say, no outstanding issues except for this which doesn't look
>> critical, as it's not a regression.
>>
>> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>>
>>
>> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca>
>> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 2.1.2. The vote is open until Friday September 22nd at 18:00
>>> PST and passes if a majority of at least 3 +1 PMC votes are cast.
>>>
>>> [ ] +1 Release this package as Apache Spark 2.1.2
>>> [ ] -1 Do not release this package because ...
>>>
>>>
>>> To learn more about Apache Spark, please see https://spark.apache.org/
>>>
>>> The tag to be voted on is v2.1.2-rc1
>>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (6f470323a0363656999dd36cb33f528afe627c12)
>>>
>>> List of JIRA tickets resolved in this release can be found with this
>>> filter.
>>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>>
>>> Release artifacts are signed with the following key:
>>> https://people.apache.org/keys/committer/pwendell.asc
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>>
>>>
>>> *FAQ*
>>>
>>> *How can I help test this release?*
>>>
>>> If you are a Spark user, you can help us test this release by taking an
>>> existing Spark workload, running it on this release candidate, and
>>> reporting any regressions.
>>>
>>> If you're working in PySpark you can set up a virtual env, install the
>>> current RC, and see if anything important breaks; in Java/Scala you can
>>> add the staging repository to your project's resolvers and test with the
>>> RC (make sure to clean up the artifact cache before/after so you don't
>>> end up building with an out-of-date RC going forward).
>>>
>>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should be
>>> worked on immediately. Everything else please retarget to 2.1.3.
>>>
>>> *But my bug isn't fixed!??!*
>>>
>>> In order to make timely releases, we will typically not hold the release
>>> unless the bug in question is a regression from 2.1.1. That being said, if
>>> there is something which is a regression from 2.1.1 that has not been
>>> correctly targeted, please ping a committer to help target the issue (you
>>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>
>>> )
>>>
>>> *What are the unresolved* issues targeted for 2.1.2
>>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.1.2>
>>> ?
>>>
>>> At the time of writing, there is one in-progress major issue
>>> SPARK-21985 <https://issues.apache.org/jira/browse/SPARK-21985>, I
>>> believe Andrew Ray & HyukjinKwon are looking into this one.
>>>
>>> --
>>> Twitter: https://twitter.com/holdenkarau
>>>
>>
>>
>>


-- 
Ryan Blue
Software Engineer
Netflix


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-15 Thread Denny Lee
+1 (non-binding)

On Thu, Sep 14, 2017 at 10:57 PM Felix Cheung <felixcheun...@hotmail.com>
wrote:

> +1 tested SparkR package on Windows, r-hub, Ubuntu.
>
> _
> From: Sean Owen <so...@cloudera.com>
> Sent: Thursday, September 14, 2017 3:12 PM
> Subject: Re: [VOTE] Spark 2.1.2 (RC1)
> To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>
>
>
>
> +1
> Very nice. The sigs and hashes look fine, it builds fine for me on Debian
> Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes tests.
>
> Yes as you say, no outstanding issues except for this which doesn't look
> critical, as it's not a regression.
>
> SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs
>
>
> On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 2.1.2. The vote is open until Friday September 22nd at 18:00 PST and
>> passes if a majority of at least 3 +1 PMC votes are cast.
>>
>> [ ] +1 Release this package as Apache Spark 2.1.2
>> [ ] -1 Do not release this package because ...
>>
>>
>> To learn more about Apache Spark, please see https://spark.apache.org/
>>
>> The tag to be voted on is v2.1.2-rc1
>> <https://github.com/apache/spark/tree/v2.1.2-rc1> (
>> 6f470323a0363656999dd36cb33f528afe627c12)
>>
>> List of JIRA tickets resolved in this release can be found with this
>> filter.
>> <https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/
>>
>> Release artifacts are signed with the following key:
>> https://people.apache.org/keys/committer/pwendell.asc
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1248/
>>
>> The documentation corresponding to this release can be found at:
>> https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/
>>
>>
>> *FAQ*
>>
>> *How can I help test this release?*
>>
>> If you are a Spark user, you can help us test this release by taking an
>> existing Spark workload, running it on this release candidate, and
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env, install the
>> current RC, and see if anything important breaks; in Java/Scala you can
>> add the staging repository to your project's resolvers and test with the
>> RC (make sure to clean up the artifact cache before/after so you don't
>> end up building with an out-of-date RC going forward).
>>
>> *What should happen to JIRA tickets still targeting 2.1.2?*
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should be
>> worked on immediately. Everything else please retarget to 2.1.3.
>>
>> *But my bug isn't fixed!??!*
>>
>> In order to make timely releases, we will typically not hold the release
>> unless the bug in question is a regression from 2.1.1. That being said, if
>> there is something which is a regression from 2.1.1 that has not been
>> correctly targeted, please ping a committer to help target the issue (you
>> can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>
>> )
>>
>> *What are the unresolved* issues targeted for 2.1.2
>> <https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.1.2>
>> ?
>>
>> At the time of writing, there is one in-progress major issue
>> SPARK-21985 <https://issues.apache.org/jira/browse/SPARK-21985>, I
>> believe Andrew Ray & HyukjinKwon are looking into this one.
>>
>> --
>> Twitter: https://twitter.com/holdenkarau
>>
>
>
>


Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-14 Thread Felix Cheung
+1 tested SparkR package on Windows, r-hub, Ubuntu.

_
From: Sean Owen <so...@cloudera.com>
Sent: Thursday, September 14, 2017 3:12 PM
Subject: Re: [VOTE] Spark 2.1.2 (RC1)
To: Holden Karau <hol...@pigscanfly.ca>, <dev@spark.apache.org>


+1
Very nice. The sigs and hashes look fine, it builds fine for me on Debian 
Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes tests.

Yes as you say, no outstanding issues except for this which doesn't look 
critical, as it's not a regression.

SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs


On Thu, Sep 14, 2017 at 7:47 PM Holden Karau <hol...@pigscanfly.ca> wrote:
Please vote on releasing the following candidate as Apache Spark version 2.1.2. 
The vote is open until Friday September 22nd at 18:00 PST and passes if a 
majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.1.2
[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is 
v2.1.2-rc1<https://github.com/apache/spark/tree/v2.1.2-rc1> 
(6f470323a0363656999dd36cb33f528afe627c12)

List of JIRA tickets resolved in this release can be found with this 
filter.<https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>

The release files, including signatures, digests, etc. can be found at:
https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1248/

The documentation corresponding to this release can be found at:
https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/


FAQ

How can I help test this release?

If you are a Spark user, you can help us test this release by taking an 
existing Spark workload and running on this release candidate, then reporting 
any regressions.

If you're working in PySpark, you can set up a virtual env, install the 
current RC, and see if anything important breaks. In Java/Scala, you can add 
the staging repository to your project's resolvers and test with the RC (make 
sure to clean up the artifact cache before/after so you don't end up building 
with an out-of-date RC going forward).
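
The PySpark check above can be sketched as follows. Only the virtual-env setup runs offline; the download and run steps (the URL is from this email, but the exact tarball name is an assumption) are left commented:

```shell
# Fresh virtual env for testing the RC without touching a system install.
python3 -m venv rc-venv
. rc-venv/bin/activate
# Fetch and unpack the RC binary distribution, then run an existing workload:
# wget https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/spark-2.1.2-bin-hadoop2.7.tgz
# tar xzf spark-2.1.2-bin-hadoop2.7.tgz
# spark-2.1.2-bin-hadoop2.7/bin/pyspark   # smoke-test your workload here
```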

What should happen to JIRA tickets still targeting 2.1.2?

Committers should look at those and triage. Extremely important bug fixes, 
documentation, and API tweaks that impact compatibility should be worked on 
immediately. Everything else please retarget to 2.1.3.

But my bug isn't fixed!??!

In order to make timely releases, we will typically not hold the release unless 
the bug in question is a regression from 2.1.1. That being said, if there is 
something that is a regression from 2.1.1 that has not been correctly targeted, 
please ping a committer to help target the issue (you can see the open issues 
listed as impacting Spark 2.1.1 & 
2.1.2<https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>)

What are the unresolved issues targeted for 
2.1.2<https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.1.2>?

At the time of writing, there is one in-progress major issue, 
SPARK-21985<https://issues.apache.org/jira/browse/SPARK-21985>; I believe 
Andrew Ray & Hyukjin Kwon are looking into it.

--
Twitter: https://twitter.com/holdenkarau




Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-14 Thread Sean Owen
+1
Very nice. The sigs and hashes look fine, it builds fine for me on Debian
Stretch with Java 8, yarn/hive/hadoop-2.7 profiles, and passes tests.

Yes, as you say, there are no outstanding issues except for this one, which
doesn't look critical as it's not a regression.

SPARK-21985 PySpark PairDeserializer is broken for double-zipped RDDs




Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-14 Thread Dongjoon Hyun
Yea. I think I found the root cause.

The correct one is the following, as Sean said:


https://issues.apache.org/jira/issues/?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2

The current RC vote email has the following.

  List of JIRA tickets resolved in this release can be found with this
filter.
  

You can see the link in the Apache archive:

https://lists.apache.org/thread.html/52952da40593624737533cdcfad695977612a298ce26205af3f2306c@%3Cdev.spark.apache.org%3E

SPARK-20134 is a 2.1.1 issue, so I was confused.

Thanks,
Dongjoon.




Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-14 Thread Sean Owen
I think the search filter is OK, but for whatever reason the filter link
includes whatever JIRA you're currently browsing, and that one is not actually
included in the filter. It opens on a JIRA that's not in the results, but the
search results themselves look correct:   project = SPARK AND fixVersion = 2.1.2
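
The same query can also be issued programmatically. A hedged sketch against JIRA's public REST search endpoint (the v2 `search` resource); the call itself needs network access, so it is commented out:

```shell
# The JQL from the message above, and how it maps onto JIRA's REST API.
JQL='project = SPARK AND fixVersion = 2.1.2'
# curl -sG 'https://issues.apache.org/jira/rest/api/2/search' \
#      --data-urlencode "jql=$JQL" \
#      --data-urlencode 'fields=key'
printf '%s\n' "$JQL"
```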



Re: [VOTE] Spark 2.1.2 (RC1)

2017-09-14 Thread Dongjoon Hyun
Hi, Holden.

It's not a problem, but the link of `List of JIRA ... with this filter`
seems to be wrong.

Bests,
Dongjoon.




[VOTE] Spark 2.1.2 (RC1)

2017-09-14 Thread Holden Karau
Please vote on releasing the following candidate as Apache Spark version
2.1.2. The vote is open until Friday September 22nd at 18:00 PST and passes
if a majority of at least 3 +1 PMC votes are cast.

[ ] +1 Release this package as Apache Spark 2.1.2
[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.1.2-rc1
<https://github.com/apache/spark/tree/v2.1.2-rc1>
(6f470323a0363656999dd36cb33f528afe627c12)

List of JIRA tickets resolved in this release can be found with this filter.
<https://issues.apache.org/jira/browse/SPARK-20134?jql=project%20%3D%20SPARK%20AND%20fixVersion%20%3D%202.1.2>

The release files, including signatures, digests, etc. can be found at:
https://home.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-bin/

Release artifacts are signed with the following key:
https://people.apache.org/keys/committer/pwendell.asc

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1248/

The documentation corresponding to this release can be found at:
https://people.apache.org/~pwendell/spark-releases/spark-2.1.2-rc1-docs/


*FAQ*

*How can I help test this release?*

If you are a Spark user, you can help us test this release by taking an
existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the
current RC, and see if anything important breaks. In Java/Scala, you can
add the staging repository to your project's resolvers and test with the RC
(make sure to clean up the artifact cache before/after so you don't end up
building with an out-of-date RC going forward).

*What should happen to JIRA tickets still targeting 2.1.2?*

Committers should look at those and triage. Extremely important bug fixes,
documentation, and API tweaks that impact compatibility should be worked on
immediately. Everything else please retarget to 2.1.3.

*But my bug isn't fixed!??!*

In order to make timely releases, we will typically not hold the release
unless the bug in question is a regression from 2.1.1. That being said, if
there is something that is a regression from 2.1.1 that has not been
correctly targeted, please ping a committer to help target the issue (you
can see the open issues listed as impacting Spark 2.1.1 & 2.1.2
<https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20%3D%20OPEN%20AND%20(affectedVersion%20%3D%202.1.2%20OR%20affectedVersion%20%3D%202.1.1)>
)

*What are the unresolved issues targeted for 2.1.2?*
<https://issues.apache.org/jira/browse/SPARK-21985?jql=project%20%3D%20SPARK%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20%22Target%20Version%2Fs%22%20%3D%202.1.2>

At the time of writing, there is one in-progress major issue, SPARK-21985
<https://issues.apache.org/jira/browse/SPARK-21985>; I believe Andrew Ray &
Hyukjin Kwon are looking into it.

-- 
Twitter: https://twitter.com/holdenkarau