Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-29 Thread Jacek Laskowski
Hi Marcelo,

The reason I asked about the mesos profile was that I thought it was
already part of the branch, and I wondered why nobody used it to compile
Spark with all the code available.

I do understand that no code changes were introduced when the profile was
added, but with the profile in place that code does not get compiled unless
you enable it explicitly. I've learnt it's not part of the release, though.
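
For concreteness, here is roughly the difference I mean (a sketch, assuming
a checkout of apache/spark, with the module layout as on master):

    # On master the Mesos code sits in its own module, gated by the profile:
    ./build/mvn -Pmesos -DskipTests clean package
    # On branch-2.0 the same code is part of core, so a plain build compiles it:
    ./build/mvn -DskipTests clean package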

Thanks for all the clarifications! I really appreciate your patience in
dealing with my questions.

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Mon, Sep 26, 2016 at 7:08 PM, Marcelo Vanzin  wrote:
> The part I don't understand is: why do you care so much about the mesos 
> profile?
>
> The same code exists in branch-2.0; it just doesn't need a separate
> profile to be enabled (it's part of core). As Sean said, the change in
> master was purely organizational, there's no added or lost
> functionality.
>
> On Sun, Sep 25, 2016 at 8:31 AM, Jacek Laskowski  wrote:
>> Hi Sean,
>>
>> I remember a similar discussion about the releases in Spark and I must
>> admit it again -- I simply don't get it. I seem to not have paid
>> enough attention to details to appreciate it. I apologize for asking
>> the very same questions again and again. Sorry.
>>
>> Re the next release, I was referring to JIRA, where 2.0.2 came up quite
>> recently for issues not included in 2.0.1. This disconnect between
>> releases and JIRA versions causes even more frustration whenever I'm
>> asked what the next release is going to be and when. It's not as
>> simple as I think it should be (for me).
>>
>> (I really hope I'm the only one confused by this.)
>>
>> Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
>> unless someone adds it to branch-2.0. Correct?
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
>>> Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
>>> becomes the de facto 2.2.x branch. It's not true that the next release
>>> is 2.0.2. You can see the master version:
>>> https://github.com/apache/spark/blob/master/pom.xml#L29
>>>
>>> On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski  wrote:
 Hi Sean,

 So, another question would be: when is the change going to be released,
 then? What's the version on master? The next release is 2.0.2, so
 it's not for the mesos profile either :(

 Pozdrawiam,
 Jacek Laskowski
 
 https://medium.com/@jaceklaskowski/
 Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
 Follow me at https://twitter.com/jaceklaskowski


 On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
> It's a change to the structure of the project, and probably not
> appropriate for a maintenance release. 2.0.1 core would then no longer
> contain Mesos code while 2.0.0 did.
>
> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
>> Hi Sean,
>>
>> Sure, but then the question is why it's not a part of 2.0.1? I thought
>> it was considered ready for prime time and so should be shipped in
>> 2.0.1.
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
>>> It was added to the master branch, and this is a release from the 2.0.x 
>>> branch.
>>>
>>> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  
>>> wrote:
 Hi,

 That's even more interesting. How so, given that the profile was added
 about a week ago and RC2 was cut only two or three days ago? Anyone know?

>
>
>
> --
> Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-26 Thread Marcelo Vanzin
The part I don't understand is: why do you care so much about the mesos profile?

The same code exists in branch-2.0; it just doesn't need a separate
profile to be enabled (it's part of core). As Sean said, the change in
master was purely organizational, there's no added or lost
functionality.

On Sun, Sep 25, 2016 at 8:31 AM, Jacek Laskowski  wrote:
> Hi Sean,
>
> I remember a similar discussion about the releases in Spark and I must
> admit it again -- I simply don't get it. I seem to not have paid
> enough attention to details to appreciate it. I apologize for asking
> the very same questions again and again. Sorry.
>
> Re the next release, I was referring to JIRA, where 2.0.2 came up quite
> recently for issues not included in 2.0.1. This disconnect between
> releases and JIRA versions causes even more frustration whenever I'm
> asked what the next release is going to be and when. It's not as
> simple as I think it should be (for me).
>
> (I really hope I'm the only one confused by this.)
>
> Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
> unless someone adds it to branch-2.0. Correct?
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
>> Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
>> becomes the de facto 2.2.x branch. It's not true that the next release
>> is 2.0.2. You can see the master version:
>> https://github.com/apache/spark/blob/master/pom.xml#L29
>>
>> On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski  wrote:
>>> Hi Sean,
>>>
>>> So, another question would be: when is the change going to be released,
>>> then? What's the version on master? The next release is 2.0.2, so
>>> it's not for the mesos profile either :(
>>>
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> 
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
 It's a change to the structure of the project, and probably not
 appropriate for a maintenance release. 2.0.1 core would then no longer
 contain Mesos code while 2.0.0 did.

 On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
> Hi Sean,
>
> Sure, but then the question is why it's not a part of 2.0.1? I thought
> it was considered ready for prime time and so should be shipped in
> 2.0.1.
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
>> It was added to the master branch, and this is a release from the 2.0.x 
>> branch.
>>
>> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  
>> wrote:
>>> Hi,
>>>
>>> That's even more interesting. How so, given that the profile was added
>>> about a week ago and RC2 was cut only two or three days ago? Anyone know?
>>>



-- 
Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Mark Hamstra
Spark's branch-2.0 is a maintenance branch, effectively meaning that only
bug-fixes will be added to it.  There are other maintenance branches (such
as branch-1.6) that also receive bug-fixes in theory, though less so in
practice as maintenance branches age.  The major and minor version
numbers of maintenance branches stay fixed, with only the patch-level
version number changing as new releases are made from a maintenance
branch.  Thus, the next release from branch-2.0 will be 2.0.1; the set of
bug-fixes contributing to the branch-2.0 release after that will result in
2.0.2; etc.

New work, both bug-fixes and non-bug-fixes, is contributed to the master
branch.  New releases from the master branch increment the minor version
number (unless they include API-breaking changes, in which case the major
version number changes -- e.g. Spark 1.x.y to Spark 2.0.0).  Thus the first
release from the current master branch will be 2.1.0, the next will be
2.2.0, etc.
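
One way to see this numbering directly is to read each branch's pom (a
sketch, assuming a fresh clone of apache/spark; the SNAPSHOT versions
shown are illustrative and depend on when you look):

    git show origin/branch-2.0:pom.xml | grep -m1 '<version>'  # e.g. <version>2.0.1-SNAPSHOT</version>
    git show origin/master:pom.xml | grep -m1 '<version>'      # e.g. <version>2.1.0-SNAPSHOT</version>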

There should be active "next JIRA numbers" for whatever will be the next
release from the master as well as each of the maintenance branches.

This is all just basic SemVer (http://semver.org/), so it surprises me
somewhat that you are finding the concepts new, difficult, or frustrating.

On Sun, Sep 25, 2016 at 8:31 AM, Jacek Laskowski  wrote:

> Hi Sean,
>
> I remember a similar discussion about the releases in Spark and I must
> admit it again -- I simply don't get it. I seem to not have paid
> enough attention to details to appreciate it. I apologize for asking
> the very same questions again and again. Sorry.
>
> Re the next release, I was referring to JIRA, where 2.0.2 came up quite
> recently for issues not included in 2.0.1. This disconnect between
> releases and JIRA versions causes even more frustration whenever I'm
> asked what the next release is going to be and when. It's not as
> simple as I think it should be (for me).
>
> (I really hope I'm the only one confused by this.)
>
> Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
> unless someone adds it to branch-2.0. Correct?
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
> > Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
> > becomes the de facto 2.2.x branch. It's not true that the next release
> > is 2.0.2. You can see the master version:
> > https://github.com/apache/spark/blob/master/pom.xml#L29
> >
> > On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski 
> wrote:
> >> Hi Sean,
> >>
> >> So, another question would be: when is the change going to be released,
> >> then? What's the version on master? The next release is 2.0.2, so
> >> it's not for the mesos profile either :(
> >>
> >> Pozdrawiam,
> >> Jacek Laskowski
> >> 
> >> https://medium.com/@jaceklaskowski/
> >> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> >> Follow me at https://twitter.com/jaceklaskowski
> >>
> >>
> >> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
> >>> It's a change to the structure of the project, and probably not
> >>> appropriate for a maintenance release. 2.0.1 core would then no longer
> >>> contain Mesos code while 2.0.0 did.
> >>>
> >>> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski 
> wrote:
>  Hi Sean,
> 
>  Sure, but then the question is why it's not a part of 2.0.1? I thought
>  it was considered ready for prime time and so should be shipped in
>  2.0.1.
> 
>  Pozdrawiam,
>  Jacek Laskowski
>  
>  https://medium.com/@jaceklaskowski/
>  Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>  Follow me at https://twitter.com/jaceklaskowski
> 
> 
>  On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen 
> wrote:
> > It was added to the master branch, and this is a release from the
> 2.0.x branch.
> >
> > On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski 
> wrote:
> >> Hi,
> >>
> >> That's even more interesting. How so, given that the profile was added
> >> about a week ago and RC2 was cut only two or three days ago? Anyone know?
> >>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi Sean,

I remember a similar discussion about the releases in Spark and I must
admit it again -- I simply don't get it. I seem to not have paid
enough attention to details to appreciate it. I apologize for asking
the very same questions again and again. Sorry.

Re the next release, I was referring to JIRA, where 2.0.2 came up quite
recently for issues not included in 2.0.1. This disconnect between
releases and JIRA versions causes even more frustration whenever I'm
asked what the next release is going to be and when. It's not as
simple as I think it should be (for me).

(I really hope I'm the only one confused by this.)

Unless I'm mistaken, -Pmesos won't get included in 2.0.x releases
unless someone adds it to branch-2.0. Correct?

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 1:35 PM, Sean Owen  wrote:
> Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
> becomes the de facto 2.2.x branch. It's not true that the next release
> is 2.0.2. You can see the master version:
> https://github.com/apache/spark/blob/master/pom.xml#L29
>
> On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski  wrote:
>> Hi Sean,
>>
>> So, another question would be: when is the change going to be released,
>> then? What's the version on master? The next release is 2.0.2, so
>> it's not for the mesos profile either :(
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
>>> It's a change to the structure of the project, and probably not
>>> appropriate for a maintenance release. 2.0.1 core would then no longer
>>> contain Mesos code while 2.0.0 did.
>>>
>>> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
 Hi Sean,

 Sure, but then the question is why it's not a part of 2.0.1? I thought
 it was considered ready for prime time and so should be shipped in
 2.0.1.

 Pozdrawiam,
 Jacek Laskowski
 
 https://medium.com/@jaceklaskowski/
 Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
 Follow me at https://twitter.com/jaceklaskowski


 On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
> It was added to the master branch, and this is a release from the 2.0.x 
> branch.
>
> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
>> Hi,
>>
>> That's even more interesting. How so, given that the profile was added
>> about a week ago and RC2 was cut only two or three days ago? Anyone know?
>>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Sean Owen
Master is implicitly 2.1.x right now. When branch-2.1 is cut, master
becomes the de facto 2.2.x branch. It's not true that the next release
is 2.0.2. You can see the master version:
https://github.com/apache/spark/blob/master/pom.xml#L29
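
The same check can be run locally from the root of a master checkout (a
sketch; the grep merely filters Maven's log prefix, and the output shown
is illustrative):

    ./build/mvn help:evaluate -Dexpression=project.version | grep -v '^\['
    # 2.1.0-SNAPSHOT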

On Sun, Sep 25, 2016 at 12:30 PM, Jacek Laskowski  wrote:
> Hi Sean,
>
> So, another question would be: when is the change going to be released,
> then? What's the version on master? The next release is 2.0.2, so
> it's not for the mesos profile either :(
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
>> It's a change to the structure of the project, and probably not
>> appropriate for a maintenance release. 2.0.1 core would then no longer
>> contain Mesos code while 2.0.0 did.
>>
>> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
>>> Hi Sean,
>>>
>>> Sure, but then the question is why it's not a part of 2.0.1? I thought
>>> it was considered ready for prime time and so should be shipped in
>>> 2.0.1.
>>>
>>> Pozdrawiam,
>>> Jacek Laskowski
>>> 
>>> https://medium.com/@jaceklaskowski/
>>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>>> Follow me at https://twitter.com/jaceklaskowski
>>>
>>>
>>> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
 It was added to the master branch, and this is a release from the 2.0.x 
 branch.

 On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
> Hi,
>
> That's even more interesting. How so, given that the profile was added
> about a week ago and RC2 was cut only two or three days ago? Anyone know?
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi Sean,

So, another question would be: when is the change going to be released,
then? What's the version on master? The next release is 2.0.2, so
it's not for the mesos profile either :(

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 1:27 PM, Sean Owen  wrote:
> It's a change to the structure of the project, and probably not
> appropriate for a maintenance release. 2.0.1 core would then no longer
> contain Mesos code while 2.0.0 did.
>
> On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
>> Hi Sean,
>>
>> Sure, but then the question is why it's not a part of 2.0.1? I thought
>> it was considered ready for prime time and so should be shipped in
>> 2.0.1.
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
>>> It was added to the master branch, and this is a release from the 2.0.x 
>>> branch.
>>>
>>> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
 Hi,

 That's even more interesting. How so, given that the profile was added
 about a week ago and RC2 was cut only two or three days ago? Anyone know?


-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Sean Owen
It's a change to the structure of the project, and probably not
appropriate for a maintenance release. 2.0.1 core would then no longer
contain Mesos code while 2.0.0 did.

On Sun, Sep 25, 2016 at 12:26 PM, Jacek Laskowski  wrote:
> Hi Sean,
>
> Sure, but then the question is why it's not a part of 2.0.1? I thought
> it was considered ready for prime time and so should be shipped in
> 2.0.1.
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
>> It was added to the master branch, and this is a release from the 2.0.x 
>> branch.
>>
>> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
>>> Hi,
>>>
>>> That's even more interesting. How so, given that the profile was added
>>> about a week ago and RC2 was cut only two or three days ago? Anyone know?
>>>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi Sean,

Sure, but then the question is why it's not a part of 2.0.1? I thought
it was considered ready for prime time and so should be shipped in
2.0.1.

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 1:21 PM, Sean Owen  wrote:
> It was added to the master branch, and this is a release from the 2.0.x 
> branch.
>
> On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
>> Hi,
>>
>> That's even more interesting. How so, given that the profile was added
>> about a week ago and RC2 was cut only two or three days ago? Anyone know?
>>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Sean Owen
It was added to the master branch, and this is a release from the 2.0.x branch.
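
A quick way to confirm that is to list the top-level modules on each ref
(a sketch, assuming a clone of apache/spark with the v2.0.1-rc2 tag
fetched):

    git ls-tree --name-only origin/master | grep -x mesos   # prints 'mesos': the module exists on master
    git ls-tree --name-only v2.0.1-rc2 | grep -x mesos      # prints nothing: no such module on the 2.0.x line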

On Sun, Sep 25, 2016 at 12:12 PM, Jacek Laskowski  wrote:
> Hi,
>
> That's even more interesting. How so, given that the profile was added
> about a week ago and RC2 was cut only two or three days ago? Anyone know?
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-25 Thread Jacek Laskowski
Hi,

That's even more interesting. How so, given that the profile was added
about a week ago and RC2 was cut only two or three days ago? Anyone know?

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Sep 25, 2016 at 5:09 AM, Marcelo Vanzin  wrote:
> There is no "mesos" profile in 2.0.1.
>
> On Sat, Sep 24, 2016 at 2:19 PM, Jacek Laskowski  wrote:
>> Hi,
>>
>> I keep asking myself why you guys are not including -Pmesos in your
>> builds. Is this on purpose, or has it been overlooked?
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> 
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sat, Sep 24, 2016 at 9:25 PM, Dongjoon Hyun  wrote:
>>> +1 (non binding)
>>>
>>> I compiled and tested on the following two systems.
>>>
>>> - CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
>>> -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
>>> - CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
>>> -Phive -Phive-thriftserver
>>>
>>> Bests,
>>> Dongjoon.
>>>
>>>
>>> On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski  wrote:

 Hi,

 Not that it could fix the issue but no -Pmesos?

 Jacek


 On 24 Sep 2016 12:08 a.m., "Sean Owen"  wrote:
>
> +1 Signatures and hashes check out. I checked that the Kinesis
> assembly artifacts are not present.
>
> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
> problem. This test never completed. If nobody else sees it, +1,
> assuming it's a bad test or env issue.
>
> - should clone and clean line object in ClosureCleaner *** FAILED ***
>   isContain was true Interpreter output contained 'Exception':
>   Welcome to
>   __
>/ __/__  ___ _/ /__
>   _\ \/ _ \/ _ `/ __/  '_/
>  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
> /_/
>
>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>   Type in expressions to have them evaluated.
>   Type :help for more information.
>
>   scala> // Entering paste mode (ctrl-D to finish)
>
>
>   // Exiting paste mode, now interpreting.
>
>   org.apache.spark.SparkException: Job 0 cancelled because
> SparkContext was shut down
> at
> org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
> ...
>
>
> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
> > Please vote on releasing the following candidate as Apache Spark
> > version
> > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
> > passes
> > if a majority of at least 3+1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 2.0.1
> > [ ] -1 Do not release this package because ...
> >
> >
> > The tag to be voted on is v2.0.1-rc2
> > (04141ad49806a48afccc236b699827997142bd57)
> >
> > This release candidate resolves 284 issues:
> > https://s.apache.org/spark-2.0.1-jira
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1199
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
> >
> >
> > Q: How can I help test this release?
> > A: If you are a Spark user, you can help us test this release by taking
> > an
> > existing Spark workload and running on this release candidate, then
> > reporting any regressions from 2.0.0.
> >
> > Q: What justifies a -1 vote for this release?
> > A: This is a maintenance release in the 2.0.x series.  Bugs already
> > present
> > in 2.0.0, missing features, or bugs related to new features will not
> > necessarily block this release.
> >
> > Q: What happened to 2.0.1 RC1?
> > A: There was an issue with RC1 R documentation during release candidate
> > preparation. As a result, rc1 was canceled before a vote was called.
> >
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-24 Thread Marcelo Vanzin
There is no "mesos" profile in 2.0.1.

On Sat, Sep 24, 2016 at 2:19 PM, Jacek Laskowski  wrote:
> Hi,
>
> I keep asking myself why you guys are not including -Pmesos in your
> builds. Is this on purpose, or has it been overlooked?
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Sep 24, 2016 at 9:25 PM, Dongjoon Hyun  wrote:
>> +1 (non binding)
>>
>> I compiled and tested on the following two systems.
>>
>> - CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
>> -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
>> - CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
>> -Phive -Phive-thriftserver
>>
>> Bests,
>> Dongjoon.
>>
>>
>> On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski  wrote:
>>>
>>> Hi,
>>>
>>> Not that it could fix the issue but no -Pmesos?
>>>
>>> Jacek
>>>
>>>
>>> On 24 Sep 2016 12:08 a.m., "Sean Owen"  wrote:

 +1 Signatures and hashes check out. I checked that the Kinesis
 assembly artifacts are not present.

 I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
 -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
 problem. This test never completed. If nobody else sees it, +1,
 assuming it's a bad test or env issue.

 - should clone and clean line object in ClosureCleaner *** FAILED ***
   isContain was true Interpreter output contained 'Exception':
   Welcome to
   __
/ __/__  ___ _/ /__
   _\ \/ _ \/ _ `/ __/  '_/
  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
 /_/

   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
   Type in expressions to have them evaluated.
   Type :help for more information.

   scala> // Entering paste mode (ctrl-D to finish)


   // Exiting paste mode, now interpreting.

   org.apache.spark.SparkException: Job 0 cancelled because
 SparkContext was shut down
 at
 org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
 ...


 On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
 > Please vote on releasing the following candidate as Apache Spark
 > version
 > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
 > passes
 > if a majority of at least 3+1 PMC votes are cast.
 >
 > [ ] +1 Release this package as Apache Spark 2.0.1
 > [ ] -1 Do not release this package because ...
 >
 >
 > The tag to be voted on is v2.0.1-rc2
 > (04141ad49806a48afccc236b699827997142bd57)
 >
 > This release candidate resolves 284 issues:
 > https://s.apache.org/spark-2.0.1-jira
 >
 > The release files, including signatures, digests, etc. can be found at:
 > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
 >
 > Release artifacts are signed with the following key:
 > https://people.apache.org/keys/committer/pwendell.asc
 >
 > The staging repository for this release can be found at:
 > https://repository.apache.org/content/repositories/orgapachespark-1199
 >
 > The documentation corresponding to this release can be found at:
 > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
 >
 >
 > Q: How can I help test this release?
 > A: If you are a Spark user, you can help us test this release by taking
 > an
 > existing Spark workload and running on this release candidate, then
 > reporting any regressions from 2.0.0.
 >
 > Q: What justifies a -1 vote for this release?
 > A: This is a maintenance release in the 2.0.x series.  Bugs already
 > present
 > in 2.0.0, missing features, or bugs related to new features will not
 > necessarily block this release.
 >
 > Q: What happened to 2.0.1 RC1?
 > A: There was an issue with RC1 R documentation during release candidate
 > preparation. As a result, rc1 was canceled before a vote was called.
 >

 -
 To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

>>
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>



-- 
Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-24 Thread Reynold Xin
Hi all,

The R API documentation version error was reported in a separate thread.
I've built a release candidate (RC3) and will send out a new vote email in
a bit.

On Thu, Sep 22, 2016 at 11:01 PM, Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and passes
> if a majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc2 (04141ad49806a48afccc236b699827
> 997142bd57)
>
> This release candidate resolves 284 issues: https://s.apache.org/spark-2.
> 0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1199
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already
> present in 2.0.0, missing features, or bugs related to new features will
> not necessarily block this release.
>
> Q: What happened to 2.0.1 RC1?
> A: There was an issue with RC1 R documentation during release candidate
> preparation. As a result, rc1 was canceled before a vote was called.
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-24 Thread Sean Owen
The binary artifact that's published here is built with -Pmesos. The
'real' artifact from a process standpoint is the source release,
however. That's why we do (should) test the source release foremost.

I suppose individuals are invited to test with whatever configuration is
of interest to them, and so I enable just the flags I care to test.
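
For what it's worth, a minimal source-release check runs along these lines
(a sketch; I'm assuming the source tarball in the staging directory quoted
below is named spark-2.0.1.tgz):

    wget http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/spark-2.0.1.tgz{,.asc}
    # assumes pwendell's signing key (linked below) has already been imported
    gpg --verify spark-2.0.1.tgz.asc spark-2.0.1.tgz
    tar xzf spark-2.0.1.tgz && cd spark-2.0.1
    ./build/mvn -Pyarn -Phive -Phive-thriftserver -Phadoop-2.7 -DskipTests clean package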

On Sat, Sep 24, 2016 at 10:19 PM, Jacek Laskowski  wrote:
> Hi,
>
> I keep asking myself why you guys are not including -Pmesos in your
> builds. Is this on purpose, or has it been overlooked?
>
> Pozdrawiam,
> Jacek Laskowski
> 
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Sep 24, 2016 at 9:25 PM, Dongjoon Hyun  wrote:
>> +1 (non binding)
>>
>> I compiled and tested on the following two systems.
>>
>> - CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
>> -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
>> - CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
>> -Phive -Phive-thriftserver
>>
>> Bests,
>> Dongjoon.
>>
>>
>> On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski  wrote:
>>>
>>> Hi,
>>>
>>> Not that it could fix the issue but no -Pmesos?
>>>
>>> Jacek
>>>
>>>
>>> On 24 Sep 2016 12:08 a.m., "Sean Owen"  wrote:

 +1 Signatures and hashes check out. I checked that the Kinesis
 assembly artifacts are not present.

 I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
 -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
 problem. This test never completed. If nobody else sees it, +1,
 assuming it's a bad test or env issue.

 - should clone and clean line object in ClosureCleaner *** FAILED ***
   isContain was true Interpreter output contained 'Exception':
   Welcome to
   __
/ __/__  ___ _/ /__
   _\ \/ _ \/ _ `/ __/  '_/
  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
 /_/

   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
   Type in expressions to have them evaluated.
   Type :help for more information.

   scala> // Entering paste mode (ctrl-D to finish)


   // Exiting paste mode, now interpreting.

   org.apache.spark.SparkException: Job 0 cancelled because
 SparkContext was shut down
 at
 org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
 ...


 On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
 > Please vote on releasing the following candidate as Apache Spark
 > version
 > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
 > passes
 > if a majority of at least 3+1 PMC votes are cast.
 >
 > [ ] +1 Release this package as Apache Spark 2.0.1
 > [ ] -1 Do not release this package because ...
 >
 >
 > The tag to be voted on is v2.0.1-rc2
 > (04141ad49806a48afccc236b699827997142bd57)
 >
 > This release candidate resolves 284 issues:
 > https://s.apache.org/spark-2.0.1-jira
 >
 > The release files, including signatures, digests, etc. can be found at:
 > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
 >
 > Release artifacts are signed with the following key:
 > https://people.apache.org/keys/committer/pwendell.asc
 >
 > The staging repository for this release can be found at:
 > https://repository.apache.org/content/repositories/orgapachespark-1199
 >
 > The documentation corresponding to this release can be found at:
 > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
 >
 >
 > Q: How can I help test this release?
 > A: If you are a Spark user, you can help us test this release by taking
 > an
 > existing Spark workload and running on this release candidate, then
 > reporting any regressions from 2.0.0.
 >
 > Q: What justifies a -1 vote for this release?
 > A: This is a maintenance release in the 2.0.x series.  Bugs already
 > present
 > in 2.0.0, missing features, or bugs related to new features will not
 > necessarily block this release.
 >
 > Q: What happened to 2.0.1 RC1?
 > A: There was an issue with RC1 R documentation during release candidate
 > preparation. As a result, rc1 was canceled before a vote was called.
 >

 -
 To unsubscribe e-mail: dev-unsubscr...@spark.apache.org

>>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-24 Thread Jacek Laskowski
Hi,

I keep asking myself why you guys are not including -Pmesos in your
builds. Is this on purpose, or has it been overlooked?

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sat, Sep 24, 2016 at 9:25 PM, Dongjoon Hyun  wrote:
> +1 (non binding)
>
> I compiled and tested on the following two systems.
>
> - CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
> -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
> - CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
> -Phive -Phive-thriftserver
>
> Bests,
> Dongjoon.
>
>
> On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski  wrote:
>>
>> Hi,
>>
>> Not that it could fix the issue but no -Pmesos?
>>
>> Jacek
>>
>>
>> On 24 Sep 2016 12:08 a.m., "Sean Owen"  wrote:
>>>
>>> +1 Signatures and hashes check out. I checked that the Kinesis
>>> assembly artifacts are not present.
>>>
>>> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
>>> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
>>> problem. This test never completed. If nobody else sees it, +1,
>>> assuming it's a bad test or env issue.
>>>
>>> - should clone and clean line object in ClosureCleaner *** FAILED ***
>>>   isContain was true Interpreter output contained 'Exception':
>>>   Welcome to
>>>   __
>>>/ __/__  ___ _/ /__
>>>   _\ \/ _ \/ _ `/ __/  '_/
>>>  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
>>> /_/
>>>
>>>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>>>   Type in expressions to have them evaluated.
>>>   Type :help for more information.
>>>
>>>   scala> // Entering paste mode (ctrl-D to finish)
>>>
>>>
>>>   // Exiting paste mode, now interpreting.
>>>
>>>   org.apache.spark.SparkException: Job 0 cancelled because
>>> SparkContext was shut down
>>> at
>>> org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
>>> ...
>>>
>>>
>>> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
>>> > Please vote on releasing the following candidate as Apache Spark
>>> > version
>>> > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
>>> > passes
>>> > if a majority of at least 3+1 PMC votes are cast.
>>> >
>>> > [ ] +1 Release this package as Apache Spark 2.0.1
>>> > [ ] -1 Do not release this package because ...
>>> >
>>> >
>>> > The tag to be voted on is v2.0.1-rc2
>>> > (04141ad49806a48afccc236b699827997142bd57)
>>> >
>>> > This release candidate resolves 284 issues:
>>> > https://s.apache.org/spark-2.0.1-jira
>>> >
>>> > The release files, including signatures, digests, etc. can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
>>> >
>>> > Release artifacts are signed with the following key:
>>> > https://people.apache.org/keys/committer/pwendell.asc
>>> >
>>> > The staging repository for this release can be found at:
>>> > https://repository.apache.org/content/repositories/orgapachespark-1199
>>> >
>>> > The documentation corresponding to this release can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
>>> >
>>> >
>>> > Q: How can I help test this release?
>>> > A: If you are a Spark user, you can help us test this release by taking
>>> > an
>>> > existing Spark workload and running on this release candidate, then
>>> > reporting any regressions from 2.0.0.
>>> >
>>> > Q: What justifies a -1 vote for this release?
>>> > A: This is a maintenance release in the 2.0.x series.  Bugs already
>>> > present
>>> > in 2.0.0, missing features, or bugs related to new features will not
>>> > necessarily block this release.
>>> >
>>> > Q: What happened to 2.0.1 RC1?
>>> > A: There was an issue with RC1 R documentation during release candidate
>>> > preparation. As a result, rc1 was canceled before a vote was called.
>>> >
>>>
>>> -
>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-24 Thread Dongjoon Hyun
+1 (non binding)

I compiled and tested on the following two systems.

- CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
-Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr
- CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
-Phive -Phive-thriftserver
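
Spelled out, the first of those amounts to roughly the following command
(a sketch, using the build/mvn wrapper from the source tree):

    ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Dsparkr clean package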

Bests,
Dongjoon.


On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski  wrote:

> Hi,
>
> Not that it could fix the issue but no -Pmesos?
>
> Jacek
>
> On 24 Sep 2016 12:08 a.m., "Sean Owen"  wrote:
>
>> +1 Signatures and hashes check out. I checked that the Kinesis
>> assembly artifacts are not present.
>>
>> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
>> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
>> problem. This test never completed. If nobody else sees it, +1,
>> assuming it's a bad test or env issue.
>>
>> - should clone and clean line object in ClosureCleaner *** FAILED ***
>>   isContain was true Interpreter output contained 'Exception':
>>   Welcome to
>>   __
>>/ __/__  ___ _/ /__
>>   _\ \/ _ \/ _ `/ __/  '_/
>>  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
>> /_/
>>
>>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>>   Type in expressions to have them evaluated.
>>   Type :help for more information.
>>
>>   scala> // Entering paste mode (ctrl-D to finish)
>>
>>
>>   // Exiting paste mode, now interpreting.
>>
>>   org.apache.spark.SparkException: Job 0 cancelled because
>> SparkContext was shut down
>> at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfte
>> rSchedulerStop$1.apply(DAGScheduler.scala:818)
>> ...
>>
>>
>> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
>> > Please vote on releasing the following candidate as Apache Spark version
>> > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
>> passes
>> > if a majority of at least 3+1 PMC votes are cast.
>> >
>> > [ ] +1 Release this package as Apache Spark 2.0.1
>> > [ ] -1 Do not release this package because ...
>> >
>> >
>> > The tag to be voted on is v2.0.1-rc2
>> > (04141ad49806a48afccc236b699827997142bd57)
>> >
>> > This release candidate resolves 284 issues:
>> > https://s.apache.org/spark-2.0.1-jira
>> >
>> > The release files, including signatures, digests, etc. can be found at:
>> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
>> >
>> > Release artifacts are signed with the following key:
>> > https://people.apache.org/keys/committer/pwendell.asc
>> >
>> > The staging repository for this release can be found at:
>> > https://repository.apache.org/content/repositories/orgapachespark-1199
>> >
>> > The documentation corresponding to this release can be found at:
>> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
>> >
>> >
>> > Q: How can I help test this release?
>> > A: If you are a Spark user, you can help us test this release by taking
>> an
>> > existing Spark workload and running on this release candidate, then
>> > reporting any regressions from 2.0.0.
>> >
>> > Q: What justifies a -1 vote for this release?
>> > A: This is a maintenance release in the 2.0.x series.  Bugs already
>> present
>> > in 2.0.0, missing features, or bugs related to new features will not
>> > necessarily block this release.
>> >
>> > Q: What happened to 2.0.1 RC1?
>> > A: There was an issue with RC1 R documentation during release candidate
>> > preparation. As a result, rc1 was canceled before a vote was called.
>> >
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-23 Thread vaquar khan
+1 non binding
No issues found.
Regards,
Vaquar khan

On 23 Sep 2016 17:25, "Mark Hamstra"  wrote:

Similar but not identical configuration (Java 8 / macOS 10.12 with build/mvn
-Phive -Phive-thriftserver -Phadoop-2.7 -Pyarn clean install);
Similar but not identical failure:

...

- line wrapper only initialized once when used as encoder outer scope

Spark context available as 'sc' (master = local-cluster[1,1,1024], app id =
app-20160923150640-).

Spark session available as 'spark'.

Exception in thread "dispatcher-event-loop-1" java.lang.OutOfMemoryError:
GC overhead limit exceeded

Exception in thread "dispatcher-event-loop-7" java.lang.OutOfMemoryError:
GC overhead limit exceeded

- define case class and create Dataset together with paste mode

java.lang.OutOfMemoryError: GC overhead limit exceeded

- should clone and clean line object in ClosureCleaner *** FAILED ***

  java.util.concurrent.TimeoutException: Futures timed out after [10
minutes]

...


On Fri, Sep 23, 2016 at 3:08 PM, Sean Owen  wrote:

> +1 Signatures and hashes check out. I checked that the Kinesis
> assembly artifacts are not present.
>
> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
> problem. This test never completed. If nobody else sees it, +1,
> assuming it's a bad test or env issue.
>
> - should clone and clean line object in ClosureCleaner *** FAILED ***
>   isContain was true Interpreter output contained 'Exception':
>   Welcome to
>   __
>/ __/__  ___ _/ /__
>   _\ \/ _ \/ _ `/ __/  '_/
>  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
> /_/
>
>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>   Type in expressions to have them evaluated.
>   Type :help for more information.
>
>   scala> // Entering paste mode (ctrl-D to finish)
>
>
>   // Exiting paste mode, now interpreting.
>
>   org.apache.spark.SparkException: Job 0 cancelled because
> SparkContext was shut down
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfte
> rSchedulerStop$1.apply(DAGScheduler.scala:818)
> ...
>
>
> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
> passes
> > if a majority of at least 3+1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 2.0.1
> > [ ] -1 Do not release this package because ...
> >
> >
> > The tag to be voted on is v2.0.1-rc2
> > (04141ad49806a48afccc236b699827997142bd57)
> >
> > This release candidate resolves 284 issues:
> > https://s.apache.org/spark-2.0.1-jira
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1199
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
> >
> >
> > Q: How can I help test this release?
> > A: If you are a Spark user, you can help us test this release by taking
> an
> > existing Spark workload and running on this release candidate, then
> > reporting any regressions from 2.0.0.
> >
> > Q: What justifies a -1 vote for this release?
> > A: This is a maintenance release in the 2.0.x series.  Bugs already
> present
> > in 2.0.0, missing features, or bugs related to new features will not
> > necessarily block this release.
> >
> > Q: What happened to 2.0.1 RC1?
> > A: There was an issue with RC1 R documentation during release candidate
> > preparation. As a result, rc1 was canceled before a vote was called.
> >
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-23 Thread Mark Hamstra
Similar but not identical configuration (Java 8 / macOS 10.12 with build/mvn
-Phive -Phive-thriftserver -Phadoop-2.7 -Pyarn clean install);
Similar but not identical failure:

...

- line wrapper only initialized once when used as encoder outer scope

Spark context available as 'sc' (master = local-cluster[1,1,1024], app id =
app-20160923150640-).

Spark session available as 'spark'.

Exception in thread "dispatcher-event-loop-1" java.lang.OutOfMemoryError:
GC overhead limit exceeded

Exception in thread "dispatcher-event-loop-7" java.lang.OutOfMemoryError:
GC overhead limit exceeded

- define case class and create Dataset together with paste mode

java.lang.OutOfMemoryError: GC overhead limit exceeded

- should clone and clean line object in ClosureCleaner *** FAILED ***

  java.util.concurrent.TimeoutException: Futures timed out after [10
minutes]

...
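
For what it's worth, GC-overhead errors like the above often just mean the
JVM running the tests is short on memory; the building docs suggest raising
the Maven heap before running, e.g. (a sketch):

    export MAVEN_OPTS="-Xmx2g -XX:ReservedCodeCacheSize=512m"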


On Fri, Sep 23, 2016 at 3:08 PM, Sean Owen  wrote:

> +1 Signatures and hashes check out. I checked that the Kinesis
> assembly artifacts are not present.
>
> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
> problem. This test never completed. If nobody else sees it, +1,
> assuming it's a bad test or env issue.
>
> - should clone and clean line object in ClosureCleaner *** FAILED ***
>   isContain was true Interpreter output contained 'Exception':
>   Welcome to
>   __
>/ __/__  ___ _/ /__
>   _\ \/ _ \/ _ `/ __/  '_/
>  /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
> /_/
>
>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>   Type in expressions to have them evaluated.
>   Type :help for more information.
>
>   scala> // Entering paste mode (ctrl-D to finish)
>
>
>   // Exiting paste mode, now interpreting.
>
>   org.apache.spark.SparkException: Job 0 cancelled because
> SparkContext was shut down
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$
> cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
> ...
>
>
> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin  wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> > 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and
> passes
> > if a majority of at least 3+1 PMC votes are cast.
> >
> > [ ] +1 Release this package as Apache Spark 2.0.1
> > [ ] -1 Do not release this package because ...
> >
> >
> > The tag to be voted on is v2.0.1-rc2
> > (04141ad49806a48afccc236b699827997142bd57)
> >
> > This release candidate resolves 284 issues:
> > https://s.apache.org/spark-2.0.1-jira
> >
> > The release files, including signatures, digests, etc. can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
> >
> > Release artifacts are signed with the following key:
> > https://people.apache.org/keys/committer/pwendell.asc
> >
> > The staging repository for this release can be found at:
> > https://repository.apache.org/content/repositories/orgapachespark-1199
> >
> > The documentation corresponding to this release can be found at:
> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
> >
> >
> > Q: How can I help test this release?
> > A: If you are a Spark user, you can help us test this release by taking
> an
> > existing Spark workload and running on this release candidate, then
> > reporting any regressions from 2.0.0.
> >
> > Q: What justifies a -1 vote for this release?
> > A: This is a maintenance release in the 2.0.x series.  Bugs already
> present
> > in 2.0.0, missing features, or bugs related to new features will not
> > necessarily block this release.
> >
> > Q: What happened to 2.0.1 RC1?
> > A: There was an issue with RC1 R documentation during release candidate
> > preparation. As a result, rc1 was canceled before a vote was called.
> >
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-23 Thread Ricardo Almeida
+1 (non-binding)

Build:
OK, but I can no longer use the "--tgz" option when
calling make-distribution.sh (maybe a problem on my side?)
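
For reference, the invocation I would expect to work is the one from the
building docs (a sketch; the --name label is arbitrary):

    ./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn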

Run:
No regressions from 2.0.0 detected. Tested our pipelines on a standalone
cluster (Python API)



On 23 September 2016 at 08:01, Reynold Xin  wrote:

> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and passes
> if a majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc2 (04141ad49806a48afccc236b69982
> 7997142bd57)
>
> This release candidate resolves 284 issues: https://s.apache.org/spark-2.0
> .1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1199
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already
> present in 2.0.0, missing features, or bugs related to new features will
> not necessarily block this release.
>
> Q: What happened to 2.0.1 RC1?
> A: There was an issue with RC1 R documentation during release candidate
> preparation. As a result, rc1 was canceled before a vote was called.
>
>


Re: [VOTE] Release Apache Spark 2.0.1 (RC2)

2016-09-23 Thread Jacek Laskowski
+1

Pozdrawiam,
Jacek Laskowski

https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Sep 23, 2016 at 8:01 AM, Reynold Xin  wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59 PDT and passes
> if a majority of at least 3+1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 2.0.1
> [ ] -1 Do not release this package because ...
>
>
> The tag to be voted on is v2.0.1-rc2
> (04141ad49806a48afccc236b699827997142bd57)
>
> This release candidate resolves 284 issues:
> https://s.apache.org/spark-2.0.1-jira
>
> The release files, including signatures, digests, etc. can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
>
> Release artifacts are signed with the following key:
> https://people.apache.org/keys/committer/pwendell.asc
>
> The staging repository for this release can be found at:
> https://repository.apache.org/content/repositories/orgapachespark-1199
>
> The documentation corresponding to this release can be found at:
> http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
>
>
> Q: How can I help test this release?
> A: If you are a Spark user, you can help us test this release by taking an
> existing Spark workload and running on this release candidate, then
> reporting any regressions from 2.0.0.
>
> Q: What justifies a -1 vote for this release?
> A: This is a maintenance release in the 2.0.x series.  Bugs already present
> in 2.0.0, missing features, or bugs related to new features will not
> necessarily block this release.
>
> Q: What happened to 2.0.1 RC1?
> A: There was an issue with RC1 R documentation during release candidate
> preparation. As a result, rc1 was canceled before a vote was called.
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org