There is no "mesos" profile in 2.0.1.
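
A quick way to confirm, from a source checkout of the v2.0.1-rc2 tag and
assuming the bundled build/mvn wrapper (help:all-profiles lists every
profile the Maven build defines):

  # list all Maven profiles and check for a "mesos" entry
  ./build/mvn help:all-profiles | grep -i mesos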

On Sat, Sep 24, 2016 at 2:19 PM, Jacek Laskowski <ja...@japila.pl> wrote:
> Hi,
>
> I keep asking myself why you guys are not including -Pmesos in your
> builds. Is this on purpose, or have you overlooked it?
>
> Pozdrawiam,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Sep 24, 2016 at 9:25 PM, Dongjoon Hyun <dongj...@apache.org> wrote:
>> +1 (non binding)
>>
>> I compiled and tested on the following two systems.
>>
>> - CentOS 7.2 / Oracle JDK 1.8.0_77 / R 3.3.1 with -Pyarn -Phadoop-2.7
>> -Pkinesis-asl -Phive -Phive-thriftserver -Psparkr
>> - CentOS 7.2 / Open JDK 1.8.0_102 with -Pyarn -Phadoop-2.7 -Pkinesis-asl
>> -Phive -Phive-thriftserver
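>>
>> For anyone reproducing, a sketch of the first configuration above
>> (using the bundled Maven wrapper; add -DskipTests to build without
>> running the tests):
>>
>>   ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive \
>>     -Phive-thriftserver -Psparkr clean package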
>>
>> Bests,
>> Dongjoon.
>>
>>
>> On Fri, Sep 23, 2016 at 3:32 PM, Jacek Laskowski <ja...@japila.pl> wrote:
>>>
>>> Hi,
>>>
>>> Not that it would fix the issue, but no -Pmesos?
>>>
>>> Jacek
>>>
>>>
>>> On 24 Sep 2016 12:08 a.m., "Sean Owen" <so...@cloudera.com> wrote:
>>>>
>>>> +1 Signatures and hashes check out. I checked that the Kinesis
>>>> assembly artifacts are not present.
>>>>
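>>>> For reference, a minimal verification sketch (assuming GnuPG, the
>>>> signing key from https://people.apache.org/keys/committer/pwendell.asc
>>>> saved as pwendell.asc, and the hadoop2.7 binary artifact as one example):
>>>>
>>>>   # import the release manager's key, then check the detached signature
>>>>   gpg --import pwendell.asc
>>>>   gpg --verify spark-2.0.1-bin-hadoop2.7.tgz.asc spark-2.0.1-bin-hadoop2.7.tgz
>>>>   # recompute the digest and compare with the published checksum file
>>>>   md5sum spark-2.0.1-bin-hadoop2.7.tgz
>>>>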
>>>> I compiled and tested on Java 8 / Ubuntu 16 with -Pyarn -Phive
>>>> -Phive-thriftserver -Phadoop-2.7 -Psparkr and only saw one test
>>>> problem. This test never completed. If nobody else sees it, +1,
>>>> assuming it's a bad test or env issue.
>>>>
>>>> - should clone and clean line object in ClosureCleaner *** FAILED ***
>>>>   isContain was true Interpreter output contained 'Exception':
>>>>   Welcome to
>>>>         ____              __
>>>>        / __/__  ___ _____/ /__
>>>>       _\ \/ _ \/ _ `/ __/  '_/
>>>>      /___/ .__/\_,_/_/ /_/\_\   version 2.0.1
>>>>         /_/
>>>>
>>>>   Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_91)
>>>>   Type in expressions to have them evaluated.
>>>>   Type :help for more information.
>>>>
>>>>   scala> // Entering paste mode (ctrl-D to finish)
>>>>
>>>>
>>>>   // Exiting paste mode, now interpreting.
>>>>
>>>>   org.apache.spark.SparkException: Job 0 cancelled because
>>>> SparkContext was shut down
>>>>     at
>>>> org.apache.spark.scheduler.DAGScheduler$$anonfun$cleanUpAfterSchedulerStop$1.apply(DAGScheduler.scala:818)
>>>> ...
>>>>
>>>>
>>>> On Fri, Sep 23, 2016 at 7:01 AM, Reynold Xin <r...@databricks.com> wrote:
>>>> > Please vote on releasing the following candidate as Apache Spark
>>>> > version 2.0.1. The vote is open until Sunday, Sep 25, 2016 at 23:59
>>>> > PDT and passes if a majority of at least 3 +1 PMC votes are cast.
>>>> >
>>>> > [ ] +1 Release this package as Apache Spark 2.0.1
>>>> > [ ] -1 Do not release this package because ...
>>>> >
>>>> >
>>>> > The tag to be voted on is v2.0.1-rc2
>>>> > (04141ad49806a48afccc236b699827997142bd57)
>>>> >
>>>> > This release candidate resolves 284 issues:
>>>> > https://s.apache.org/spark-2.0.1-jira
>>>> >
>>>> > The release files, including signatures, digests, etc. can be found at:
>>>> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-bin/
>>>> >
>>>> > Release artifacts are signed with the following key:
>>>> > https://people.apache.org/keys/committer/pwendell.asc
>>>> >
>>>> > The staging repository for this release can be found at:
>>>> > https://repository.apache.org/content/repositories/orgapachespark-1199
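>>>> >
>>>> > A quick sanity check that the staged artifacts resolve, assuming the
>>>> > standard Maven repository layout (the exact path below is an
>>>> > illustration):
>>>> >
>>>> >   # HEAD request; -f makes curl fail on a non-2xx response
>>>> >   curl -fI https://repository.apache.org/content/repositories/orgapachespark-1199/org/apache/spark/spark-core_2.11/2.0.1/spark-core_2.11-2.0.1.pom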
>>>> >
>>>> > The documentation corresponding to this release can be found at:
>>>> > http://people.apache.org/~pwendell/spark-releases/spark-2.0.1-rc2-docs/
>>>> >
>>>> >
>>>> > Q: How can I help test this release?
>>>> > A: If you are a Spark user, you can help us test this release by
>>>> > taking an existing Spark workload and running it on this release
>>>> > candidate, then reporting any regressions from 2.0.0.
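>>>> >
>>>> > For example, a minimal smoke test (archive and jar names assume the
>>>> > hadoop2.7 binary package from the URL above):
>>>> >
>>>> >   tar xzf spark-2.0.1-bin-hadoop2.7.tgz
>>>> >   cd spark-2.0.1-bin-hadoop2.7
>>>> >   # run a bundled example job on a local master as a basic check
>>>> >   ./bin/spark-submit --master 'local[4]' \
>>>> >     --class org.apache.spark.examples.SparkPi \
>>>> >     examples/jars/spark-examples_2.11-2.0.1.jar 100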
>>>> >
>>>> > Q: What justifies a -1 vote for this release?
>>>> > A: This is a maintenance release in the 2.0.x series. Bugs already
>>>> > present in 2.0.0, missing features, or bugs related to new features
>>>> > will not necessarily block this release.
>>>> >
>>>> > Q: What happened to 2.0.1 RC1?
>>>> > A: There was an issue with the R documentation in RC1 during release
>>>> > candidate preparation. As a result, RC1 was canceled before a vote
>>>> > was called.
>>>> >
>>>>
>>
>



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
