+1, tested with Hadoop 2.6 / YARN on CentOS 6.5 after building with
-Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 -Phive -Phive-thriftserver,
then ran a few SQL tests and the ML examples.
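For reference, that build can be reproduced with roughly the following (a sketch: the source directory name is assumed, and -DskipTests is added so packaging completes before running any manual checks):

```shell
# Build Spark with the profiles reported above.
# "spark-1.4.0" is an assumed name for the extracted source tree.
cd spark-1.4.0
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    -Phive -Phive-thriftserver \
    -DskipTests clean package
```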
On Fri, Jun 5, 2015 at 10:55 AM, Hari Shreedharan <hshreedha...@cloudera.com> wrote:
> +1. Build looks good, ran a couple of apps on YARN.
>
> Thanks,
> Hari
>
> On Fri, Jun 5, 2015 at 10:52 AM, Yin Huai <yh...@databricks.com> wrote:
>
>> Sean,
>>
>> Can you add "-Phive -Phive-thriftserver" and try those Hive tests?
>>
>> Thanks,
>>
>> Yin
>>
>> On Fri, Jun 5, 2015 at 5:19 AM, Sean Owen <so...@cloudera.com> wrote:
>>
>>> Everything checks out again, and the tests pass for me on Ubuntu +
>>> Java 7 with '-Pyarn -Phadoop-2.6', except that I always get
>>> SparkSubmitSuite errors like ...
>>>
>>> - success sanity check *** FAILED ***
>>> java.lang.RuntimeException: [download failed:
>>> org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
>>> commons-net#commons-net;3.1!commons-net.jar]
>>> at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:978)
>>> at org.apache.spark.sql.hive.client.IsolatedClientLoader$$anonfun$3.apply(IsolatedClientLoader.scala:62)
>>> ...
>>>
>>> I also can't get Hive tests to pass. Is anyone else seeing anything
>>> like this? If not, I'll assume this is something specific to the env --
>>> or that I don't have the build invocation just right. It's puzzling
>>> since it's so consistent, but I presume others' tests pass and Jenkins
>>> does.
>>>
>>> On Wed, Jun 3, 2015 at 5:53 AM, Patrick Wendell <pwend...@gmail.com> wrote:
>>> > Please vote on releasing the following candidate as Apache Spark version 1.4.0!
>>> >
>>> > The tag to be voted on is v1.4.0-rc4 (commit 22596c5):
>>> > https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=22596c534a38cfdda91aef18aa9037ab101e4251
>>> >
>>> > The release files, including signatures, digests, etc.,
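The SparkSubmitSuite failure above reports failed Ivy downloads. One possible remedy (an assumption on my part, not something confirmed in this thread) is that stale or corrupt entries in the local Ivy cache are blocking re-resolution, in which case removing the affected cache directories forces a fresh download on the next run:

```shell
# Remove possibly-corrupt cached artifacts so Ivy re-downloads them.
# Paths assume the default ~/.ivy2 cache location used by SparkSubmitUtils.
rm -rf ~/.ivy2/cache/org.jboss.netty
rm -rf ~/.ivy2/cache/commons-net
```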
>>> > can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-bin/
>>> >
>>> > Release artifacts are signed with the following key:
>>> > https://people.apache.org/keys/committer/pwendell.asc
>>> >
>>> > The staging repository for this release can be found at:
>>> > [published as version: 1.4.0]
>>> > https://repository.apache.org/content/repositories/orgapachespark-1111/
>>> > [published as version: 1.4.0-rc4]
>>> > https://repository.apache.org/content/repositories/orgapachespark-1112/
>>> >
>>> > The documentation corresponding to this release can be found at:
>>> > http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc4-docs/
>>> >
>>> > Please vote on releasing this package as Apache Spark 1.4.0!
>>> >
>>> > The vote is open until Saturday, June 06, at 05:00 UTC and passes
>>> > if a majority of at least 3 +1 PMC votes are cast.
>>> >
>>> > [ ] +1 Release this package as Apache Spark 1.4.0
>>> > [ ] -1 Do not release this package because ...
>>> >
>>> > To learn more about Apache Spark, please see
>>> > http://spark.apache.org/
>>> >
>>> > == What has changed since RC3 ==
>>> > In addition to many smaller fixes, three blocker issues were fixed:
>>> > 4940630 [SPARK-8020] [SQL] Spark SQL conf in spark-defaults.conf make
>>> > metadataHive get constructed too early
>>> > 6b0f615 [SPARK-8038] [SQL] [PYSPARK] fix Column.when() and otherwise()
>>> > 78a6723 [SPARK-7978] [SQL] [PYSPARK] DecimalType should not be singleton
>>> >
>>> > == How can I help test this release? ==
>>> > If you are a Spark user, you can help us test this release by
>>> > taking a Spark 1.3 workload and running it on this release candidate,
>>> > then reporting any regressions.
>>> >
>>> > == What justifies a -1 vote for this release? ==
>>> > This vote is happening towards the end of the 1.4 QA period,
>>> > so -1 votes should only occur for significant regressions from 1.3.1.
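One way to check the release artifacts against the signing key listed above is with gpg (a sketch: the tarball name below is an assumed example; use the actual file and its matching .asc from the rc4-bin directory):

```shell
# Import the signer's public key, then verify a downloaded artifact.
# "spark-1.4.0-bin-hadoop2.6.tgz" is an assumed example file name.
wget https://people.apache.org/keys/committer/pwendell.asc
gpg --import pwendell.asc
gpg --verify spark-1.4.0-bin-hadoop2.6.tgz.asc spark-1.4.0-bin-hadoop2.6.tgz
```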
>>> > Bugs already present in 1.3.X, minor regressions, or bugs related
>>> > to new features will not block this release.
>>> >
>>> > ---------------------------------------------------------------------
>>> > To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>>> > For additional commands, e-mail: dev-h...@spark.apache.org