Hi everyone,
I think there's a blocker on PySpark: the when function in Python seems
to be broken, but the Scala API seems fine.
Here's a snippet demonstrating that with Spark 1.4.0 RC3:
In [1]: df = sqlCtx.createDataFrame([(1, 1), (2, 2), (1, 2), (1,
2)], ["key", "value"])
In [2]: from
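The snippet above is cut off in the archive, but outside Spark the semantics that a when(...).otherwise(...) column expression computes are just per-row conditional selection. A plain-Python sketch over the same rows (no Spark needed; the "one"/"other" labels are purely illustrative, not from the original email):

```python
# Plain-Python analogue of a when(cond, x).otherwise(y) column expression,
# applied to the same (key, value) rows as in the snippet above.
rows = [(1, 1), (2, 2), (1, 2), (1, 2)]

def when_otherwise(condition, then_value, else_value):
    # pick a value per row, like when(...).otherwise(...) does per column row
    return then_value if condition else else_value

labels = [when_otherwise(key == 1, "one", "other") for key, _ in rows]
print(labels)  # → ['one', 'other', 'one', 'one']
```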
This vote is cancelled in favor of RC4.
Thanks everyone for the thorough testing of this RC. We are really
close, but a few blockers were found. I've cut a new RC to
incorporate fixes for those issues.
The following patches were merged during the RC3 testing period:
(blockers)
4940630 [SPARK-8020]
Date: Mon, Jun 1, 2015 07:34 AM
To: Krishna Sankar ksanka...@gmail.com
Cc: Patrick Wendell pwend...@gmail.com; dev@spark.apache.org
Subject: Re: [VOTE] Release Apache Spark 1.4.0 (RC3)
+1
+1 (non-binding)
Launched against a pseudo-distributed YARN cluster running Hadoop
2.6.0 and ran some jobs.
-Sandy
I get a bunch of failures in VersionSuite with build/test params
-Pyarn -Phive -Phadoop-2.6:
- success sanity check *** FAILED ***
java.lang.RuntimeException: [download failed:
org.jboss.netty#netty;3.2.2.Final!netty.jar(bundle), download failed:
commons-net#commons-net;3.1!commons-net.jar]
Hive Context works on RC3 for MapR after adding
spark.sql.hive.metastore.sharedPrefixes as suggested in SPARK-7819
(https://issues.apache.org/jira/browse/SPARK-7819). However, there still
seem to be some other issues with native libraries; I get the warning below:
WARN NativeCodeLoader: Unable to load
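For reference, the SPARK-7819 workaround mentioned above is a single entry in conf/spark-defaults.conf. The property name comes from the JIRA; the prefix values below are only an illustrative sketch, and the exact list depends on which metastore/client JARs your deployment shares:

```
# conf/spark-defaults.conf
# Class-name prefixes loaded by a classloader shared between Spark and the
# Hive metastore client (property per SPARK-7819; values illustrative only).
spark.sql.hive.metastore.sharedPrefixes  com.mysql.jdbc,org.postgresql,com.mapr
```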
Hey Bobby,
Those are generic warnings that the Hadoop libraries throw. If you are
using MapR-FS they shouldn't matter, since you are using the MapR client
and not the default Hadoop client.
Do you have any issues with functionality... or was it just seeing the
warnings that was the concern?
Hi Patrick,
Thanks for clarifying. No issues with functionality.
+1 (non-binding)
Thanks
Bobby
On Mon, Jun 1, 2015 at 9:41 PM, Patrick Wendell pwend...@gmail.com wrote:
Hey Bobby,
Those are generic warnings that the hadoop libraries throw. If you are
using MapRFS they
+1 (non-binding)
Launched against a pseudo-distributed YARN cluster running Hadoop 2.6.0 and
ran some jobs.
-Sandy
On Sat, May 30, 2015 at 3:44 PM, Krishna Sankar ksanka...@gmail.com wrote:
+1 (non-binding, of course)
1. Compiled on OS X 10.10 (Yosemite) OK. Total time: 17:07 min
mvn clean package -Pyarn -Dyarn.version=2.6.0 -Phadoop-2.4
-Dhadoop.version=2.6.0 -DskipTests
2. Tested pyspark, MLlib - ran them and compared results with 1.3.1
2.1. statistics
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
The tag to be voted on is v1.4.0-rc3 (commit dd109a8):
https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=dd109a8746ec07c7c83995890fc2c0cd7a693730
The release files, including signatures, digests, etc.
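Since the vote email points at release files with signatures and digests, the usual local check follows this pattern (the file name below is a placeholder, and the artifact is faked locally so the commands are self-contained; with a real release you would download the .tgz and its digest file instead):

```shell
# Fake a downloaded artifact so the check pattern is runnable as-is.
printf 'example artifact bytes' > spark-1.4.0-bin-hadoop2.6.tgz
sha512sum spark-1.4.0-bin-hadoop2.6.tgz > spark-1.4.0-bin-hadoop2.6.tgz.sha512

# Verify the digest recorded in the .sha512 file against the artifact;
# prints "<name>: OK" on success, and exits non-zero on a mismatch.
sha512sum -c spark-1.4.0-bin-hadoop2.6.tgz.sha512
```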
Mike,
The broken Configuration link can be fixed by adding the missing dash '-' on
the first line of docs/configuration.md and running 'jekyll build'.
https://github.com/apache/spark/pull/6513
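The "missing dash" presumably refers to the page's front-matter fence: Jekyll only treats a file as a page when its first line is exactly three dashes. A hypothetical reproduction on a scratch file (not the actual Spark docs file):

```shell
# Jekyll front matter must open with a line of exactly three dashes.
# Recreate the symptom with a scratch file (hypothetical content):
printf -- '--\nlayout: global\ntitle: Configuration\n---\nPage body.\n' > page.md

# First line has only two dashes, so Jekyll would skip this page:
head -n 1 page.md          # prints "--"

# Add the missing dash on the first line, as the fix above describes:
sed -i '1s/^--$/---/' page.md
head -n 1 page.md          # prints "---"
```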
On Fri, May 29, 2015 at 6:38 PM, Mike Ringenburg mik...@cray.com wrote:
The Configuration link on the docs appears to be broken.
Mike
On May 29, 2015, at 4:41 PM, Patrick Wendell pwend...@gmail.com wrote:
Please vote on releasing the following candidate as Apache Spark version 1.4.0!
The tag to be voted on is v1.4.0-rc3 (commit