Yes.
That could be the cause.
On Sun, Jan 18, 2015 at 11:47 AM, Sean Owen wrote:
Oh: are you running the tests with a different profile setting than
what the last assembly was built with? this particular test depends on
those matching. Not 100% sure that's the problem, but a good guess.
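One way to keep the profiles lined up is to factor the flags into a single variable used for both the assembly build and the test run. A minimal sketch, assuming the hadoop-2.4/yarn/hive profiles used elsewhere in this thread (the mvn invocations are shown as comments so the snippet stands alone):

```shell
# Use one set of profile flags for both the assembly build and the test run,
# so the assembly the test launches matches the profiles under test.
PROFILES="-Phadoop-2.4 -Pyarn -Phive"
# mvn $PROFILES -DskipTests package        # rebuild the assembly first
# mvn $PROFILES -Dtest=JavaAPISuite test   # then run the suite with the same flags
echo "$PROFILES"
```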
On Sat, Jan 17, 2015 at 4:54 PM, Ted Yu wrote:
The test passed here:
https://amplab.cs.berkeley.edu/jenkins/view/Spark/job/Spark-Master-Maven-with-YARN/HADOOP_PROFILE=hadoop-2.4,label=centos/1215/consoleFull
It passed locally with the following command:
mvn -DHADOOP_PROFILE=hadoop-2.4 -Phadoop-2.4 -Pyarn -Phive test
-Dtest=JavaAPISuite
FYI
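For what it's worth, Surefire also accepts a "Class#method" selector via -Dtest, which can narrow the run to just the failing method reported later in this thread (testGuavaOptional). A sketch, assuming maven-surefire-plugin 2.7.3 or later (the mvn line is a comment so the snippet stands alone):

```shell
# Narrow the run to a single method with Surefire's "Class#method" selector
# (supported for JUnit 4 tests in maven-surefire-plugin 2.7.3+).
TEST_SELECTOR="JavaAPISuite#testGuavaOptional"
# mvn -Phadoop-2.4 -Pyarn -Phive -Dtest="$TEST_SELECTOR" test
echo "$TEST_SELECTOR"
```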
Failing for me and another team member on the command line, for what it's worth.
Hm, this test hangs for me in IntelliJ. It could be a real problem,
and a combination of a) just recently actually enabling Java tests, b)
recent updates to the complicated Guava shading situation.
The manifestation of the error usually suggests that something totally
failed to start (because of,
I tried the following but still didn't see test output :-(
diff --git a/pom.xml b/pom.xml
index f4466e5..dae2ae8 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1131,6 +1131,7 @@
true
false
+true
On Fri, Jan 16, 2015 at 12:41 PM, Ted
I got the same error:
testGuavaOptional(org.apache.spark.JavaAPISuite)  Time elapsed: 261.111 sec  <<< ERROR!
org.apache.spark.SparkException: Job aborted due to stage failure: Master removed our application: FAILED
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGSchedule
Thanks Sean
On Fri, Jan 16, 2015 at 12:06 PM, Sean Owen wrote:
Thanks Ted, got farther along but now have a failing test; is this a known
issue?
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.spark.JavaAPISuite
Tests run: 72, Failures: 0, Errors: 1, Skipped: 0, Time
Can you try doing this before running mvn?
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
What OS are you using?
Cheers
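A quick way to confirm the options took effect before invoking mvn — the mvn launcher reads MAVEN_OPTS from the environment of the current shell:

```shell
# Export the JVM options and confirm they are visible in this shell;
# mvn picks up MAVEN_OPTS from the environment at launch.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```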
On Fri, Jan 16, 2015 at 12:03 PM, Andrew Musselman <andrew.mussel...@gmail.com> wrote:
Hey Andrew, you'll want to have a look at the Spark docs on building:
http://spark.apache.org/docs/latest/building-spark.html
It's the first thing covered there.
The warnings are normal as you are probably building with newer Hadoop
profiles and so old-Hadoop support code shows deprecation warnings.
Just got the latest from Github and tried running `mvn test`; is this error
common and do you have any advice on fixing it?
Thanks!
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @
spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal
inc