Hi,
I am trying to set up a Spark development environment. I forked the Spark Git
project and cloned my fork. I then checked out branch-2.0 (which is a branch
rather than a tag; I assume it contains the released source code).
I then compiled Spark twice.
The first build used:
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
This compiled successfully.
The second used:
mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 clean package
This failed in Spark Project Core with the following test failure:
- caching in memory and disk, replicated
- caching in memory and disk, serialized, replicated *** FAILED ***
  java.util.concurrent.TimeoutException: Can't find 2 executors before 30000 milliseconds elapsed
  at org.apache.spark.ui.jobs.JobProgressListener.waitUntilExecutorsUp(JobProgressListener.scala:573)
  at org.apache.spark.DistributedSuite.org$apache$spark$DistributedSuite$$testCaching(DistributedSuite.scala:154)
  at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply$mcV$sp(DistributedSuite.scala:191)
  at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
  at org.apache.spark.DistributedSuite$$anonfun$32$$anonfun$apply$1.apply(DistributedSuite.scala:191)
  at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
  at org.scalatest.Transformer.apply(Transformer.scala:22)
  at org.scalatest.Transformer.apply(Transformer.scala:20)
  ...
- compute without caching when no partitions fit in memory

I made no changes to the code whatsoever. Can anyone help me figure out what is
wrong with my environment?
BTW, I am using Maven 3.3.9 and Java 1.8.0_101-b13.
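If there is a way to re-run just the failing suite, I'd be happy to try that to
narrow things down. From what I can tell in the Spark developer docs (I may be
misreading the scalatest-maven-plugin options), something like this should
target only DistributedSuite in the core module, with the same profiles as my
full build:

```shell
# Re-run only DistributedSuite in the core module, using the same profiles.
# (-Dtest=none skips the surefire/Java tests; -DwildcardSuites selects the
# Scala suite via the scalatest-maven-plugin -- I'm assuming these options
# behave as described in the Spark developer docs.)
./build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 \
  -pl core -Dtest=none -DwildcardSuites=org.apache.spark.DistributedSuite test
```

Is that the right way to isolate a single suite? Given that the timeout is
about finding 2 executors, could this just be a resource limit on my machine?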

Thanks,
                Assaf

--
View this message in context: 
http://apache-spark-developers-list.1001551.n3.nabble.com/Test-fails-when-compiling-spark-with-tests-tp18919.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
