Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/3864#issuecomment-68505709
  
    That `ContextCleanerSuite` test fails locally on my machine, too:
    
    ```
    [info] ContextCleanerSuite:
    [info] - cleanup RDD
    [info] - cleanup shuffle
    [info] - cleanup broadcast
    [info] - automatically cleanup RDD
    [info] - automatically cleanup shuffle
    [info] - automatically cleanup broadcast
    [info] - automatically cleanup RDD + shuffle + broadcast
    [info] - automatically cleanup RDD + shuffle + broadcast in distributed mode *** FAILED ***
    [info]   org.apache.spark.SparkException: Job aborted due to stage failure: Master removed our application: FAILED
    [info]   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1054)
    [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1038)
    [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1036)
    [info]   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    [info]   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    [info]   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1036)
    [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
    [info]   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:635)
    [info]   at scala.Option.foreach(Option.scala:236)
    [info]   at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:635)
    [info]   ...
    [info] ScalaTest
    ```
    
    It turns out that this test uses `local-cluster` mode, and it looks like it's failing because executors can't be launched.
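    For context, a `local-cluster[N, C, M]` master URL spins up real Worker and Executor processes (N workers, C cores and M megabytes each) instead of running everything inside one JVM, which is why executor-launch problems only surface in this one test. A minimal sketch of what the URL encodes (the regex here is illustrative, not Spark's actual parser):

    ```scala
    // Illustrative parser for a `local-cluster[N, C, M]` master URL.
    // Spark's own parsing lives in SparkContext; this object and regex are
    // assumptions used only to show what the three fields mean.
    object LocalClusterUrl {
      private val Pattern = """local-cluster\[\s*(\d+)\s*,\s*(\d+)\s*,\s*(\d+)\s*\]""".r

      /** Returns (numWorkers, coresPerWorker, memoryPerWorkerMB) on a match. */
      def parse(master: String): Option[(Int, Int, Int)] = master match {
        case Pattern(n, c, m) => Some((n.toInt, c.toInt, m.toInt))
        case _                => None
      }

      def main(args: Array[String]): Unit = {
        // Two workers with 1 core and 512 MB each:
        println(parse("local-cluster[2, 1, 512]"))
      }
    }
    ```

    Because the workers are separate processes, they must fork executor JVMs via `compute-classpath.sh`, which is exactly the step failing below.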
    
    Here's a log from Jenkins:
    
    ```
    14/12/31 18:23:31.752 ERROR ExecutorRunner: Error running executor
    java.io.IOException: Cannot run program "/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
        at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:759)
        at org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:72)
        at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:37)
        at org.apache.spark.deploy.worker.ExecutorRunner.getCommandSeq(ExecutorRunner.scala:110)
        at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:125)
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:58)
    Caused by: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
        at java.lang.ProcessImpl.start(ProcessImpl.java:130)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
        ... 6 more
    14/12/31 18:23:31.753 INFO Worker: Asked to launch executor app-20141231182331-0000/1 for ContextCleanerSuite
    14/12/31 18:23:31.756 INFO Master: Removing executor app-20141231182331-0000/0 because it is FAILED
    14/12/31 18:23:31.756 INFO Worker: Executor app-20141231182331-0000/0 finished with state FAILED message java.io.IOException: Cannot run program "/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory
    14/12/31 18:23:31.757 INFO Master: Launching executor app-20141231182331-0000/2 on worker worker-20141231182331-localhost-51737
    14/12/31 18:23:31.757 ERROR ExecutorRunner: Error running executor
    java.io.IOException: Cannot run program "/home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
        at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:759)
        at org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:72)
        at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:37)
        at org.apache.spark.deploy.worker.ExecutorRunner.getCommandSeq(ExecutorRunner.scala:110)
        at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:125)
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:58)
    Caused by: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
        at java.lang.ProcessImpl.start(ProcessImpl.java:130)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
        ... 6 more
    ```
    
    In this case, it looks like the path to `compute-classpath.sh` is wrong, since it includes `core`:
    
    ```
    /home/jenkins/workspace/SparkPullRequestBuilder/core/./bin/compute-classpath.sh
    ```
    
    This suggests that `SPARK_HOME` may not be set properly for these tests:

    https://github.com/apache/spark/blob/4fcfd502c475b9127e8b5e44e60c9ae2af53f8a4/core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala#L72
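
    To make the failure mode concrete, here is a hypothetical helper mirroring how the worker resolves the script relative to `sparkHome` (`ClasspathScriptPath` and `resolve` are made-up names; the real resolution happens in `CommandUtils.buildJavaOpts`, linked above):

    ```scala
    import java.io.File

    // Hypothetical sketch: the worker effectively builds
    // sparkHome + "/./bin/compute-classpath.sh". If sparkHome falls back to
    // the working directory of the `core/` module rather than the repo root,
    // the resulting path points inside core/, the script does not exist, and
    // ProcessBuilder.start() throws IOException error=2.
    object ClasspathScriptPath {
      def resolve(sparkHome: String): String =
        new File(sparkHome, "./bin/compute-classpath.sh").getPath
    }
    ```

    Passing the module directory from the Jenkins log, `resolve("/home/jenkins/workspace/SparkPullRequestBuilder/core")`, reproduces the exact broken path in the error above, whereas the repo root would yield a path that exists.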
    
    We hit this same error in #3850, which suggests that the SBT build for `branch-1.0` may be broken.

