seekerak created SPARK-4360:
-------------------------------

             Summary: tasks only execute on one node when Spark runs on YARN
                 Key: SPARK-4360
                 URL: https://issues.apache.org/jira/browse/SPARK-4360
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.0.2
            Reporter: seekerak


Hadoop version: 2.0.3-alpha
Spark version: 1.0.2

When I run Spark jobs on YARN, I find that all of the tasks run on only one node. My cluster has 4 nodes and 3 executors are allocated, but only one executor receives tasks; the others get none. My command is:

/opt/hadoopcluster/spark-1.0.2-bin-hadoop2/bin/spark-submit --class 
org.sr.scala.Spark_LineCount_G0 --executor-memory 2G --num-executors 12 
--master yarn-cluster /home/Spark_G0.jar /data /output/ou_1

Does anyone know why?
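
For reference, here is a minimal sketch of what a line-count job such as Spark_LineCount_G0 might look like; the real source is not shown in this report, so the class body, the suspected cause, and the minPartitions value below are all assumptions, not the reporter's actual code. The point it illustrates is that the number of tasks in the first stage equals the number of input partitions, so if /data yields only one split, every task can land on a single executor regardless of --num-executors.

    import org.apache.spark.{SparkConf, SparkContext}

    // Hypothetical reconstruction of a line-count job; only an illustrative sketch.
    object Spark_LineCount_G0 {
      def main(args: Array[String]): Unit = {
        val Array(inputPath, outputPath) = args

        val conf = new SparkConf().setAppName("Spark_LineCount_G0")
        val sc = new SparkContext(conf)

        // Task count for this stage follows the partition count of the input RDD.
        // If the input is a single small or unsplittable file, textFile may produce
        // only one partition; passing an explicit minPartitions (value assumed here)
        // is one way to spread the work across executors.
        val lines = sc.textFile(inputPath, 12)

        val count = lines.count()
        sc.parallelize(Seq(s"line count: $count"), 1).saveAsTextFile(outputPath)

        sc.stop()
      }
    }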


