[ 
https://issues.apache.org/jira/browse/SPARK-5754?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Inigo updated SPARK-5754:
-------------------------
    Description: 
I'm trying to run Spark Pi on a YARN cluster running on Windows, and the AM 
container fails to start. The problem seems to be in the generation of the YARN 
command, which adds single quotes (') around some of the Java options. In 
particular, the code adding those quotes is the escapeForShell function in 
YarnSparkHadoopUtil. Apparently, Windows does not accept the quotes for these 
options. Here is an example of the command that the container tries to execute:

@call %JAVA_HOME%/bin/java -server -Xmx512m -Djava.io.tmpdir=%PWD%/tmp 
'-Dspark.yarn.secondary.jars=' 
'-Dspark.app.name=org.apache.spark.examples.SparkPi' 
'-Dspark.master=yarn-cluster' org.apache.spark.deploy.yarn.ApplicationMaster 
--class 'org.apache.spark.examples.SparkPi' --jar  
'file:/D:/data/spark-1.1.1-bin-hadoop2.4/bin/../lib/spark-examples-1.1.1-hadoop2.4.0.jar'
  --executor-memory 1024 --executor-cores 1 --num-executors 2

Once I transform it into:

@call %JAVA_HOME%/bin/java -server -Xmx512m -Djava.io.tmpdir=%PWD%/tmp 
-Dspark.yarn.secondary.jars= -Dspark.app.name=org.apache.spark.examples.SparkPi 
-Dspark.master=yarn-cluster org.apache.spark.deploy.yarn.ApplicationMaster 
--class 'org.apache.spark.examples.SparkPi' --jar  
'file:/D:/data/spark-1.1.1-bin-hadoop2.4/bin/../lib/spark-examples-1.1.1-hadoop2.4.0.jar'
  --executor-memory 1024 --executor-cores 1 --num-executors 2

Everything seems to start.

How should I deal with this? Should I create a separate Windows counterpart of 
escapeForShell and call it whenever the target platform is Windows, or should I 
add a sanity check on the YARN side?
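
A platform-aware escape helper along those lines could look like the sketch below. To be clear, this is not Spark's actual escapeForShell implementation; the class and method names are hypothetical, and the Windows quoting rules shown are simplified. The idea is just that cmd.exe treats single quotes as literal characters, so Unix-style single-quoting must not be applied on Windows:

```java
// Hypothetical sketch of platform-aware argument escaping for a YARN
// container launch command. NOT Spark's actual escapeForShell; names
// and quoting rules are illustrative only.
public final class ShellEscape {

    // Unix-style: wrap in single quotes, escaping embedded single quotes.
    static String escapeForUnixShell(String arg) {
        StringBuilder sb = new StringBuilder("'");
        for (char c : arg.toCharArray()) {
            if (c == '\'') sb.append("'\\''"); else sb.append(c);
        }
        return sb.append("'").toString();
    }

    // Windows cmd.exe: single quotes are literal, so quote with double
    // quotes, and only when the argument actually needs quoting.
    static String escapeForWindowsShell(String arg) {
        if (arg.isEmpty() || arg.matches(".*[\\s\"&|<>^%].*")) {
            return "\"" + arg.replace("\"", "\\\"") + "\"";
        }
        return arg; // plain arguments pass through unquoted
    }

    static String escape(String arg, boolean isWindows) {
        return isWindows ? escapeForWindowsShell(arg) : escapeForUnixShell(arg);
    }

    public static void main(String[] args) {
        // On Windows the -D options stay unquoted, matching the command
        // that was shown to work above.
        System.out.println(escape("-Dspark.master=yarn-cluster", true));
        System.out.println(escape("-Dspark.master=yarn-cluster", false));
    }
}
```

With a helper like this, the launch-command generator would pick the escaping strategy from the container's target OS rather than the client's, since the client and the NodeManager may run different platforms.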

I checked a little, and there seem to be people who are able to run Spark on 
YARN on Windows, so it might be something else. I didn't find anything related 
in JIRA either.

  was:
I'm trying to run Spark Pi on a YARN cluster running on Windows and the AM 
container fails to start. The problem seems to be in the generation of the YARN 
command which adds single quotes (') surrounding some of the java options. In 
particular the one that is adding those is the escapeForShell function in 
YarnSparkHadoopUtil. Apparently, Windows does not like the quotes for the 
option. Here is an example of the command that the container tries to execute:

@call %JAVA_HOME%/bin/java -server -Xmx512m -Djava.io.tmpdir=%PWD%/tmp 
'-Dspark.yarn.secondary.jars=' 
'-Dspark.app.name=org.apache.spark.examples.SparkPi' 
'-Dspark.master=yarn-cluster' org.apache.spark.deploy.yarn.ApplicationMaster 
--class 'org.apache.spark.examples.SparkPi' --jar  
'file:/D:/data/spark-1.1.1-bin-hadoop2.4/bin/../lib/spark-examples-1.1.1-hadoop2.4.0.jar'
  --executor-memory 1024 --executor-cores 1 --num-executors 2

Once I transform it into:

@call %JAVA_HOME%/bin/java -server -Xmx512m -Djava.io.tmpdir=%PWD%/tmp 
-Dspark.yarn.secondary.jars= -Dspark.app.name=org.apache.spark.examples.SparkPi 
-Dspark.master=yarn-cluster org.apache.spark.deploy.yarn.ApplicationMaster 
--class 'org.apache.spark.examples.SparkPi' --jar  
'file:/D:/data/spark-1.1.1-bin-hadoop2.4/bin/../lib/spark-examples-1.1.1-hadoop2.4.0.jar'
  --executor-memory 1024 --executor-cores 1 --num-executors 2

Everything seems to start.

How should I deal with this? Should I create a separate Windows counterpart of 
escapeForShell and call it whenever the target platform is Windows, or should I 
add a sanity check on the YARN side?

I checked a little, and there seem to be people who are able to run Spark on 
YARN on Windows, so it might be something else. I didn't find anything related 
in JIRA either.


> Spark AM not launching on Windows
> ---------------------------------
>
>                 Key: SPARK-5754
>                 URL: https://issues.apache.org/jira/browse/SPARK-5754
>             Project: Spark
>          Issue Type: Bug
>          Components: Windows, YARN
>    Affects Versions: 1.1.1, 1.2.0
>         Environment: Windows Server 2012, Hadoop 2.4.1.
>            Reporter: Inigo
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
