[ https://issues.apache.org/jira/browse/SPARK-12607?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15080499#comment-15080499 ]

SM Wang commented on SPARK-12607:
---------------------------------

Sure.  I think the problem is either with the while loop's delimiter setting (-d '')
or with the launcher class's behavior in the MSYS64 environment.
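
To illustrate what I think is happening, here is a minimal sketch (not taken from
spark-class itself): read -d '' only returns a token when it finds a NUL byte, so if
the launcher's output contains no NUL delimiters the loop body never runs and the
array stays empty.

CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(printf 'a\0b\0c\0')   # NUL-delimited input
echo "${#CMD[@]}"              # prints 3

CMD=()
while IFS= read -d '' -r ARG; do
  CMD+=("$ARG")
done < <(printf 'a b c\n')     # no NUL bytes, as I suspect happens under MSYS64
echo "${#CMD[@]}"              # prints 0 -- the loop body never executed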

Here is the section of the script in version 1.4.0 where I added some echo commands
(marked with the +++ prefix).

CMD=()
while IFS= read -d '' -r ARG; do
  echo "+++ Parsed Arguments in while loop: $ARG"
  CMD+=("$ARG")
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")
echo "+++ Launcher Command" "$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"

echo "+++ First Element: ${CMD[0]}"
echo "+++ Command Array: ${CMD[@]}"

if [ "${CMD[0]}" = "usage" ]; then
  "${CMD[@]}"
else
  exec "${CMD[@]}"
fi

The output from "run-example SparkPi" is as follows:

+++ Launcher Command /apps/jdk1.7.0_80/bin/java -cp /apps/tmp/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --master local[*] --class org.apache.spark.examples.SparkPi /apps/tmp/spark-1.4.0-bin-hadoop2.4/lib/spark-examples-1.4.0-hadoop2.4.0.jar
+++ First Element:
+++ Command Array:

As you can see, the command array is empty.

However, when I ran the launcher command manually, I got the following:

C:/msys64/apps/jdk1.7.0_80\bin\java -cp "C:/msys64/apps/tmp/spark-1.4.0-bin-hadoop2.4\conf\;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\spark-assembly-1.4.0-hadoop2.4.0.jar;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\datanucleus-api-jdo-3.2.6.jar;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\datanucleus-core-3.2.10.jar;C:\msys64\apps\tmp\spark-1.4.0-bin-hadoop2.4\lib\datanucleus-rdbms-3.2.9.jar" -Xms512m -Xmx512m "-XX:MaxPermSize=128m" org.apache.spark.deploy.SparkSubmit --master local[*] --class org.apache.spark.examples.SparkPi C:/msys64/apps/tmp/spark-1.4.0-bin-hadoop2.4/lib/spark-examples-1.4.0-hadoop2.4.0.jar

When I changed the delimiter to -d ' ', I was able to get a non-empty command
array.
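
For reference, the only change was on the read line, roughly:

while IFS= read -d ' ' -r ARG; do
  CMD+=("$ARG")
done < <("$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@")

(Splitting on spaces is obviously not a real fix, since arguments containing spaces
would be broken up; I only used it to confirm that the loop itself works once it
sees its delimiter.)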

This is why I think the issue is either with the delimiter setting or with the
launcher not producing the command string with the expected delimiter.
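
One way to confirm this would be to dump the launcher's raw output and look for the
NUL delimiters, for example by temporarily adding something like the following to
spark-class (od -c just makes any \0 bytes visible):

"$RUNNER" -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@" | od -c | head

If the arguments are separated by \0, the loop should work; if the output is a single
space-separated line (which is what I suspect happens under MSYS64), read -d '' never
sees its delimiter.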

Hope this helps.

Thank you for looking into this.

> spark-class produced null command strings for "exec"
> ----------------------------------------------------
>
>                 Key: SPARK-12607
>                 URL: https://issues.apache.org/jira/browse/SPARK-12607
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.4.0, 1.4.1, 1.5.2
>         Environment: MSYS64 on Windows 7 64 bit
>            Reporter: SM Wang
>
> When using the run-example script in 1.4.0 to run the SparkPi example, I 
> found that it did not print any text to the terminal (e.g., stdout, stderr). 
> After further investigation I found that the while loop producing the exec 
> command from the launcher class's output yielded a null command array.
> This discrepancy was observed on 1.5.2 and 1.4.1; 1.3.1's behavior seems 
> to be correct.


