Github user minixalpha commented on the issue:

    https://github.com/apache/spark/pull/19090
  
    
    I designed two groups of test cases:
    
     - Test cases for Windows command script options
     - Examples from the Spark documentation
    
    All of these test cases pass.
    
    ## Test cases for Windows command script options
    
    All of these test cases use `bin\spark-shell` as an example, since the other 
commands work similarly. For each test case, I record all of the Java program 
options used when running the classes `org.apache.spark.launcher.Main` and 
`org.apache.spark.deploy.SparkSubmit`, and then check those options.
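    The per-case check can be sketched as a small script. This is a hypothetical helper, not part of the PR; the recorded argument list below is a simplified version of the logs that follow:
    
    ```python
    # Hypothetical check, not part of this PR: assert that every option
    # passed to bin\spark-shell reappears, unmodified, in the recorded
    # SparkSubmit command line.
    
    def options_preserved(recorded_args, expected_options):
        """Return True if every expected option appears in the recorded args."""
        return all(opt in recorded_args for opt in expected_options)
    
    # Simplified argument list recorded for `bin\spark-shell --verbose`
    recorded = [
        "org.apache.spark.deploy.SparkSubmit",
        "--class", "org.apache.spark.repl.Main",
        "--name", "Spark shell",
        "--verbose",
        "spark-shell",
    ]
    
    assert options_preserved(recorded, ["--verbose"])
    ```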
    
    ### No option
    
    ```
    bin\spark-shell
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""C:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell"
    
    C:\jdk1.8.0_65\bin\java -cp 
"C:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;C:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" spark-shell
    ```
    
    ### With options
    
    #### One option
    
    ##### Option without a parameter
    
    ```
    bin\spark-shell --verbose
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --verbose
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --verbose spark-shell
    ```
    
    ##### Option with a parameter
    
    ###### Option parameter without quotes
    
    ```
    bin\spark-shell --driver-java-options -Dfile.encoding=utf-8
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --driver-java-options 
-Dfile.encoding=utf-8
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g "-Dfile.encoding=utf-8" 
org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.extraJavaOptions=-Dfile.encoding=utf-8" --class 
org.apache.spark.repl.Main --name "Spark shell" spark-shell
    ```
    
    ###### Option parameter with quotes
    - quotes around one parameter
    
    ```
    bin\spark-shell --driver-java-options "-Dfile.encoding=utf-8"
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --driver-java-options 
"-Dfile.encoding=utf-8"
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g "-Dfile.encoding=utf-8" 
org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.extraJavaOptions=-Dfile.encoding=utf-8" --class 
org.apache.spark.repl.Main --name "Spark shell" spark-shell
    ```
    
    - quotes around multiple parameters
    
    ```
    bin\spark-shell --driver-java-options "-Dfile.encoding=utf-8 
-Dsun.jnu.encoding=utf-8"
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --driver-java-options 
"-Dfile.encoding=utf-8 -Dsun.jnu.encoding=utf-8"
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g "-Dfile.encoding=utf-8" 
"-Dsun.jnu.encoding=utf-8" org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.extraJavaOptions=-Dfile.encoding=utf-8 -Dsun.jnu.encoding=utf-8" 
--class org.apache.spark.repl.Main --name "Spark shell" spark-shell
    ```
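    The quoting behavior above follows the standard Windows argument-quoting rules. As a quick illustration (using Python's `subprocess.list2cmdline`, which implements the same rules; this is not code from the PR):
    
    ```python
    # Illustration only: subprocess.list2cmdline implements the standard
    # MS C runtime quoting rules, showing why a value containing spaces
    # must be wrapped in quotes to survive as a single argument.
    from subprocess import list2cmdline
    
    single = list2cmdline(["--driver-java-options", "-Dfile.encoding=utf-8"])
    multi = list2cmdline(
        ["--driver-java-options",
         "-Dfile.encoding=utf-8 -Dsun.jnu.encoding=utf-8"]
    )
    
    # A value with no spaces needs no quotes...
    print(single)  # --driver-java-options -Dfile.encoding=utf-8
    # ...but a multi-token value is quoted so it stays one argument.
    print(multi)   # --driver-java-options "-Dfile.encoding=utf-8 -Dsun.jnu.encoding=utf-8"
    ```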
    
    #### Multiple options
    
    ##### All options without quotes
    
    ```
    bin\spark-shell --name spark-shell-fix --driver-memory 2g
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --name spark-shell-fix 
--driver-memory 2g
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx2g org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.memory=2g" --class org.apache.spark.repl.Main --name "Spark 
shell" --name spark-shell-fix spark-shell
    ```
    
    ##### Some options without quotes
    
    ```
    bin\spark-shell --name "spark shell fix" --driver-memory 2g
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --name "spark shell fix" 
--driver-memory 2g
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx2g org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.memory=2g" --class org.apache.spark.repl.Main --name "Spark 
shell" --name "spark shell fix" spark-shell
    ```
    
    ##### All options with quotes
    - each option quotes one parameter
    ```
    bin\spark-shell --name "spark shell fix" --driver-java-options 
"-Dfile.encoding=utf-8"
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --name "spark shell fix" 
--driver-java-options "-Dfile.encoding=utf-8"
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g "-Dfile.encoding=utf-8" 
org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.extraJavaOptions=-Dfile.encoding=utf-8" --class 
org.apache.spark.repl.Main --name "Spark shell" --name "spark shell fix" 
spark-shell
    ```
    
    - some options quote multiple parameters
    
    ```
    bin\spark-shell --driver-java-options "-Dfile.encoding=utf-8 
-Dsun.jnu.encoding=utf-8" --name "spark shell fix"
    
    "C:\jdk1.8.0_65\bin\java" -Xmx128m -cp 
""c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars"\*" 
org.apache.spark.launcher.Main org.apache.spark.deploy.SparkSubmit --class 
org.apache.spark.repl.Main --name "Spark shell" --driver-java-options 
"-Dfile.encoding=utf-8 -Dsun.jnu.encoding=utf-8" --name "spark shell fix"
    
    C:\jdk1.8.0_65\bin\java -cp 
"c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\conf\;c:\spark-pr-19090\spark-2.2.0-bin-hadoop2.7-fix\bin\..\jars\*"
 "-Dscala.usejavacp=true" -Xmx1g "-Dfile.encoding=utf-8" 
"-Dsun.jnu.encoding=utf-8" org.apache.spark.deploy.SparkSubmit --conf 
"spark.driver.extraJavaOptions=-Dfile.encoding=utf-8 -Dsun.jnu.encoding=utf-8" 
--class org.apache.spark.repl.Main --name "Spark shell" --name "spark shell 
fix" spark-shell
    ```
    
    ## Examples from the Spark documentation
    
    
    ```
    bin\run-example.cmd SparkPi 10
    
    bin\spark-shell --master local[2]
    
    bin\pyspark --master local[2]
    
    bin\spark-submit examples\src\main\python\pi.py 10
    
    bin\sparkR --master local[2]
    
    bin\spark-submit examples\src\main\r\dataframe.R
    
    bin\spark-shell
       val textFile = spark.read.textFile("README.md")
       textFile.count()
    
    bin\spark-shell --master local[4] --jars 
C:\Users\meng\.ivy2\jars\com.databricks_spark-avro_2.11-3.2.0.jar
    
    bin\spark-submit --class org.apache.spark.examples.SparkPi --master 
local[8] 
C:\Users\meng\IdeaProjects\spark\examples\target\original-spark-examples_2.11-2.2.0.jar
 100
    ```
     
    @HyukjinKwon According to the results of these test cases, I think this PR 
works well in these different situations. Is there anything else that should be 
tested?

