GitHub user HyukjinKwon commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19370#discussion_r141503492
  
    --- Diff: bin/pyspark2.cmd ---
    @@ -18,7 +18,12 @@ rem limitations under the License.
     rem
     
     rem Figure out where the Spark framework is installed
    -set SPARK_HOME=%~dp0..
    +set FIND_SPARK_HOME_SCRIPT=%~dp0find_spark_home.py
    +if exist "%FIND_SPARK_HOME_SCRIPT%" (
    +  for /f %%i in ('python %FIND_SPARK_HOME_SCRIPT%') do set SPARK_HOME=%%i
    --- End diff --
    
    Mind adding some comments? I believe this resembles the logic here:
    
    
    https://github.com/apache/spark/blob/9244957b500cb2b458c32db2c63293a1444690d7/bin/find-spark-home#L28-L40
    
    which detects `find_spark_home.py`, which should be included in the pip installation:
    
    
    https://github.com/apache/spark/blob/aad2125475dcdeb4a0410392b6706511db17bac4/python/setup.py#L143-L145
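    
    For illustration, a commented version of the snippet might look like the sketch below; the `else` fallback is assumed from the removed `set SPARK_HOME=%~dp0..` line, and the comments mirror the ones in `bin/find-spark-home`:
    
        rem Attempt to resolve SPARK_HOME via find_spark_home.py, which pip
        rem installs next to this script (see python/setup.py). Otherwise fall
        rem back to the layout of a regular Spark distribution, where this
        rem script lives in SPARK_HOME\bin.
        set FIND_SPARK_HOME_SCRIPT=%~dp0find_spark_home.py
        if exist "%FIND_SPARK_HOME_SCRIPT%" (
          rem Pip installation: ask find_spark_home.py where Spark lives.
          for /f %%i in ('python %FIND_SPARK_HOME_SCRIPT%') do set SPARK_HOME=%%i
        ) else (
          rem Not a pip installation: assume the standard directory layout.
          set SPARK_HOME=%~dp0..
        )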
    
    It'd be nicer if the PR description explained this.

