Github user jsnowacki commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19370#discussion_r141600476
  
    --- Diff: bin/run-example.cmd ---
    @@ -17,6 +17,13 @@ rem See the License for the specific language governing permissions and
     rem limitations under the License.
     rem
     rem
     
    -set SPARK_HOME=%~dp0..
    +rem Figure out where the Spark framework is installed
    +set FIND_SPARK_HOME_SCRIPT=%~dp0find_spark_home.py
    +if exist "%FIND_SPARK_HOME_SCRIPT%" (
    +  for /f %%i in ('python %FIND_SPARK_HOME_SCRIPT%') do set SPARK_HOME=%%i
    --- End diff ---
    
    The assumption was that if `find_spark_home.py` can be found in the local folder, this will be a good setup. While I think `find_spark_home.py` works with any Python version (well, at least modern 2 and 3), this indeed does not take into account the case where Python is just not there at all.
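    A minimal sketch of how the snippet could additionally guard against that case, falling back to the old relative-path behaviour when no `python` is on `PATH` (this is only a hypothetical illustration, not part of the PR; `where /q` is a standard cmd built-in that sets `ERRORLEVEL` to non-zero when the command is not found):
    
    ```batch
    rem Figure out where the Spark framework is installed
    set FIND_SPARK_HOME_SCRIPT=%~dp0find_spark_home.py
    rem Check whether a python executable is on PATH at all
    where /q python
    if %ERRORLEVEL% neq 0 (
      rem No python available: keep the previous behaviour
      set SPARK_HOME=%~dp0..
    ) else if exist "%FIND_SPARK_HOME_SCRIPT%" (
      for /f %%i in ('python %FIND_SPARK_HOME_SCRIPT%') do set SPARK_HOME=%%i
    ) else (
      set SPARK_HOME=%~dp0..
    )
    ```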


---
