[ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15420995#comment-15420995 ]

Michael Berman commented on SPARK-16781:
----------------------------------------

In 0.10.3, py4j introduced an option to use the java from JAVA_HOME instead of 
just launching a bare {{java}} command. One thing PySpark could do to help 
here is upgrade to that py4j version and then pass {{java_path=None}} when 
launching the gateway.
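
For illustration, a rough sketch of what that could look like (the exact 
{{launch_gateway}} call shape here is an assumption based on py4j 0.10.3's 
API, not actual PySpark launcher code):

{code:python}
from py4j.java_gateway import JavaGateway, GatewayParameters, launch_gateway

# Assumes py4j >= 0.10.3: with java_path=None, py4j is expected to resolve
# java from JAVA_HOME rather than invoking a bare "java" from PATH.
gateway_port = launch_gateway(java_path=None)

# Connect to the launched JVM on the returned port, as usual.
gateway = JavaGateway(gateway_parameters=GatewayParameters(port=gateway_port))
{code}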

> java launched by PySpark as gateway may not be the same java used in the 
> spark environment
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16781
>                 URL: https://issues.apache.org/jira/browse/SPARK-16781
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.2
>            Reporter: Michael Berman
>
> When launching spark on a system with multiple javas installed, there are a 
> few options for choosing which JRE to use, setting `JAVA_HOME` being the most 
> straightforward.
> However, when pyspark's internal py4j launches its JavaGateway, it always 
> invokes `java` directly, without qualification. This means you get whatever 
> java is first on your PATH, which is not necessarily the one pointed to by 
> spark's JAVA_HOME.
> This could be seen as a py4j issue, but from their point of view, the fix is 
> easy: make sure the java you want is first on your path. I can't figure out a 
> way to make that reliably happen through the pyspark executor launch path, 
> and it seems like something that would ideally happen automatically. If I set 
> JAVA_HOME when launching spark, I would expect that to be the only java used 
> throughout the stack.
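
For anyone who wants to check whether a particular machine hits the mismatch 
described above, a purely illustrative snippet (Python 3; not part of Spark 
or py4j):

{code:python}
import os
import shutil

# The java py4j currently ends up with: whatever "java" resolves to on PATH.
path_java = shutil.which("java")

# The java Spark's environment points to via JAVA_HOME, if it is set.
java_home = os.environ.get("JAVA_HOME")
home_java = os.path.join(java_home, "bin", "java") if java_home else None

if path_java and home_java and \
        os.path.realpath(path_java) != os.path.realpath(home_java):
    print("PATH java (%s) differs from JAVA_HOME java (%s)"
          % (path_java, home_java))
{code}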


