[ 
https://issues.apache.org/jira/browse/SPARK-13539?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hossein Vatani updated SPARK-13539:
-----------------------------------
    Description: 
Hi,
I hit the exception below when I tried to run the samples from
http://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=filter#pyspark.sql.SQLContext.jsonRDD
:
 Exception: Python in worker has different version 2.7 than that in driver 3.5, 
PySpark cannot run with different minor versions
My OS is CentOS 7 and I installed Anaconda3; I also have to keep Python 2.7 for 
another application. I run Spark with:
PYSPARK_DRIVER_PYTHON=ipython3 pyspark
I have no configuration mentioning "python" or "ipython" in my profile or in 
spark-defaults.conf.
Could you please assist me?
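The usual fix is to point the workers and the driver at the same interpreter, since PySpark workers fall back to the system `python` (here, 2.7) unless told otherwise. A minimal sketch, assuming Anaconda3 is installed at `/opt/anaconda3` (adjust the path to your install), added to `conf/spark-env.sh` or the shell profile on every node:

```shell
# Make the workers use the same Python 3 interpreter as the driver.
# /opt/anaconda3/bin/... is an assumed install path; adjust to yours.
export PYSPARK_PYTHON=/opt/anaconda3/bin/python3
export PYSPARK_DRIVER_PYTHON=/opt/anaconda3/bin/ipython3
```

Setting only PYSPARK_DRIVER_PYTHON (as in the report) changes the driver but leaves the workers on the default `python`, which is exactly what produces the 2.7-vs-3.5 mismatch. The system Python 2.7 stays untouched for the other application.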

  was:
Hi,
I hit the exception below when I tried to run the samples from
http://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=filter#pyspark.sql.SQLContext.jsonRDD
:
 Exception: Python in worker has different version 2.7 than that in driver 3.5, 
PySpark cannot run with different minor versions
My OS is CentOS 7 and I installed Anaconda3. I run Spark with:
PYSPARK_DRIVER_PYTHON=ipython3 pyspark
I have no configuration mentioning "python" or "ipython" in my profile or in 
spark-defaults.conf.
Could you please assist me?


> Python in worker has different version 2.7 than that in driver 3.5, PySpark 
> cannot run with different minor versions
> --------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-13539
>                 URL: https://issues.apache.org/jira/browse/SPARK-13539
>             Project: Spark
>          Issue Type: Question
>          Components: PySpark
>    Affects Versions: 1.6.0
>            Reporter: Hossein Vatani
>            Priority: Minor
>              Labels: features
>             Fix For: 1.6.0
>
>
> Hi,
> I hit the exception below when I tried to run the samples from
> http://spark.apache.org/docs/latest/api/python/pyspark.sql.html?highlight=filter#pyspark.sql.SQLContext.jsonRDD
> :
>  Exception: Python in worker has different version 2.7 than that in driver 
> 3.5, PySpark cannot run with different minor versions
> My OS is CentOS 7 and I installed Anaconda3; I also have to keep Python 2.7 
> for another application. I run Spark with:
> PYSPARK_DRIVER_PYTHON=ipython3 pyspark
> I have no configuration mentioning "python" or "ipython" in my profile or in 
> spark-defaults.conf.
> Could you please assist me?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
