Thanks Prabhu,
I tried starting in local mode, but it still picks up Python 2.6 only. I have 
exported “DEFAULT_PYTHON” as a session variable and also added it to PATH.

Export:
export DEFAULT_PYTHON="/home/stuti/Python/bin/python2.7"
export PATH="/home/stuti/Python/bin/python2.7:$PATH"

$ pyspark --master local
Python 2.6.6 (r266:84292, Jul 23 2015, 15:22:56)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-11)] on linux2
….
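
(For reference, a quick way to confirm what the shell actually resolves and 
whether the export is visible to child processes; a minimal sketch, assuming 
bash:)

# which python2.7 is found first on PATH, if any
which python2.7
# is DEFAULT_PYTHON actually exported?
env | grep PYTHON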

Thanks & Regards
Stuti Awasthi

From: Prabhu Joseph [mailto:prabhujose.ga...@gmail.com]
Sent: Tuesday, March 15, 2016 2:22 PM
To: Stuti Awasthi
Cc: user@spark.apache.org
Subject: Re: Launch Spark shell using different python version

Hi Stuti,
     You can use local mode, but not Spark master or YARN mode, if python2.7 
is not installed on all Spark Worker / NodeManager machines. To run in master 
mode:

   1. Check whether the user is able to access python2.7
   2. Check that python2.7 is installed on all NodeManager / Spark Worker 
machines and that the services have been restarted (see the sketch after this 
list)
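
(A sketch of that check across hosts; the hostnames below are placeholders 
for your own worker list:)

   # run the same check on every NodeManager / Spark Worker host
   for host in worker1 worker2 worker3; do
       ssh "$host" 'command -v python2.7 && python2.7 --version'
   done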

  An executor running inside a Spark Worker is able to resolve the full path 
of python2.7, but inside a NodeManager the executor does not find python2.7 
even though it is on the PATH. To make the NodeManager find it, set the full 
path of python2.7 in the pyspark script, like below:

   DEFAULT_PYTHON="/ANACONDA/anaconda2/bin/python2.7"
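
(For orientation, that line replaces the interpreter auto-detection in the 
Spark 1.x bin/pyspark launcher, which looks roughly like this; exact lines 
vary by Spark version:)

   # bin/pyspark picks the interpreter by probing PATH; hardcoding
   # DEFAULT_PYTHON as above bypasses this probe
   if hash python2.7 2>/dev/null; then
     DEFAULT_PYTHON="python2.7"
   else
     DEFAULT_PYTHON="python"
   fi
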
Thanks,
Prabhu Joseph


On Tue, Mar 15, 2016 at 11:52 AM, Stuti Awasthi 
<stutiawas...@hcl.com> wrote:
Hi All,

I have a CentOS cluster (without any sudo permissions) which has Python 2.6 
by default. I have installed Python 2.7 for my user account and made changes 
in bashrc so that Python 2.7 is picked up by default. Then I set the 
following properties in bashrc in order to launch the Spark shell using 
Python 2.7, but it's not working.

Bashrc details:
alias python='/home/stuti/Python/bin/python2.7'
alias python2='/home/stuti/Python/bin/python2.7'
export PYSPARK_PYTHON=/home/stuti/Python/bin/python2.7
export LD_LIBRARY_PATH=/home/stuti/Python/lib:$LD_LIBRARY_PATH
export PATH=$HOME/bin:$PATH
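
(For reference: aliases apply only to interactive shells, so a child process 
such as the pyspark launcher never sees them. A minimal sketch, assuming 
bash, of what a non-interactive child process resolves:)

# check what a non-interactive child shell resolves via PATH and env
bash -c 'command -v python; python --version; echo "$PYSPARK_PYTHON"'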

Also, note that the Spark cluster is configured under a different user 
account, and I have not installed Python 2.7 on all the nodes in the cluster, 
as I don't have the required permissions.
So is there any way that I can launch my Spark shell using Python 2.7?
Please suggest.

Thanks & Regards
Stuti Awasthi



