Hello everyone,

Another newbie question.
PYSPARK_DRIVER_PYTHON=ipython ./bin/pyspark runs fine (in $SPARK_HOME):

    Python 2.7.10 (default, Jul  3 2015, 01:26:20)
    Type "copyright", "credits" or "license" for more information.

    IPython 3.2.1 -- An enhanced Interactive Python.
    ?         -> Introduction and overview of IPython's features.
    %quickref -> Quick reference.
    help      -> Python's own help system.
    object?   -> Details about 'object', use 'object??' for extra details.
    15/07/27 17:16:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Welcome to
          ____              __
         / __/__  ___ _____/ /__
        _\ \/ _ \/ _ `/ __/  '_/
       /__ / .__/\_,_/_/ /_/\_\   version 1.4.1
          /_/

    Using Python version 2.7.10 (default, Jul 3 2015 01:26:20)
    SparkContext available as sc, HiveContext available as sqlContext.

But PYSPARK_DRIVER_PYTHON=ipython ./spark-1.4.1-bin-hadoop2.6/bin/pyspark throws errors:

    Traceback (most recent call last):
      File "/usr/local/bin/ipython", line 7, in <module>
        from IPython import start_ipython
      File "/usr/local/lib/python2.7/site-packages/IPython/__init__.py", line 45, in <module>
        from .config.loader import Config
      File "/usr/local/lib/python2.7/site-packages/IPython/config/__init__.py", line 6, in <module>
        from .application import *
      File "/usr/local/lib/python2.7/site-packages/IPython/config/application.py", line 19, in <module>
        from IPython.config.configurable import SingletonConfigurable
      File "/usr/local/lib/python2.7/site-packages/IPython/config/configurable.py", line 12, in <module>
        from .loader import Config, LazyConfigValue
      File "/usr/local/lib/python2.7/site-packages/IPython/config/loader.py", line 14, in <module>
        from ast import literal_eval
    ImportError: cannot import name literal_eval

Note: running `from ast import literal_eval` within ipython itself succeeds. The only difference is whether I run the command from the SPARK_HOME directory or not. What causes the problem? Or is something wrong with my compiled Python and IPython?

Thank you very much.
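P.S. One thing I tried to narrow this down (just a diagnostic sketch, assuming the failure is a stray `ast.py` on sys.path or in the working directory shadowing the standard-library module): print where the interpreter actually resolves `ast` from, and whether that module has `literal_eval`.

```python
# Diagnostic: check which "ast" module this Python resolves.
# If __file__ points outside the standard library (e.g. into the
# current working directory), a local file is shadowing stdlib ast,
# which would explain "ImportError: cannot import name literal_eval".
import ast

print(ast.__file__)                    # expect a path inside the stdlib
print(hasattr(ast, "literal_eval"))    # expect True for the real module
```

Running this once from inside $SPARK_HOME and once from the other directory should show whether the two invocations resolve different `ast` modules.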