Hello everyone, I have a newbie question.
$SPARK_HOME/bin/pyspark creates a SparkContext automatically. The shell prints the Spark welcome banner (version 1.4.1) followed by:

    Using Python version 2.7.3 (default, Jun 22 2015 19:33:41)
    SparkContext available as sc, HiveContext available as sqlContext.

But when I use IPython as the driver,

    PYSPARK_DRIVER_PYTHON="ipython" spark/bin/pyspark

the SparkContext is not created automatically; I have to run

    execfile('spark_home/python/pyspark/shell.py')

myself. Is this by design?

I read the bin/pyspark bash script and noticed this line:

    export PYTHONSTARTUP="$SPARK_HOME/python/pyspark/shell.py"

but when I searched the whole Spark source code, PYTHONSTARTUP is never referenced anywhere else, so I cannot work out when (or by what) PYTHONSTARTUP is actually executed.

Thank you.
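P.S. For now my workaround is to run shell.py by hand with execfile() after starting IPython. As far as I can tell, that is roughly equivalent to creating the objects manually, something like the sketch below (the app name here is only an example, not necessarily what shell.py itself uses):

    # rough manual equivalent of what shell.py sets up for the interactive shell
    from pyspark import SparkConf, SparkContext
    from pyspark.sql import HiveContext

    conf = SparkConf().setAppName("manual-pyspark-shell")  # example app name
    sc = SparkContext(conf=conf)       # the "sc" mentioned in the banner
    sqlContext = HiveContext(sc)       # the "sqlContext" mentioned in the banner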