Re: pyspark configuration with Jupyter

2017-11-04 Thread Marco Mistroni
Hi, probably not what you are looking for, but if you get stuck with conda, Jupyter, and Spark: if you get an account at community.cloudera you will enjoy Jupyter and Spark out of the box. Good luck, and hope this helps. Kind regards. On Nov 4, 2017 4:59 PM, "makoto" wrote: > I set up environment variables in

Re: pyspark configuration with Jupyter

2017-11-04 Thread makoto
I set up environment variables in my ~/.bashrc as follows:
export PYSPARK_PYTHON=/usr/local/oss/anaconda3/bin/python3.6
export PYTHONPATH=$(ls -a ${SPARK_HOME}/python/lib/py4j-*-src.zip):${SPARK_HOME}/python:$PYTHONPATH
export PYSPARK_DRIVER_PYTHON=jupyter
export
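The snippet above is cut off after the last `export`. A minimal sketch of what the complete ~/.bashrc block might look like, assuming the Anaconda path from the message and a conventional `PYSPARK_DRIVER_PYTHON_OPTS=notebook` setting (the SPARK_HOME value is an illustrative assumption):

```shell
# Sketch of a full ~/.bashrc fragment for PySpark + Jupyter.
# SPARK_HOME and the Anaconda path are assumptions; adjust to your install.
export SPARK_HOME=/usr/local/oss/spark-2.1.0
export PYSPARK_PYTHON=/usr/local/oss/anaconda3/bin/python3.6
# Put the py4j zip and the pyspark package on the Python path.
export PYTHONPATH=$(ls ${SPARK_HOME}/python/lib/py4j-*-src.zip):${SPARK_HOME}/python:$PYTHONPATH
# Launch the driver through Jupyter Notebook instead of the plain REPL.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS=notebook
```

With these set, running `${SPARK_HOME}/bin/pyspark` starts a notebook server whose kernels have a `SparkContext` available.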

Re: pyspark configuration with Jupyter

2017-11-03 Thread Jeff Zhang
You are setting PYSPARK_DRIVER to jupyter; please set it to the python executable file. anudeep wrote on Friday, November 3, 2017 at 7:31 PM: > Hello experts, > > I installed jupyter notebook through anaconda and set the pyspark driver to use > jupyter notebook. > > I see the below issue when I try to open
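The fix suggested above can be sketched as follows: point the driver at a Python executable rather than at `jupyter`, so that `./bin/pyspark` starts a plain PySpark shell instead of trying to launch a notebook server. The Anaconda path is an assumption carried over from the earlier message:

```shell
# Point the driver at the Python interpreter itself (path is an assumption).
export PYSPARK_DRIVER_PYTHON=/usr/local/oss/anaconda3/bin/python3.6
# Drop any notebook-specific options that only make sense for jupyter.
unset PYSPARK_DRIVER_PYTHON_OPTS
```

After this change, `./bin/pyspark` should drop into the normal interactive PySpark prompt.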

pyspark configuration with Jupyter

2017-11-03 Thread anudeep
Hello experts, I installed jupyter notebook through anaconda and set the pyspark driver to use jupyter notebook. I see the below issue when I try to open pyspark.
[anudeepg@datanode2 spark-2.1.0]$ ./bin/pyspark
[I 07:29:53.184 NotebookApp] The port is already in use, trying another port.
[I
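The "port is already in use" messages come from Jupyter Notebook itself, since pyspark is launching it as the driver. A hedged workaround, assuming another notebook instance already occupies Jupyter's default port 8888, is to pin a known free port explicitly (`--port` and `--no-browser` are standard jupyter-notebook flags; 8890 is an arbitrary choice):

```shell
# Assumption: another Jupyter instance holds the default port 8888,
# so we bind the pyspark-driven notebook to a different port.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --port=8890 --no-browser"
./bin/pyspark
```

Alternatively, following Jeff Zhang's reply in this thread, setting `PYSPARK_DRIVER_PYTHON` to a plain Python executable avoids starting a notebook server at all.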