I have been searching on Stack Overflow and elsewhere for the error I am
seeing and have tried a few answers, but none of them works here (I will
keep searching and will update this post):

I have a fresh Ubuntu installation with Anaconda3 and Spark 2 installed:

Anaconda3: /home/rxie/anaconda
Spark 2: /home/rxie/Downloads/spark

I am able to start Jupyter Notebook; however, I am not able to create a
SparkSession:

from pyspark.conf import SparkConf

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 from pyspark.conf import SparkConf

ModuleNotFoundError: No module named 'pyspark'
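For reference, here is a quick diagnostic I can run in a notebook cell to
see which Python the kernel is using and where it looks for modules (plain
introspection, nothing Spark-specific):

import sys
print(sys.executable)  # the Python binary this kernel runs
print(sys.path)        # directories searched on import

My understanding is that pyspark can only import if one of these sys.path
entries actually contains it.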

Here are my environment variables in .bashrc:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export SPARK_HOME=/home/rxie/spark/
export SBT_HOME=/usr/share/sbt/bin/sbt-launch.jar
export SCALA_HOME=/usr/local/src/scala/scala-2.10.4
export PATH=$SCALA_HOME/bin:$PATH
export PATH=$SPARK_HOME/bin:$PATH
export PATH=$PATH:$SBT_HOME/bin:$SPARK_HOME/bin

# added by Anaconda3 installer
export PATH="/home/rxie/anaconda3/bin:$PATH"
export PATH=$SPARK_HOME/bin:$PATH
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
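Note that nothing above sets PYTHONPATH, so the Spark Python libraries
under $SPARK_HOME/python are presumably invisible to the notebook kernel.
From what I have read, some setups work around this from inside Python with
the findspark package (a sketch of what I understand, assuming findspark
has been installed with pip install findspark; the path is the one from my
install above, and I have not confirmed this fixes my case):

import findspark
findspark.init('/home/rxie/Downloads/spark')  # locate Spark, add its Python libs to sys.path

from pyspark.sql import SparkSession  # should now resolve
spark = SparkSession.builder.appName('test').getOrCreate()

Is something like this required, or should the exports above be enough on
their own?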

What is wrong with importing SparkConf in the Jupyter notebook?

It would be greatly appreciated if anyone could shed some light on this.
Thank you very much.


Sincerely yours,

Raymond
