Please see
http://stackoverflow.com/questions/36397136/importing-pyspark-packages
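The thread above points at launching PySpark with the Spark packages mechanism instead of `spark.jars`. A minimal sketch of that launch command, with the package coordinates inferred from the jar name in your mail (verify against spark-packages.org):

```shell
# --packages resolves the graphframes artifact and, unlike spark.jars,
# also makes its bundled Python package visible to PySpark.
pyspark --packages graphframes:graphframes:0.1.0-spark1.6
```

The same `--packages` flag can be passed through `PYSPARK_SUBMIT_ARGS` when starting a notebook server instead of the plain shell.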
On Mon, Apr 25, 2016 at 2:39 AM -0700, "Camelia Elena Ciolac"
wrote:
Hello,
I work locally on my laptop, not using DataBricks Community edition.
I downloaded graphframes-0.1.0-spark1.6.jar from
http://spark-packages.org/package/graphframes/graphframes
and placed it in a folder named spark_extra_jars where I have other jars too.
After executing in a terminal:
ipython notebook --profile=nbserver
I open http://127.0.0.1:/ in the browser, and in my IPython notebook I have,
among others:
jar_path = '/home/camelia/spark_extra_jars/spark-csv_2.11-1.2.0.jar,/home/camelia/spark_extra_jars/commons-csv-1.2.jar,/home/camelia/spark_extra_jars/graphframes-0.1.0-spark1.6.jar,/home/camelia/spark_extra_jars/spark-mongodb_2.10-0.11.0.jar'
config = SparkConf().setAppName("graph_analytics").setMaster("local[4]").set("spark.jars", jar_path)
I can successfully import the other modules, but when I do
import graphframes
it gives the error:
ImportError                               Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 import graphframes
ImportError: No module named graphframes
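For context: `spark.jars` only places the jar on the JVM classpath, while Python's `import` searches `sys.path`, so the interpreter never sees the module. Because a .jar is a zip archive and Python can import modules straight from zips, one workaround discussed for graphframes is appending the jar's path to `sys.path`. A self-contained sketch of that mechanism, using a made-up stand-in archive and module name rather than the real jar:

```python
import os
import sys
import tempfile
import zipfile

# Build a tiny stand-in archive with one module inside, mimicking a jar
# that bundles Python sources (module name "demomod" is invented here).
tmp = tempfile.mkdtemp()
archive = os.path.join(tmp, "demo.jar")
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("demomod.py", "answer = 42\n")

# Same idea as sys.path.append('/home/camelia/spark_extra_jars/graphframes-0.1.0-spark1.6.jar'):
# once the archive is on sys.path, Python's zip importer can load from it.
sys.path.append(archive)
import demomod

print(demomod.answer)  # -> 42
```

Whether this works for graphframes depends on the jar actually shipping its Python package at the archive root; the surer route is the `--packages` launch flag.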
Thank you in advance for any hint on how to import graphframes successfully.
Best regards,
Camelia