Hi - this seems to be an issue with the way the Python code is imported from a
jar or from a Spark package. I ran into the same problem. I tried but couldn't
find any guideline on how a Spark package should make its Python bindings
available. If you open an issue at graphframes, I can chime in there.
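In the meantime, here is a rough workaround sketch for the %pyspark paragraph.
It is only an assumption-laden sketch, not an official approach: it assumes the
graphframes Python package sits at the root of the jar (Python can import from
a jar/zip on sys.path) and that the jar has already been downloaded locally;
the path below is a placeholder you would need to adjust:

    %pyspark
    import sys

    # Placeholder path to the locally downloaded graphframes jar - adjust to your setup.
    jar = "/path/to/graphframes-0.1.0-spark1.6.jar"

    # Make the Python package bundled inside the jar importable on the driver.
    sys.path.insert(0, jar)

    # Optionally also ship it to the executors (a jar is a zip archive, so
    # addPyFile can usually distribute it as well).
    sc.addPyFile(jar)

    from graphframes import *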



    _____________________________
From: enzo <e...@smartinsightsfromdata.com>
Sent: Friday, March 4, 2016 4:31 PM
Subject: graphframes: errors adding dependencies with pyspark
To:  <users@zeppelin.incubator.apache.org>


Experimenting with the new graphframes package for Spark. Kindly tell me what
I am doing wrong.

Here is my code:
    %dep
    // sort dependencies
    z.load("graphframes:graphframes:0.1.0-spark1.6")
This gives (so annoying that %dep has been deprecated!):

    DepInterpreter(%dep) deprecated. Load dependency through GUI interpreter menu instead.
    res0: org.apache.zeppelin.dep.Dependency = org.apache.zeppelin.dep.Dependency@7bf2fe37
    %pyspark
    from graphframes import *

I get:
    Traceback (most recent call last):
      File "/var/folders/m4/yfyjwcpj6rv5nq730bl9vr1h0000gp/T//zeppelin_pyspark.py", line 232, in <module>
        eval(compiledCode)
      File "<string>", line 1, in <module>
    ImportError: No module named 'graphframes'
       
What am I doing wrong?
Thanks

Enzo
e...@smartinsightsfromdata.com
      
      
 


  
