It looks like either the extracted Python code is corrupted or there is a Python version mismatch. Are you using Python 3?

stackoverflow.com/questions/514371/whats-the-bad-magic-number-error
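A quick way to confirm the mismatch is to compare the magic number stored in the failing .pyc against the one your interpreter expects. This is only a minimal sketch: imp.get_magic() is a standard-library call on both Python 2 and Python 3 (pre-3.12), and the file path is taken from your traceback:

    # Compare the .pyc header against the running interpreter's magic number.
    # A mismatch is exactly what raises "Bad magic number" on import.
    import imp

    with open('graphframes/examples.pyc', 'rb') as f:
        file_magic = f.read(4)  # first 4 bytes of a .pyc are the magic number

    print('pyc magic number:         %r' % file_magic)
    print('interpreter magic number: %r' % imp.get_magic())
    # If the two values differ, the .pyc was byte-compiled by a different
    # Python version; delete it and let your interpreter recompile the .py.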
On Mon, Jul 4, 2016 at 1:37 AM -0700, "Yanbo Liang" <yblia...@gmail.com> wrote:

Hi Arun,

The command

    bin/pyspark --packages graphframes:graphframes:0.1.0-spark1.6

automatically downloads the required graphframes jar from the Maven repository, so it is not affected by where you placed the jar file. Your example works well on my laptop.

Alternatively, you can try

    bin/pyspark --py-files ***/graphframes.jar --jars ***/graphframes.jar

to launch PySpark with graphframes enabled. Set the "--py-files" and "--jars" options to the path where you saved graphframes.jar.

Thanks
Yanbo

2016-07-03 15:48 GMT-07:00 Arun Patel <arunp.bigd...@gmail.com>:

I started my pyspark shell with this command (I am using Spark 1.6):

    bin/pyspark --packages graphframes:graphframes:0.1.0-spark1.6

I have also copied http://dl.bintray.com/spark-packages/maven/graphframes/graphframes/0.1.0-spark1.6/graphframes-0.1.0-spark1.6.jar to the lib directory of Spark. I was getting the error below:

    >>> from graphframes import *
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    zipimport.ZipImportError: can't find module 'graphframes'

So, following suggestions from similar questions, I extracted the graphframes Python directory and copied it to the local directory where I am running pyspark. The import now succeeds:

    >>> from graphframes import *

But I am not able to create the GraphFrame:

    >>> g = GraphFrame(v, e)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    NameError: name 'GraphFrame' is not defined

I am also getting the error below:

    >>> from graphframes.examples import Graphs
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    ImportError: Bad magic number in graphframes/examples.pyc

Any help will be highly appreciated.

- Arun
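For reference, once the package loads cleanly, a minimal end-to-end check in the Spark 1.6 pyspark shell (where sqlContext is predefined) looks roughly like this. The column names follow the GraphFrames convention ("id" for vertices, "src"/"dst" for edges); the data itself is illustrative:

    from graphframes import GraphFrame

    # The vertex DataFrame must have an "id" column.
    v = sqlContext.createDataFrame(
        [("a", "Alice"), ("b", "Bob")], ["id", "name"])

    # The edge DataFrame must have "src" and "dst" columns.
    e = sqlContext.createDataFrame(
        [("a", "b", "friend")], ["src", "dst", "relationship"])

    g = GraphFrame(v, e)
    print(g.vertices.count(), g.edges.count())

If GraphFrame is still undefined after "from graphframes import *", the import is likely picking up an incomplete or version-mismatched copy of the package directory, which would be consistent with the bad-magic-number error above.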