[ https://issues.apache.org/jira/browse/SPARK-17737?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen updated SPARK-17737:
------------------------------
    Flags:   (was: Important)

I suspect it's a problem with how you've got IPython set up to run PySpark. If pyspark by itself works, then that should confirm it. That wouldn't be a Spark issue per se.

> cannot import name accumulators error
> -------------------------------------
>
>                 Key: SPARK-17737
>                 URL: https://issues.apache.org/jira/browse/SPARK-17737
>             Project: Spark
>          Issue Type: Question
>          Components: PySpark
>         Environment: unix
>                      python 2.7
>            Reporter: Pruthveej Reddy Kasarla
>
> Hi, I am trying to set up my SparkContext using the code below:
>
> import sys
> sys.path.append('/opt/cloudera/parcels/CDH/lib/spark/python/build')
> sys.path.append('/opt/cloudera/parcels/CDH/lib/spark/python')
> from pyspark import SparkConf, SparkContext
> sconf = SparkConf()
> sc = SparkContext(conf=sconf)
> print sc
>
> I got the error below:
>
> ImportError                               Traceback (most recent call last)
> <ipython-input-21-65fcb5ca9b52> in <module>()
>       2 sys.path.append('/opt/cloudera/parcels/CDH/lib/spark/python/build')
>       3 sys.path.append('/opt/cloudera/parcels/CDH/lib/spark/python')
> ----> 4 from pyspark import SparkConf, SparkContext
>       5 sconf = SparkConf()
>       6 sc = SparkContext(conf=sconf)
>
> /opt/cloudera/parcels/CDH/lib/spark/python/pyspark/__init__.py in <module>()
>      39
>      40 from pyspark.conf import SparkConf
> ---> 41 from pyspark.context import SparkContext
>      42 from pyspark.rdd import RDD
>      43 from pyspark.files import SparkFiles
>
> /opt/cloudera/parcels/CDH/lib/spark/python/pyspark/context.py in <module>()
>      26 from tempfile import NamedTemporaryFile
>      27
> ---> 28 from pyspark import accumulators
>      29 from pyspark.accumulators import Accumulator
>      30 from pyspark.broadcast import Broadcast
>
> ImportError: cannot import name accumulators

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
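[Editor's note] Sean Owen's diagnosis points at the environment rather than Spark: when two copies of a package sit on sys.path (here, the appended `.../spark/python/build` and `.../spark/python` directories), Python silently imports whichever appears first, and a stale or incomplete copy produces exactly this kind of "cannot import name" ImportError. The stdlib-only sketch below illustrates the mechanism with a hypothetical package `demo_pkg` (not part of Spark); it is a minimal demonstration, not the confirmed root cause of this ticket.

```python
# Sketch: sys.path ordering decides which copy of a package Python imports.
# A stale copy earlier on the path can shadow the complete one, which is a
# common cause of "ImportError: cannot import name ..." in PySpark setups.
import os
import sys
import tempfile

def make_pkg(root, name, body):
    """Create a minimal package <name> under <root> with the given __init__.py body."""
    pkg_dir = os.path.join(root, name)
    os.makedirs(pkg_dir)
    with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
        f.write(body)
    return pkg_dir

complete_root = tempfile.mkdtemp()   # stands in for a full Spark python/ dir
stale_root = tempfile.mkdtemp()      # stands in for a leftover build/ dir

make_pkg(complete_root, "demo_pkg", "VALUE = 'complete build'\n")
make_pkg(stale_root, "demo_pkg", "")  # stale copy: VALUE is missing here

# The directory inserted last ends up first on sys.path and wins the lookup.
sys.path.insert(0, stale_root)
sys.path.insert(0, complete_root)

import demo_pkg
print(demo_pkg.__file__)  # shows which copy was actually imported
print(demo_pkg.VALUE)     # -> complete build

# Had stale_root been first instead, `from demo_pkg import VALUE` would fail
# with "cannot import name VALUE" -- the same failure shape as this ticket.
```

A quick real-world check along these lines is to print `pyspark.__file__` (or run plain `pyspark` outside IPython, as the comment suggests) to confirm which installation is being picked up.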