[
https://issues.apache.org/jira/browse/SPARK-4974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14259258#comment-14259258
]
Matt Chapman commented on SPARK-4974:
-------------------------------------
First-time contributor. Not sure if this is the right fix, but here's a
pull request with the change that worked for me:
https://github.com/apache/spark/pull/3813
> Circular dependency in pyspark/context.py causes import failure.
> ----------------------------------------------------------------
>
> Key: SPARK-4974
> URL: https://issues.apache.org/jira/browse/SPARK-4974
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Environment: Python 2.7.8
> Ubuntu 14.10
> Reporter: Matt Chapman
>
> Steps to reproduce:
> 1. Run a python cli from the 'python/' directory. (Reproduced with default
> python cli and also ipython.)
> 2. Run this code:
> ```
> >>> import sys
> >>> sys.path.append('build/')
> >>> import pyspark.sql
> Traceback (most recent call last):
> File "<stdin>", line 1, in <module>
> File "pyspark/__init__.py", line 63, in <module>
> from pyspark.context import SparkContext
> File "pyspark/context.py", line 25, in <module>
> from pyspark import accumulators
> ImportError: cannot import name accumulators
> ```
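
The traceback above is the classic symptom of a module-level import cycle: `pyspark/__init__.py` is still only partially initialized when `pyspark/context.py` tries `from pyspark import accumulators`, so the name is not yet bound. A common way to break such a cycle (a sketch only, not necessarily the approach taken in the linked PR; module names `a`/`b` are hypothetical stand-ins for `context.py` and `accumulators.py`) is to defer one side's import to call time:

```python
import os
import sys
import tempfile
import textwrap

# Build two tiny modules on disk to illustrate the pattern.
tmp = tempfile.mkdtemp()

# a.py imports b at module level (like context.py importing accumulators).
with open(os.path.join(tmp, "a.py"), "w") as f:
    f.write(textwrap.dedent("""
        from b import helper  # module-level import: one side of the cycle

        def run():
            return helper()
    """))

# b.py needs a name from a, but imports it *inside* the function instead of
# at module level. By the time helper() is called, a is fully initialized,
# so the lookup succeeds and no ImportError is raised.
with open(os.path.join(tmp, "b.py"), "w") as f:
    f.write(textwrap.dedent("""
        def helper():
            from a import run  # deferred import: resolved at call time
            return "ok"
    """))

sys.path.insert(0, tmp)
import a

print(a.run())  # the deferred import in b.helper avoids the cycle
```

Had `b.py` used `from a import run` at module level instead, importing `a` would fail with the same kind of `ImportError: cannot import name ...` seen in the traceback above, because each module would demand a name the other had not finished defining.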
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)