[ https://issues.apache.org/jira/browse/SPARK-4974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14259734#comment-14259734 ]

Josh Rosen commented on SPARK-4974:
-----------------------------------

I'm having trouble reproducing this issue:

{code}
[joshrosen ~]$ python
Python 2.7.8 |Anaconda 2.0.1 (x86_64)| (default, Aug 21 2014, 15:21:46)
[GCC 4.2.1 (Apple Inc. build 5577)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://binstar.org
>>> import sys
>>> sys.path.append("/Users/joshrosen/Documents/Spark/python")
>>> import pyspark.sql
{code}

Is there a chance that you have a bunch of old {{.pyc}} files lying around that
could be causing problems? Can you provide instructions to reproduce this on a
fresh Spark checkout?

> Circular dependency in pyspark/context.py causes import failure.
> ----------------------------------------------------------------
>
>                 Key: SPARK-4974
>                 URL: https://issues.apache.org/jira/browse/SPARK-4974
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>         Environment: Python 2.7.8
> Ubuntu 14.10
>            Reporter: Matt Chapman
>
> Steps to reproduce:
> 1. Run a Python CLI from the 'python/' directory. (Reproduced with the default
> Python CLI and also IPython.)
> 2. Run this code:
> {code}
> >>> import sys
> >>> sys.path.append('build/')
> >>> import pyspark.sql
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "pyspark/__init__.py", line 63, in <module>
>     from pyspark.context import SparkContext
>   File "pyspark/context.py", line 25, in <module>
>     from pyspark import accumulators
> ImportError: cannot import name accumulators
> {code}
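For reference, the "cannot import name" failure above is the classic signature of a circular import: a module asks a package for a name before the package's {{\_\_init\_\_.py}} has finished binding it. A minimal self-contained sketch of that failure mode (the package name {{pkg}} and its modules are illustrative, not Spark's actual layout):

{code}
import os
import sys
import tempfile

# Build a throwaway package whose modules import names from each other.
tmp = tempfile.mkdtemp()
pkg_dir = os.path.join(tmp, "pkg")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("from pkg.a import A\n")
with open(os.path.join(pkg_dir, "a.py"), "w") as f:
    # a.py imports pkg.b while pkg is still half-initialized
    f.write("from pkg import b\nA = 'A'\n")
with open(os.path.join(pkg_dir, "b.py"), "w") as f:
    # b.py asks the half-initialized pkg for A, which is not bound yet
    f.write("from pkg import A\nB = 'B'\n")

sys.path.insert(0, tmp)
err = None
try:
    import pkg
except ImportError as e:
    err = e
print(err)  # e.g. "cannot import name 'A' ..."
{code}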



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
