PySpark requires Python 2.6 or 2.7. The "with" statement that context.py uses
only became default syntax in Python 2.6, so importing pyspark on 2.4 fails
with exactly the SyntaxError you're seeing.
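A quick way to confirm which interpreter you're actually running (a minimal
sketch; the error message is illustrative):

    import sys

    # PySpark's own modules are parsed by the running interpreter, so this
    # check must pass on the driver and on every worker node.
    if sys.version_info < (2, 6):
        raise RuntimeError("PySpark needs Python 2.6 or 2.7, found %s"
                           % sys.version.split()[0])
    print(sys.version)

If the cluster has several Pythons installed, you can point Spark at a newer
one by setting the PYSPARK_PYTHON environment variable before starting
bin/pyspark or spark-submit.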

On Wed, Nov 5, 2014 at 5:32 PM, Pagliari, Roberto
<rpagli...@appcomsci.com> wrote:
> I'm not on the cluster right now, so I can't check. What is the minimum
> Python version required?
>
> Thanks,
>
> ________________________________________
> From: Davies Liu [dav...@databricks.com]
> Sent: Wednesday, November 05, 2014 7:41 PM
> To: Pagliari, Roberto
> Cc: user@spark.apache.org
> Subject: Re: SparkContext._lock Error
>
> What's the version of Python? 2.4?
>
> Davies
>
> On Wed, Nov 5, 2014 at 4:21 PM, Pagliari, Roberto
> <rpagli...@appcomsci.com> wrote:
>> I’m using Spark 1.1.0 on this system:
>>
>> Hadoop 1.0.4
>> Scala 2.9.3
>> Hive 0.9.0
>>
>> When importing pyspark, I’m getting this error:
>>
>>>>> from pyspark.sql import *
>> Traceback (most recent call last):
>>   File "<stdin>", line 1, in ?
>>   File "/<path>/spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
>>     from pyspark.context import SparkContext
>>   File "/<path>/spark-1.1.0/python/pyspark/context.py", line 209
>>     with SparkContext._lock:
>>                     ^
>> SyntaxError: invalid syntax
>>
>> How do I fix it?
>>
>> Thank you,
