PySpark requires Python 2.6/7.

On Wed, Nov 5, 2014 at 5:32 PM, Pagliari, Roberto <[email protected]> wrote:
> I'm not on the cluster now so I cannot check. What is the minimum requirement
> for Python?
>
> Thanks,
>
> ________________________________________
> From: Davies Liu [[email protected]]
> Sent: Wednesday, November 05, 2014 7:41 PM
> To: Pagliari, Roberto
> Cc: [email protected]
> Subject: Re: SparkContext._lock Error
>
> What's the version of Python? 2.4?
>
> Davies
>
> On Wed, Nov 5, 2014 at 4:21 PM, Pagliari, Roberto
> <[email protected]> wrote:
>> I'm using this system:
>>
>> Hadoop 1.0.4
>> Scala 2.9.3
>> Hive 0.9.0
>>
>> with Spark 1.1.0. When importing pyspark, I'm getting this error:
>>
>> >>> from pyspark.sql import *
>> Traceback (most recent call last):
>>   File "<stdin>", line 1, in ?
>>   File "/<path>/spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
>>     from pyspark.context import SparkContext
>>   File "/<path>/spark-1.1.0/python/pyspark/context.py", line 209
>>     with SparkContext._lock:
>>        ^
>> SyntaxError: invalid syntax
>>
>> How do I fix it?
>>
>> Thank you,
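For context on why the traceback above points at `with SparkContext._lock:`, the `with` statement was only introduced in Python 2.5 (behind a `__future__` import) and became unconditionally available in 2.6, so on Python 2.4 the parser rejects it as invalid syntax before any Spark code runs. A minimal sketch of a version guard (not part of PySpark itself, just an illustration one could run before importing it):

```python
import sys

# The `with` statement used by pyspark/context.py is a SyntaxError on
# Python 2.4; it needs Python 2.6+ (or 2.5 with a __future__ import).
# Fail early with a clear message instead of a cryptic SyntaxError.
if sys.version_info < (2, 6):
    raise RuntimeError(
        "PySpark requires Python 2.6 or 2.7; found %d.%d"
        % sys.version_info[:2]
    )
```

Running this under the cluster's default interpreter (often an older system Python on Hadoop 1.x-era machines) would confirm whether the interpreter, rather than Spark, is the problem.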
