Spark 2.0 is dropping support for Python 2.6; it will only work with
Python 2.7 and 3.4+.
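
In the meantime you can point run-tests at a specific interpreter so
the suite skips 2.6. A minimal sketch, assuming the python/run-tests
script in your checkout accepts the --python-executables flag:

    # Run the PySpark tests against Python 2.7 only, skipping the
    # default 2.6 executable (which lacks functools.total_ordering)
    ./python/run-tests --python-executables=python2.7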

On Thu, Mar 10, 2016 at 11:17 PM, Gayathri Murali
<gayathri.m.sof...@gmail.com> wrote:
> Hi all,
>
> I am trying to run python unit tests.
>
> I currently have Python 2.6 and 2.7 installed. I installed unittest2 against 
> both of them.
>
> When I try to run /python/run-tests with Python 2.7, I get the following error:
>
> Please install unittest2 to test with Python 2.6 or earlier
> Had test failures in pyspark.sql.tests with python2.6; see logs.
>
> When I try to run /python/run-tests with Python 2.6, I get the following error:
>
> Traceback (most recent call last):
>   File "./python/run-tests.py", line 42, in <module>
>     from sparktestsupport.modules import all_modules  # noqa
>   File "/Users/gayathri/spark/python/../dev/sparktestsupport/modules.py", 
> line 18, in <module>
>     from functools import total_ordering
> ImportError: cannot import name total_ordering
>
> total_ordering is a decorator in the functools module that was only added in Python 2.7.
>
> Can someone help?
>
> Thanks
> Gayathri
