GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/18682

    [MINOR][DOCS] Fix some missing notes for Python 2.6 support drop

    ## What changes were proposed in this pull request?
    
    After SPARK-12661, we officially dropped Python 2.6 support. It looks like a
few places are still missing this note.
    
    I grepped for "Python 2.6" and "python 2.6"; the results are below:
    
    ```
    ./core/src/main/scala/org/apache/spark/api/python/SerDeUtil.scala:  // Unpickle array.array generated by Python 2.6
    ./docs/index.md:Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 were removed as of Spark 2.2.0.
    ./docs/rdd-programming-guide.md:Spark {{site.SPARK_VERSION}} works with Python 2.6+ or Python 3.4+. It can use the standard CPython interpreter,
    ./docs/rdd-programming-guide.md:Note that support for Python 2.6 is deprecated as of Spark 2.0.0, and may be removed in Spark 2.2.0.
    ./python/pyspark/context.py:            warnings.warn("Support for Python 2.6 is deprecated as of Spark 2.0.0")
    ./python/pyspark/ml/tests.py:        sys.stderr.write('Please install unittest2 to test with Python 2.6 or earlier')
    ./python/pyspark/mllib/tests.py:        sys.stderr.write('Please install unittest2 to test with Python 2.6 or earlier')
    ./python/pyspark/serializers.py:        # On Python 2.6, we can't write bytearrays to streams, so we need to convert them
    ./python/pyspark/sql/tests.py:        sys.stderr.write('Please install unittest2 to test with Python 2.6 or earlier')
    ./python/pyspark/streaming/tests.py:        sys.stderr.write('Please install unittest2 to test with Python 2.6 or earlier')
    ./python/pyspark/tests.py:        sys.stderr.write('Please install unittest2 to test with Python 2.6 or earlier')
    ./python/pyspark/tests.py:        # NOTE: dict is used instead of collections.Counter for Python 2.6
    ./python/pyspark/tests.py:        # NOTE: dict is used instead of collections.Counter for Python 2.6
    ```
    
    This PR only proposes to update the user-facing occurrences below (a rough sketch of the `context.py` warning being removed follows the list):
    
    ```
    ./docs/rdd-programming-guide.md:Spark {{site.SPARK_VERSION}} works with Python 2.6+ or Python 3.4+. It can use the standard CPython interpreter,
    ./docs/rdd-programming-guide.md:Note that support for Python 2.6 is deprecated as of Spark 2.0.0, and may be removed in Spark 2.2.0.
    ./python/pyspark/context.py:            warnings.warn("Support for Python 2.6 is deprecated as of Spark 2.0.0")
    ```
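    
    For context, the `python/pyspark/context.py` warning above is a version-gated deprecation message. The sketch below shows roughly how such a guard is emitted and why it is removed rather than reworded once 2.6 support is gone; the exact surrounding check is an assumption, not the actual Spark code:
    
    ```python
    import sys
    import warnings
    
    # Minimal sketch of a version-gated deprecation warning (assumed structure,
    # not the exact pyspark/context.py code). Once Python 2.6 support is dropped,
    # this guard and its warning are removed outright instead of being updated.
    if sys.version_info[:2] <= (2, 6):
        warnings.warn("Support for Python 2.6 is deprecated as of Spark 2.0.0")
    ```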
    
    This one is already correct:
    
    ```
    ./docs/index.md:Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 were removed as of Spark 2.2.0.
    ```
    
    ## How was this patch tested?
    
    ```bash
    grep -r "Python 2.6" .
    grep -r "python 2.6" .
    ```
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark minor-python.26

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/18682.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #18682
    
----
commit 95e9d235a5893f9cc0d7e3b679f8e19ceacd607e
Author: hyukjinkwon <[email protected]>
Date:   2017-07-19T14:25:17Z

    Fix some missing notes for Python 2.6 support drop

----


