GitHub user HyukjinKwon opened a pull request:

    https://github.com/apache/spark/pull/16429

    [WIP][SPARK-19019][PYTHON] Fix hijacked collections.namedtuple to be 
serialized with keyword-only arguments

    ## What changes were proposed in this pull request?
    
    Currently, PySpark does not work with Python 3.6.0.
    
    Running `./bin/pyspark` simply throws the error as below:
    
    ```
    Traceback (most recent call last):
      File ".../spark/python/pyspark/shell.py", line 30, in <module>
        import pyspark
      File ".../spark/python/pyspark/__init__.py", line 46, in <module>
        from pyspark.context import SparkContext
      File ".../spark/python/pyspark/context.py", line 36, in <module>
        from pyspark.java_gateway import launch_gateway
      File ".../spark/python/pyspark/java_gateway.py", line 31, in <module>
        from py4j.java_gateway import java_import, JavaGateway, GatewayClient
      File "<frozen importlib._bootstrap>", line 961, in _find_and_load
      File "<frozen importlib._bootstrap>", line 950, in _find_and_load_unlocked
      File "<frozen importlib._bootstrap>", line 646, in _load_unlocked
      File "<frozen importlib._bootstrap>", line 616, in 
_load_backward_compatible
      File ".../spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", 
line 18, in <module>
      File 
"/usr/local/Cellar/python3/3.6.0/Frameworks/Python.framework/Versions/3.6/lib/python3.6/pydoc.py",
 line 62, in <module>
        import pkgutil
      File 
"/usr/local/Cellar/python3/3.6.0/Frameworks/Python.framework/Versions/3.6/lib/python3.6/pkgutil.py",
 line 22, in <module>
        ModuleInfo = namedtuple('ModuleInfo', 'module_finder name ispkg')
      File ".../spark/python/pyspark/serializers.py", line 394, in namedtuple
        cls = _old_namedtuple(*args, **kwargs)
    TypeError: namedtuple() missing 3 required keyword-only arguments: 
'verbose', 'rename', and 'module'
    ```
    
    The root cause seems to be that the optional arguments of `namedtuple` 
became keyword-only in Python 3.6.0 (see https://bugs.python.org/issue25628).
    
    We currently copy this function via `types.FunctionType`, whose constructor 
does not accept the default values of keyword-only arguments (i.e. 
`namedtuple.__kwdefaults__`). The copied function therefore loses those 
defaults, and calling it without explicitly passing them fails with the 
`TypeError` above.
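    The loss described above can be reproduced in isolation. The sketch below 
(not PySpark code; the function `f` is just an illustration) copies a function 
with keyword-only defaults via `types.FunctionType` and shows that 
`__kwdefaults__` does not survive the copy:

    ```python
    import types

    def f(*args, a=1, b=2):
        return (a, b)

    # types.FunctionType only takes code, globals, name, positional
    # defaults, and closure -- keyword-only defaults are not among them.
    g = types.FunctionType(f.__code__, f.__globals__, f.__name__,
                           f.__defaults__, f.__closure__)

    print(f.__kwdefaults__)  # {'a': 1, 'b': 2}
    print(g.__kwdefaults__)  # None -- so g() raises TypeError
    ```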
    
    This PR proposes to work around this by manually filling in the keyword-only 
defaults via `kwargs`, since `types.FunctionType` does not seem to support 
setting them directly.
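    A minimal sketch of that workaround (illustrative only; the actual patch in 
`pyspark/serializers.py` may differ) is to read the keyword-only defaults off 
the original `namedtuple` and merge them into `kwargs` before delegating:

    ```python
    import collections

    _old_namedtuple = collections.namedtuple
    # Keyword-only defaults of the real namedtuple (None before Python 3.6).
    _old_kwdefaults = _old_namedtuple.__kwdefaults__ or {}

    def namedtuple(*args, **kwargs):
        # Fill in the keyword-only defaults by hand, since a function
        # copied via types.FunctionType would have lost them.
        for k, v in _old_kwdefaults.items():
            kwargs.setdefault(k, v)
        return _old_namedtuple(*args, **kwargs)

    Point = namedtuple('Point', 'x y')
    print(Point(1, 2))  # Point(x=1, y=2)
    ```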
    
    ## How was this patch tested?
    
    Manually tested with Python 3.6.0.
    
    ```
    ./bin/pyspark
    ```

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/HyukjinKwon/spark SPARK-19019

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16429.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #16429
    
----
commit fb049790b5f96070ebd1006630e24bf20c20319a
Author: hyukjinkwon <[email protected]>
Date:   2016-12-29T02:42:28Z

    Fix namedtuple so it can be serialized with keyword-only arguments too

----

