Github user BryanCutler commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20373#discussion_r163419942
  
    --- Diff: python/pyspark/cloudpickle.py ---
    @@ -318,6 +329,18 @@ def save_function(self, obj, name=None):
             Determines what kind of function obj is (e.g. lambda, defined at
             interactive prompt, etc) and handles the pickling appropriately.
             """
    +        if obj in _BUILTIN_TYPE_CONSTRUCTORS:
    +            # We keep a special-cased cache of built-in type constructors at
    +            # global scope, because these functions are structured very
    +            # differently in different python versions and implementations
    +            # (for example, they're instances of types.BuiltinFunctionType in
    +            # CPython, but they're ordinary types.FunctionType instances in
    +            # PyPy).
    +            #
    +            # If the function we've received is in that cache, we just
    +            # serialize it as a lookup into the cache.
    +            return self.save_reduce(_BUILTIN_TYPE_CONSTRUCTORS[obj], (), obj=obj)
    +
    --- End diff ---
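
    The CPython side of the distinction drawn in the quoted comment is easy to observe directly (a quick illustrative check, not part of the patch; results shown are for CPython, where builtins are `types.BuiltinFunctionType` rather than PyPy's plain `types.FunctionType`):

    ```python
    import types

    # On CPython, built-in functions such as `len` are instances of
    # types.BuiltinFunctionType; a dynamically defined Python function
    # is an ordinary types.FunctionType instead.
    print(isinstance(len, types.BuiltinFunctionType))

    def user_func():
        pass

    print(isinstance(user_func, types.FunctionType))
    print(isinstance(user_func, types.BuiltinFunctionType))
    ```

    This is why a simple isinstance check is not enough to recognize built-in constructors portably, motivating the explicit cache in the diff above.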
    
    BUG: this hits the builtin type cache for any function; see https://github.com/cloudpipe/cloudpickle/commit/d84980ccaafc7982a50d4e04064011f401f17d1b
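
    A hedged sketch of the failure mode being flagged (names here are illustrative, not cloudpickle's actual internals): a membership test like `obj in _BUILTIN_TYPE_CONSTRUCTORS` goes through `__hash__`/`__eq__`, so an object with permissive equality, such as a mock-style object, can falsely hit a cache keyed by builtin constructors and end up serialized as one of them.

    ```python
    class EqualsEverything:
        """Stand-in for objects whose __eq__ matches anything (mock-like)."""
        def __eq__(self, other):
            return True

        def __hash__(self):
            # Collide with the cached key's hash so the dict bucket
            # probe falls through to __eq__ during the lookup.
            return hash(dict)

    # Toy stand-in for the constructor cache in the diff above.
    cache = {dict: "reconstruct-dict"}

    impostor = EqualsEverything()
    print(impostor in cache)       # the impostor wrongly matches the cache
    print(cache[impostor])         # and resolves to the builtin's entry
    ```

    Guarding the lookup with an explicit type check on `obj` (or keying the cache by identity) avoids this class of false positive.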


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
