Github user e-dorigatti commented on a diff in the pull request:
https://github.com/apache/spark/pull/21383#discussion_r191418719
--- Diff: python/pyspark/sql/udf.py ---
@@ -157,7 +157,17 @@ def _create_judf(self):
         spark = SparkSession.builder.getOrCreate()
         sc = spark.sparkContext
-        wrapped_func = _wrap_function(sc, self.func, self.returnType)
+        func = fail_on_stopiteration(self.func)
+
+        # prevent inspect from failing
+        # e.g. inspect.getargspec(sum) raises
+        # TypeError: <built-in function sum> is not a Python function
+        try:
+            func._argspec = _get_argspec(self.func)
+        except TypeError:
--- End diff --
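
For context on the hunk above, here is a minimal standalone sketch of the pattern, assuming a simplified `fail_on_stopiteration` and using the standard-library `inspect.getfullargspec` in place of PySpark's internal `_get_argspec`; it shows why the argspec lookup is guarded with `except TypeError` when the UDF is a C-level callable such as `sum`:

```python
import inspect


def fail_on_stopiteration(f):
    """Simplified stand-in for PySpark's helper: re-raise a leaked
    StopIteration as RuntimeError so it cannot silently terminate the
    iterator that is consuming the UDF's output."""
    def wrapper(*args, **kwargs):
        try:
            return f(*args, **kwargs)
        except StopIteration as exc:
            raise RuntimeError("StopIteration raised inside a UDF") from exc
    return wrapper


func = fail_on_stopiteration(sum)

# The argspec is looked up on the *original* function and attached to the
# wrapper, but inspecting C-level callables (e.g. inspect.getargspec(sum)
# on the Python versions targeted by the PR) raises TypeError, hence the guard.
try:
    func._argspec = inspect.getfullargspec(sum)
except TypeError:
    pass  # no argspec available for this callable
```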
The only way to do it there is to pass another argument to
`UserDefinedFunction.__init__` carrying the wrapped function, because
`UserDefinedFunction._wrapped` needs the original function (for the docstring, etc.).
So I think what you suggest is actually messier, but of course I will do it
if you think it's the better way.
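
A short illustration of that constraint, with purely illustrative names (this is not PySpark's actual `_wrapped` code): a wrapper placed around the UDF loses the original function's docstring, so the user-facing callable has to be built from the original, unwrapped function.

```python
import functools


def original(x):
    """Add one to x."""
    return x + 1


def guard(f):
    # stand-in for the internal wrapping done in _create_judf
    def inner(*args, **kwargs):
        return f(*args, **kwargs)
    return inner


guarded = guard(original)
print(guarded.__doc__)     # None: the wrapper lost the original's docstring

# The user-facing wrapper is therefore built from `original`,
# which is why _wrapped keeps a reference to the unwrapped function.
@functools.wraps(original)
def public_udf(*args):
    return guarded(*args)

print(public_udf.__doc__)  # 'Add one to x.'
```

This is also why passing only the wrapped function into `UserDefinedFunction.__init__` would force a second argument carrying the original one.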