Yicong-Huang commented on code in PR #54172:
URL: https://github.com/apache/spark/pull/54172#discussion_r2795612492
##########
python/pyspark/worker.py:
##########
@@ -217,6 +217,46 @@ def chain(f, g):
return lambda *a: g(f(*a))
+def verify_result(expected_type: type) -> Callable[[Any], Iterator]:
Review Comment:
Good question! The `verify_result` helper extracts the **common validation
pattern** (iterator check + element type check) that appears repeatedly in many
`wrap_` functions.
However, it's **not intended to replace all `wrap_` functions entirely**,
because those functions have additional responsibilities beyond basic type
validation. That said, `verify_result` is general enough to validate that a
result is an iterator whose elements are of a given type.
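For reference, a minimal sketch of the pattern described above (the names and error messages here are illustrative, not the exact Spark implementation): an eager iterator check, then a lazy per-element type check.

```python
from collections.abc import Iterator


def verify_result(expected_type):
    """Return a validator that checks a UDF result is an iterator of expected_type."""

    def verify(result):
        # Check 1 (eager): the UDF must return an iterator, not e.g. a list.
        if not isinstance(result, Iterator):
            raise TypeError(
                "Return type of the user-defined function should be "
                "iterator, but is %s" % type(result).__name__
            )

        # Check 2 (lazy): every yielded element must be of the expected type.
        def check_elements():
            for element in result:
                if not isinstance(element, expected_type):
                    raise TypeError(
                        "Each element should be of type %s, but is %s"
                        % (expected_type.__name__, type(element).__name__)
                    )
                yield element

        return check_elements()

    return verify
```

Because the element check is a generator, validation stays streaming and does not materialize the whole iterator.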
##########
python/pyspark/worker.py:
##########
@@ -2819,25 +2824,48 @@ def read_udfs(pickleSer, infile, eval_type,
runner_conf, eval_conf):
for i in range(num_udfs)
]
+ if eval_type == PythonEvalType.SQL_MAP_ARROW_ITER_UDF:
+ import pyarrow as pa
+
+ assert num_udfs == 1, "One MAP_ARROW_ITER UDF expected here."
+ udf_func = udfs[0]
Review Comment:
added!
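To make the branch above concrete, here is a hypothetical sketch of how the single `SQL_MAP_ARROW_ITER_UDF` function picked out of `udfs` could be wrapped so its output is verified as an iterator of Arrow record batches. The names (`wrap_map_arrow_iter`, `FakeRecordBatch`) are illustrative, and `FakeRecordBatch` stands in for `pyarrow.RecordBatch` so the sketch runs without pyarrow installed.

```python
from collections.abc import Iterator


class FakeRecordBatch:
    """Stand-in for pyarrow.RecordBatch in this sketch."""


def wrap_map_arrow_iter(udf_func, batch_type=FakeRecordBatch):
    """Wrap a mapInArrow-style UDF so its result is validated as it streams."""

    def mapper(batch_iter):
        result = udf_func(batch_iter)
        # The UDF must return an iterator of record batches.
        if not isinstance(result, Iterator):
            raise TypeError(
                "mapInArrow UDF should return an iterator, but got %s"
                % type(result).__name__
            )
        for batch in result:
            if not isinstance(batch, batch_type):
                raise TypeError(
                    "Each element should be %s, but is %s"
                    % (batch_type.__name__, type(batch).__name__)
                )
            yield batch

    return mapper
```

Since `mapper` is itself a generator, validation happens batch by batch as the downstream serializer consumes the output.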
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]