douglasdennis commented on code in PR #693:
URL: https://github.com/apache/incubator-sedona/pull/693#discussion_r979275270


##########
python/sedona/sql/dataframe_api.py:
##########
@@ -0,0 +1,35 @@
+from typing import Any, Tuple, Union, Iterable
+from pyspark.sql import SparkSession, Column, functions as f
+
+
+ColumnOrName = Union[Column, str]
+ColumnOrNameOrNumber = Union[Column, str, float, int]
+
+
+def _convert_argument_to_java_column(arg: Any) -> Column:
+    if isinstance(arg, Column):
+        return arg._jc
+    elif isinstance(arg, str):
+        return f.col(arg)._jc
+    elif isinstance(arg, Iterable):
+        return f.array(*[Column(x) for x in map(_convert_argument_to_java_column, arg)])._jc

Review Comment:
   No advantage. Just me being silly when I refactored. I originally wanted 
this to use recursion in a different way, but I wasn't feeling it so I 
defaulted to this out of haste :) Will refactor to use function composition. 
Thanks for catching that. 


