HyukjinKwon commented on a change in pull request #31207:
URL: https://github.com/apache/spark/pull/31207#discussion_r565027496



##########
File path: python/pyspark/sql/functions.py
##########
@@ -91,13 +92,48 @@ def lit(col):
     Creates a :class:`Column` of literal value.
 
     .. versionadded:: 1.3.0
+    .. versionchanged:: 3.2.0
+        Added support for complex type literals.
 
     Examples
     --------
     >>> df.select(lit(5).alias('height')).withColumn('spark_user', lit(True)).take(1)
     [Row(height=5, spark_user=True)]
-    """
-    return col if isinstance(col, Column) else _invoke_function("lit", col)
+    >>> df.select(
+    ...     lit({"height": 5}).alias("data"),
+    ...     lit(["python", "scala"]).alias("languages")
+    ... ).take(1)
+    [Row(data={'height': 5}, languages=['python', 'scala'])]
+    """
+    if isinstance(col, Column):
+        return col
+
+    elif isinstance(col, list):
+        return array(*[lit(x) for x in col])
+
+    elif isinstance(col, tuple):
+        fields = (
+            # Named tuple
+            col._fields if hasattr(col, "_fields")
+            # PySpark Row
+            else col.__fields__ if hasattr(col, "__fields__")
+            # Other
+            else [f"_{i + 1}" for i in range(len(col))]
+        )
+
+        return struct(*[

Review comment:
       One thing that worries me is that this isn't technically a literal. Although the result is foldable, the results and optimizations might be applied differently compared to the Scala side. It also looks like the Scala side doesn't support map and struct literals (due both to typing limitations and to the fact that Scala's `Row` doesn't hold field names the way PySpark's does).
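
       The tuple branch in the diff above resolves struct field names in three ways. A minimal sketch of just that dispatch, in plain Python with no Spark dependency (the `_struct_field_names` helper name is hypothetical, and the `array`/`struct` construction is omitted):

       ```python
       from collections import namedtuple

       def _struct_field_names(value: tuple) -> list:
           """Resolve struct field names for a tuple literal, mirroring the diff:
           named tuples expose _fields, PySpark Rows expose __fields__ (duck-typed
           here), and plain tuples fall back to positional names _1, _2, ...
           """
           if hasattr(value, "_fields"):       # named tuple
               return list(value._fields)
           if hasattr(value, "__fields__"):    # PySpark Row
               return list(value.__fields__)
           return [f"_{i + 1}" for i in range(len(value))]  # plain tuple

       Point = namedtuple("Point", ["x", "y"])
       print(_struct_field_names(Point(1, 2)))   # -> ['x', 'y']
       print(_struct_field_names((10, 20, 30)))  # -> ['_1', '_2', '_3']
       ```

       The positional `_1`, `_2`, ... fallback matches the default field names Spark assigns to unnamed struct fields.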




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
