HyukjinKwon commented on a change in pull request #31207:
URL: https://github.com/apache/spark/pull/31207#discussion_r565030192
##########
File path: python/pyspark/sql/functions.py
##########
@@ -91,13 +92,48 @@ def lit(col):
Creates a :class:`Column` of literal value.
.. versionadded:: 1.3.0
+ .. versionchanged:: 3.2.0
+ Added support for complex type literals.
Examples
--------
    >>> df.select(lit(5).alias('height')).withColumn('spark_user', lit(True)).take(1)
[Row(height=5, spark_user=True)]
- """
- return col if isinstance(col, Column) else _invoke_function("lit", col)
+ >>> df.select(
+ ... lit({"height": 5}).alias("data"),
+ ... lit(["python", "scala"]).alias("languages")
+ ... ).take(1)
+ [Row(data={'height': 5}, languages=['python', 'scala'])]
+ """
+ if isinstance(col, Column):
+ return col
+
+ elif isinstance(col, list):
+ return array(*[lit(x) for x in col])
+
+ elif isinstance(col, tuple):
+ fields = (
+ # Named tuple
+ col._fields if hasattr(col, "_fields")
+ # PySpark Row
+ else col.__fields__ if hasattr(col, "__fields__")
+ # Other
+ else [f"_{i + 1}" for i in range(len(col))]
+ )
+
+ return struct(*[
Review comment:
Hm, okay, but these will become literals after constant folding anyway.
@cloud-fan, WDYT about supporting complex type literals in this way? I think
we could do a similar thing on the Scala side as well to allow map and struct literals.
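For context, the dispatch order the patch adds can be sketched in plain Python without PySpark: a list becomes an `array` of recursively lifted literals, and a tuple becomes a `struct` whose field names come from a namedtuple's `_fields`, a Row-like `__fields__`, or positional `_1`, `_2`, ... names. The tagged tuples below are hypothetical stand-ins for the real Catalyst expressions, not the actual `Column` objects:

```python
# PySpark-free sketch of the lit() dispatch in this patch.
# ("lit", v) / ("array", ...) / ("struct", ...) are placeholder
# representations, not real Catalyst expressions.
from collections import namedtuple

def lit_sketch(value):
    if isinstance(value, list):
        # list -> array of literals, lifted recursively
        return ("array", [lit_sketch(x) for x in value])
    elif isinstance(value, tuple):
        # tuple -> struct; pick field names in the same priority order
        # as the patch: namedtuple, then Row-like, then positional.
        fields = (
            value._fields if hasattr(value, "_fields")
            else getattr(value, "__fields__", None)
            or [f"_{i + 1}" for i in range(len(value))]
        )
        return ("struct", dict(zip(fields, (lit_sketch(x) for x in value))))
    else:
        # scalar -> plain literal
        return ("lit", value)

Point = namedtuple("Point", ["x", "y"])
print(lit_sketch([1, 2]))       # ('array', [('lit', 1), ('lit', 2)])
print(lit_sketch(Point(1, 2)))  # ('struct', {'x': ('lit', 1), 'y': ('lit', 2)})
print(lit_sketch((1, 2)))       # ('struct', {'_1': ('lit', 1), '_2': ('lit', 2)})
```

The constant-folding point above still applies: even built this way from `array`/`struct` calls, the expressions reduce to literals at optimization time.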