cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r891397088


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##########
@@ -1227,6 +1227,49 @@ case class Pivot(
   override protected def withNewChildInternal(newChild: LogicalPlan): Pivot = copy(child = newChild)
 }
 
+/**
+ * A constructor for creating a melt, which will later be converted to an [[Expand]]
+ * during the query analysis.
+ *
+ * An empty values array will be replaced during analysis with all resolved outputs
+ * of child except the ids. This expansion makes it easy to melt all non-id columns.
+ *
+ * The type of the value column is derived from all value columns during analysis
+ * once all values are resolved. All values' types have to be compatible; otherwise
+ * the individual values cannot be assigned to the result value column and an
+ * AnalysisException is thrown.
+ *
+ * @see `org.apache.spark.sql.catalyst.analysis.TypeCoercionBase.MeltCoercion`
+ *
+ * @param ids                Id columns
+ * @param values             Value columns to melt
+ * @param variableColumnName Name of the variable column
+ * @param valueColumnName    Name of the value column
+ * @param valueType          Type of value column once known
+ * @param child              Child operator
+ */
+case class Melt(
+    ids: Seq[NamedExpression],
+    values: Seq[NamedExpression],

Review Comment:
   I think the id and value columns can simply be `Expression` instead of `NamedExpression`. In the analyzer rule, we can fail if the id or value columns end up as plain (non-named) expressions, e.g. `df.melt(Array($"id" + 1), ...)`
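
   To illustrate the suggestion, here is a minimal sketch of such an analyzer check. It uses stand-in types rather than Spark's real Catalyst classes (`Expression`, `NamedExpression`, `AttributeReference`, `Add`, and `checkMeltColumns` below are illustrative only, and a real rule would raise `AnalysisException` rather than `IllegalArgumentException`):

   ```scala
   // Stand-in expression hierarchy, loosely mirroring Catalyst's shape.
   sealed trait Expression
   trait NamedExpression extends Expression { def name: String }
   case class AttributeReference(name: String) extends NamedExpression
   case class Add(left: Expression, right: Expression) extends Expression

   // With ids/values typed as Seq[Expression], the analyzer rule can
   // reject any column that did not resolve to a named expression.
   def checkMeltColumns(columns: Seq[Expression]): Unit =
     columns.foreach {
       case _: NamedExpression => // ok: a plain column reference passes
       case other =>
         // Spark would throw an AnalysisException here.
         throw new IllegalArgumentException(
           s"MELT requires named columns, got: $other")
     }
   ```

   So `checkMeltColumns(Seq(AttributeReference("id")))` passes, while an unnamed expression like `Add(AttributeReference("id"), AttributeReference("one"))` (the analogue of `$"id" + 1`) is rejected.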



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
