EnricoMi commented on code in PR #37407:
URL: https://github.com/apache/spark/pull/37407#discussion_r970587522


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##########
@@ -1378,28 +1378,84 @@ case class Pivot(
  * A constructor for creating an Unpivot, which will later be converted to an [[Expand]]
  * during the query analysis.
  *
- * An empty values array will be replaced during analysis with all resolved outputs of child except
+ * Either ids or values array must be set. The ids array can be empty,
+ * the values array must not be empty if not None.
+ *
+ * A None ids array will be replaced during analysis with all resolved outputs of child except
+ * the values. This expansion allows to easily select all non-value columns as id columns.
+ *
+ * A None values array will be replaced during analysis with all resolved outputs of child except

Review Comment:
   The Scala / Java API always sets ids.
   The Python API calls into the Scala API.
   The SQL API always sets values.
   
   There is no code path that creates `Unpivot` with both `ids` and `values` being None. That is why there is no rule in CheckAnalysis and no user-facing error for this case.
   
   I could add an assert in `Unpivot` to handle this case and provide a useful error / comment in the code.
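   A minimal, hedged sketch of what such an assert could look like (the real `Unpivot` takes `Expression`-typed id/value columns and a child plan; plain `String`s stand in here purely for illustration):

   ```scala
   // Toy stand-in for the Spark Unpivot operator, not the actual Spark code.
   // The assert rejects the case that no API code path can reach today:
   // both ids and values being None.
   case class Unpivot(
       ids: Option[Seq[String]],      // None => resolved from child output during analysis
       values: Option[Seq[String]]) { // None => resolved from child output during analysis
     assert(ids.isDefined || values.isDefined,
       "Unpivot requires either the ids or the values array to be set")
   }
   ```

   With this in place, an accidental `Unpivot(None, None)` would fail fast with a clear message instead of surfacing a confusing error later in analysis.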
   
   The SQL API requires `Unpivot` to allow for `ids = None` because the SQL syntax does not allow specifying any id columns. The SELECT clause contains that information, but it is inaccessible when parsing the UNPIVOT clause. So resolving ids is done in the analysis phase, not in the SQL parsing phase.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

