nchammas commented on code in PR #45363:
URL: https://github.com/apache/spark/pull/45363#discussion_r1510094130
##########
sql/core/src/main/scala/org/apache/spark/sql/RelationalGroupedDataset.scala:
##########
@@ -324,18 +324,18 @@ class RelationalGroupedDataset protected[sql](
   /**
    * Pivots a column of the current `DataFrame` and performs the specified aggregation.
    *
-   * There are two versions of `pivot` function: one that requires the caller to specify the list
-   * of distinct values to pivot on, and one that does not. The latter is more concise but less
-   * efficient, because Spark needs to first compute the list of distinct values internally.
-   *
    * {{{
    *   // Compute the sum of earnings for each year by course with each course as a separate column
-   *   df.groupBy("year").pivot("course", Seq("dotNET", "Java")).sum("earnings")
-   *
-   *   // Or without specifying column values (less efficient)
    *   df.groupBy("year").pivot("course").sum("earnings")
    * }}}
    *
+   * @note Spark will '''eagerly''' compute the distinct values in `pivotColumn` so it can determine
+   * the resulting schema of the transformation. Depending on the size and complexity of your
+   * data, this may take some time. In other words, though the pivot transformation is lazy like
+   * most DataFrame transformations, computing the distinct pivot values is not. To avoid any
+   * eager computations, provide an explicit list of values via
+   * `pivot(pivotColumn: String, values: Seq[Any])`.
Review Comment:
   I probably spent about an hour trying to get this to work as a proper link via `[[pivot(...]]`, per [the scaladoc docs on ambiguous links][1], but I could not get it to work.

   [1]: https://docs.scala-lang.org/overviews/scaladoc/for-library-authors.html#resolving-ambiguous-links-within-scaladoc-comments
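   For readers following the thread, the trade-off the new `@note` describes can be sketched roughly as below. This is a minimal, illustrative sketch, not runnable outside a Spark environment; the app name and the toy `df` are made up for the example:

   ```scala
   // Sketch only: assumes a Spark runtime is available on the classpath.
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder().appName("pivot-sketch").getOrCreate()
   import spark.implicits._

   val df = Seq(
     (2012, "dotNET", 10000), (2012, "Java", 20000),
     (2013, "dotNET", 5000),  (2013, "Java", 30000)
   ).toDF("year", "course", "earnings")

   // Fully lazy: the distinct pivot values are supplied up front, so the
   // result schema is known and no job runs until an action is called.
   val explicit = df.groupBy("year").pivot("course", Seq("dotNET", "Java")).sum("earnings")

   // Here pivot() must eagerly run a job to collect the distinct values of
   // `course` before it can determine the result schema -- this is the
   // eager computation the @note warns about.
   val inferred = df.groupBy("year").pivot("course").sum("earnings")
   ```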
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]