zhengruifeng commented on code in PR #38395:
URL: https://github.com/apache/spark/pull/38395#discussion_r1010072313


##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##########
@@ -2101,3 +2101,53 @@ object AsOfJoin {
     }
   }
 }
+
+
+/**
+ * A logical plan for summary.
+ */
+case class UnresolvedSummary(
+    child: LogicalPlan,
+    statistics: Seq[String]) extends UnaryNode {
+
+  private lazy val supported =
+    Set("count", "count_distinct", "approx_count_distinct", "mean", "stddev",
+      "min", "max")
+
+  {
+    // TODO: throw AnalysisException instead
+    require(statistics.nonEmpty)

Review Comment:
   Since there are several UTs checking the thrown exception, I left it as is to keep the behavior.
   I think we can define new error classes in separate PRs.
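
   As a rough sketch of what the follow-up could look like, the `require(...)` checks could be replaced with a dedicated exception carrying an error class. The snippet below is a standalone illustration, not Spark's actual implementation: `SummaryAnalysisException` and the error-class names are hypothetical stand-ins for whatever gets defined in the error-class PRs.

   ```scala
   // Hypothetical exception type carrying an error class, standing in for
   // Spark's AnalysisException + error-class machinery.
   class SummaryAnalysisException(val errorClass: String, message: String)
     extends Exception(message)

   object SummaryValidator {
     // Same supported set as in the patch above.
     private val supported =
       Set("count", "count_distinct", "approx_count_distinct", "mean", "stddev",
         "min", "max")

     def validate(statistics: Seq[String]): Unit = {
       if (statistics.isEmpty) {
         throw new SummaryAnalysisException(
           "EMPTY_SUMMARY_STATISTICS", // hypothetical error class name
           "summary() requires at least one statistic")
       }
       statistics.filterNot(supported.contains).foreach { s =>
         throw new SummaryAnalysisException(
           "UNSUPPORTED_SUMMARY_STATISTIC", // hypothetical error class name
           s"Unsupported statistic: $s; supported: ${supported.mkString(", ")}")
       }
     }
   }
   ```

   Keeping the `require` for now preserves the `IllegalArgumentException` that the existing UTs assert on; switching to error classes would change both the exception type and the message format, which is why it makes sense as a separate change.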



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
