Github user mgaido91 commented on a diff in the pull request:
https://github.com/apache/spark/pull/23258#discussion_r240003036
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/metric/SQLMetricsSuite.scala ---
@@ -182,10 +182,13 @@ class SQLMetricsSuite extends SparkFunSuite with SQLMetricsTestUtils with Shared
}
test("Sort metrics") {
- // Assume the execution plan is
- // WholeStageCodegen(nodeId = 0, Range(nodeId = 2) -> Sort(nodeId = 1))
- val ds = spark.range(10).sort('id)
- testSparkPlanMetrics(ds.toDF(), 2, Map.empty)
+ // Assume the execution plan with node id is
+ // Sort(nodeId = 0)
+ // Exchange(nodeId = 1)
+ // LocalTableScan(nodeId = 2)
+ val df = Seq(1, 3, 2).toDF("id").sort('id)
+ testSparkPlanMetrics(df, 2, Map.empty)
--- End diff --
Thanks for pinging me @maropu. What is the point of checking that
`LocalTableScan` contains no metrics?
I checked the original PR which introduced this UT, by @sameeragarwal, who
can maybe help us by stating the goal of the test here (unless someone else
can answer me, because I have not understood it). It doesn't even seem
related to the Sort operator to me. Maybe I am missing something.
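For what it's worth, if the intent was to exercise the Sort operator itself, a check along these lines might be more direct than passing `Map.empty`. This is only a sketch: the expected-metrics map shape (`nodeId -> (operator name, metric name -> value)`) is assumed from the other tests in this suite, and I am asserting only the operator name for node 0, since Sort's own metrics (sort time, peak memory, spill size) are nondeterministic and hard to pin to exact values:

```scala
// Sketch, not the actual test: assumes the testSparkPlanMetrics helper
// from SQLMetricsTestUtils, which matches nodeId -> (operator name, metrics).
val df = Seq(1, 3, 2).toDF("id").sort('id)
// Assert that node 0 of the plan is a Sort; an empty metrics map here means
// "check the operator, not its metric values", which sidesteps the
// nondeterminism of sort time / peak memory / spill size.
testSparkPlanMetrics(df, 2, Map(
  0L -> (("Sort", Map.empty[String, Any]))
))
```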
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]