andygrove commented on a change in pull request #31793:
URL: https://github.com/apache/spark/pull/31793#discussion_r590996815
##########
File path:
sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/CustomShuffleReaderExec.scala
##########
@@ -179,12 +179,12 @@ case class CustomShuffleReaderExec private(
}
private lazy val shuffleRDD: RDD[_] = {
- sendDriverMetrics()
Review comment:
I looked into writing a test. I would like to do something like this in
`AdaptiveQueryExecSuite`:
```scala
test("operating on canonicalized plan") {
  withSQLConf(SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true") {
    val (_, adaptivePlan) = runAdaptiveAndVerifyResult(
      "SELECT key FROM testData GROUP BY key")
    val readers = collect(adaptivePlan) {
      case r: CustomShuffleReaderExec => r
    }
    assert(readers.length == 1)
    val reader = readers.head
    val ex = intercept[IllegalStateException] {
      val c = reader.canonicalized.asInstanceOf[CustomShuffleReaderExec]
      c.doExecute()
    }
    assert(ex.getMessage === "operating on canonicalized plan")
  }
}
```
The problem is that `doExecute` is protected, so the test can't call it directly,
and calling `execute` instead doesn't work because it performs its own, separate
checks for canonicalized plans before reaching `doExecute`.
What would be the recommended approach here?
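For what it's worth, one generic JVM workaround (I'm not sure it's the approach preferred in Spark's test suites) is to invoke the protected method via reflection and unwrap the cause from the `InvocationTargetException`. A toy sketch, where `Plan` is a placeholder class standing in for `CustomShuffleReaderExec`, not real Spark code:

```scala
import java.lang.reflect.InvocationTargetException

// Placeholder stand-in for a SparkPlan with a protected doExecute.
class Plan {
  protected def doExecute(): String =
    throw new IllegalStateException("operating on canonicalized plan")
}

object ProtectedCallDemo {
  // Invoke the protected method reflectively and return the real cause.
  def invokeDoExecute(plan: Plan): Throwable = {
    val m = classOf[Plan].getDeclaredMethod("doExecute")
    m.setAccessible(true) // bypass the `protected` modifier
    try {
      m.invoke(plan)
      sys.error("expected doExecute to throw")
    } catch {
      // reflection wraps the thrown exception, so unwrap it
      case e: InvocationTargetException => e.getCause
    }
  }

  def main(args: Array[String]): Unit = {
    val cause = invokeDoExecute(new Plan)
    assert(cause.isInstanceOf[IllegalStateException])
    assert(cause.getMessage == "operating on canonicalized plan")
  }
}
```

In the real test this would mean looking up `doExecute` on the reader's class and asserting on the unwrapped cause instead of using `intercept` directly.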
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.