Github user tejasapatil commented on a diff in the pull request:
https://github.com/apache/spark/pull/20226#discussion_r166425954
--- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/ExistingRDD.scala ---
@@ -103,6 +103,8 @@ case class ExternalRDDScanExec[T](
  override lazy val metrics = Map(
    "numOutputRows" -> SQLMetrics.createMetric(sparkContext, "number of output rows"))
+  override val nodeName: String = s"Scan ExternalRDD ${output.map(_.name).mkString("[", ",", "]")}"
--- End diff --
My intention here was to be able to distinguish between
`ExternalRDDScanExec` nodes. If we remove the `output` part from `nodeName`,
then all of these nodes would be named `Scan ExternalRDD`, which is too
generic to tell them apart.
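To illustrate the point, here is a minimal self-contained sketch (not
Spark's actual classes; `Attribute` and `ScanNode` are hypothetical
stand-ins) showing how appending the output attribute names makes
otherwise identical scan nodes distinguishable:

```scala
// Hypothetical stand-in for Spark's Attribute: just carries a column name.
case class Attribute(name: String)

// Hypothetical stand-in for a physical scan node.
case class ScanNode(output: Seq[Attribute]) {
  // Without the output part, every node would share this generic name.
  val genericName: String = "Scan ExternalRDD"

  // With the output schema appended, e.g. "Scan ExternalRDD [id,value]",
  // each node's name reflects the columns it produces.
  val nodeName: String =
    s"Scan ExternalRDD ${output.map(_.name).mkString("[", ",", "]")}"
}

object Demo {
  def main(args: Array[String]): Unit = {
    val a = ScanNode(Seq(Attribute("id"), Attribute("value")))
    val b = ScanNode(Seq(Attribute("ts")))
    println(a.nodeName) // Scan ExternalRDD [id,value]
    println(b.nodeName) // Scan ExternalRDD [ts]
  }
}
```

Two nodes scanning different RDDs then render with different names in the
plan, while dropping the `output` suffix would collapse both to the same
`Scan ExternalRDD` label.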
---