Copilot commented on code in PR #53817:
URL: https://github.com/apache/spark/pull/53817#discussion_r2705637351


##########
sql/core/src/main/scala/org/apache/spark/sql/execution/adaptive/AQEShuffleReadExec.scala:
##########
@@ -278,4 +278,13 @@ case class AQEShuffleReadExec private(
 
   override protected def withNewChildInternal(newChild: SparkPlan): AQEShuffleReadExec =
     copy(child = newChild)
+
+  override def simpleStringWithNodeId(): String = {
+    if (stringArgs.nonEmpty) {
+      super.simpleStringWithNodeId() + ", " + stringArgs.mkString(", ")
+    } else {
+      super.simpleStringWithNodeId()
+    }
+  }

Review Comment:
   The check `stringArgs.nonEmpty` will always return true because `stringArgs` 
returns an Iterator with a single element (which may be an empty string). When 
there are no special properties, `stringArgs` returns `Iterator("")`, so this 
check will still evaluate to true, resulting in output like "AQEShuffleRead 
(6), " with a trailing comma and space. 
   
   The condition should check if the string content is non-empty, not just if 
the iterator is non-empty. Consider using 
`stringArgs.filter(_.toString.nonEmpty)` to filter out empty strings before 
checking, or restructure the logic in `stringArgs` to return an empty Iterator 
when there are no properties.
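
   A minimal sketch of one possible shape for that fix (not the PR author's actual change; it assumes `stringArgs` keeps its current single-element behavior and that materializing it into a `Seq` here is acceptable):

```scala
  override def simpleStringWithNodeId(): String = {
    // Materialize once so the iterator is not consumed twice, and drop the
    // empty placeholder that stringArgs emits when there are no properties.
    val args = stringArgs.map(_.toString).filter(_.nonEmpty).toSeq
    if (args.nonEmpty) {
      super.simpleStringWithNodeId() + ", " + args.mkString(", ")
    } else {
      super.simpleStringWithNodeId()
    }
  }
```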



##########
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:
##########
@@ -3372,6 +3372,57 @@ class AdaptiveQueryExecSuite
     }
   }
 
+  test("SPARK-55052: Verify exposed AQEShuffleRead properties in Physical Plan 
Tree") {
+    withSQLConf(
+      SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true",
+      SQLConf.AUTO_BROADCASTJOIN_THRESHOLD.key -> "-1",
+      SQLConf.SKEW_JOIN_SKEWED_PARTITION_THRESHOLD.key -> "100",
+      SQLConf.ADVISORY_PARTITION_SIZE_IN_BYTES.key -> "100",
+      SQLConf.SHUFFLE_PARTITIONS.key -> "10") {
+      withTempView("view1", "skewDataView2") {
+        spark
+          .range(0, 1000, 1, 10)
+          .selectExpr("id % 10 as key1")
+          .createOrReplaceTempView("view1")
+        spark
+          .range(0, 200, 1, 10)
+          .selectExpr("id % 1 as key2")
+          .createOrReplaceTempView("skewDataView2")
+
+        val df = spark.sql("SELECT key1 FROM view1 JOIN skewDataView2 ON key1 = key2")
+        df.collect()
+        val explainPlanString =
+          df.queryExecution.explainString(ExplainMode.fromString("FORMATTED"))
+
+        assert(explainPlanString.contains(getExpectedPhysicalPlanSubstring()),
+          "Physical Plan should include expected Physical Plan Substring")
+      }
+    }
+  }

Review Comment:
   The test only validates the cases where AQEShuffleRead has properties 
(coalesced, coalesced and skewed), but doesn't test the case where 
AQEShuffleRead has no special properties (no local/coalesced/skewed). Consider 
adding a test case that validates the output when there are no special 
properties to ensure the formatting is correct in that scenario.
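
   A rough sketch of such a case (hypothetical: the test name, query, and configuration are illustrative only and may need tuning before the plan actually produces an AQEShuffleRead node with no special properties); it asserts that no AQEShuffleRead line in the formatted plan ends with a dangling comma:

```scala
  test("SPARK-55052: AQEShuffleRead with no special properties has no trailing comma") {
    withSQLConf(
      SQLConf.ADAPTIVE_EXECUTION_ENABLED.key -> "true",
      SQLConf.SHUFFLE_PARTITIONS.key -> "10") {
      withTempView("view1") {
        spark
          .range(0, 1000, 1, 10)
          .selectExpr("id % 10 as key1")
          .createOrReplaceTempView("view1")
        val df = spark.sql("SELECT key1, count(*) FROM view1 GROUP BY key1")
        df.collect()
        val explainPlanString =
          df.queryExecution.explainString(ExplainMode.fromString("FORMATTED"))
        // No AQEShuffleRead node description should end with a dangling ", "
        assert(!explainPlanString.linesIterator.exists { line =>
          line.contains("AQEShuffleRead") && line.trim.endsWith(",")
        })
      }
    }
  }
```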



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

