dongjoon-hyun commented on code in PR #49628:
URL: https://github.com/apache/spark/pull/49628#discussion_r1929479968


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/ClientE2ETestSuite.scala:
##########
@@ -1631,6 +1633,34 @@ class ClientE2ETestSuite
       .create()
      assert(sparkWithLowerMaxMessageSize.range(maxBatchSize).collect().length == maxBatchSize)
   }
+
+  test("Multiple positional parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT ?", Array(0))
+    for (i <- 1 until 10) {

Review Comment:
   Since we don't need `10`, do you think we can reduce this to `2` or `3`?
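   For illustration, the reduced version could look like this (a sketch only; it also folds in the `SPARK-50965: ` prefix requested in another comment, and relies on `spark`, `checkAnswer`, and `Row` already being available in the suite):
   ```scala
   // Sketch: 3 unions exercise the same multi-parameterized-node code path
   // as 10, while keeping the test cheaper.
   test("SPARK-50965: Multiple positional parameterized nodes in the parsed logical plan") {
     var df = spark.sql("SELECT ?", Array(0))
     for (i <- 1 until 3) {
       val temp = spark.sql("SELECT ?", Array(i))
       df = df.union(temp)
     }
     checkAnswer(df, (0 until 3).map(i => Row(i)))
   }
   ```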



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/ClientE2ETestSuite.scala:
##########
@@ -1631,6 +1633,34 @@ class ClientE2ETestSuite
       .create()
      assert(sparkWithLowerMaxMessageSize.range(maxBatchSize).collect().length == maxBatchSize)
   }
+
+  test("Multiple positional parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT ?", Array(0))
+    for (i <- 1 until 10) {
+      val temp = spark.sql("SELECT ?", Array(i))
+      df = df.union(temp)
+    }
+    checkAnswer(df, (0 until 10).map(i => Row(i)))
+  }
+
+  test("Multiple named parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT :key", args = Map("key" -> 0))
+    for (i <- 1 until 10) {

Review Comment:
   ditto. `1 until 2` or `1 until 3`?



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/ClientE2ETestSuite.scala:
##########
@@ -1631,6 +1633,34 @@ class ClientE2ETestSuite
       .create()
      assert(sparkWithLowerMaxMessageSize.range(maxBatchSize).collect().length == maxBatchSize)
   }
+
+  test("Multiple positional parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT ?", Array(0))
+    for (i <- 1 until 10) {
+      val temp = spark.sql("SELECT ?", Array(i))
+      df = df.union(temp)
+    }
+    checkAnswer(df, (0 until 10).map(i => Row(i)))
+  }
+
+  test("Multiple named parameterized nodes in the parsed logical plan") {

Review Comment:
   ```scala
   - test("Multiple named parameterized nodes in the parsed logical plan") {
   + test("SPARK-50965: Multiple named parameterized nodes in the parsed logical plan") {
   ```



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/ClientE2ETestSuite.scala:
##########
@@ -42,14 +42,16 @@ import org.apache.spark.sql.catalyst.parser.ParseException
 import org.apache.spark.sql.connect.ConnectConversions._
 import org.apache.spark.sql.connect.client.{RetryPolicy, SparkConnectClient, SparkResult}
 import org.apache.spark.sql.connect.test.{ConnectFunSuite, IntegrationTestUtils, RemoteSparkSession, SQLHelper}
+import org.apache.spark.sql.connect.test.QueryTest

Review Comment:
   Please merge this line into the above line, @viktorluc-db .
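   The merged line would presumably look like this (a sketch; assuming the brace-grouped import style already used in the file):
   ```scala
   import org.apache.spark.sql.connect.test.{ConnectFunSuite, IntegrationTestUtils, QueryTest, RemoteSparkSession, SQLHelper}
   ```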



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/ClientE2ETestSuite.scala:
##########
@@ -1631,6 +1633,34 @@ class ClientE2ETestSuite
       .create()
      assert(sparkWithLowerMaxMessageSize.range(maxBatchSize).collect().length == maxBatchSize)
   }
+
+  test("Multiple positional parameterized nodes in the parsed logical plan") {

Review Comment:
   If you don't, could you add a test name prefix, `SPARK-50965: `, please?



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/connect/ClientE2ETestSuite.scala:
##########
@@ -1631,6 +1633,34 @@ class ClientE2ETestSuite
       .create()
      assert(sparkWithLowerMaxMessageSize.range(maxBatchSize).collect().length == maxBatchSize)
   }
+
+  test("Multiple positional parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT ?", Array(0))
+    for (i <- 1 until 10) {
+      val temp = spark.sql("SELECT ?", Array(i))
+      df = df.union(temp)
+    }
+    checkAnswer(df, (0 until 10).map(i => Row(i)))
+  }
+
+  test("Multiple named parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT :key", args = Map("key" -> 0))
+    for (i <- 1 until 10) {
+      val temp = spark.sql("SELECT :key", args = Map("key" -> i))
+      df = df.union(temp)
+    }
+    checkAnswer(df, (0 until 10).map(i => Row(i)))
+  }
+
+  test("Multiple named and positional parameterized nodes in the parsed logical plan") {

Review Comment:
   ditto for test name prefix.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
