MaxGekk commented on code in PR #49628:
URL: https://github.com/apache/spark/pull/49628#discussion_r1928877897


##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -42,6 +42,7 @@ import org.apache.spark.sql.connect.client.{RetryPolicy, SparkConnectClient, Spa
 import org.apache.spark.sql.functions._
 import org.apache.spark.sql.internal.SqlApiConf
 import org.apache.spark.sql.test.{ConnectFunSuite, IntegrationTestUtils, RemoteSparkSession, SQLHelper}
+import org.apache.spark.sql.test.QueryTest.checkAnswer

Review Comment:
   Please remove this import and extend `QueryTest` instead.
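
   For illustration, a minimal sketch of what the reviewer suggests, assuming `ClientE2ETestSuite` can extend `org.apache.spark.sql.test.QueryTest` alongside the traits it already mixes in (`RemoteSparkSession`, `SQLHelper`); the suite's actual base-class hierarchy is not shown in this hunk:

   ```scala
   // Sketch only: inherit checkAnswer from QueryTest instead of importing it statically.
   import org.apache.spark.sql.test.{QueryTest, RemoteSparkSession, SQLHelper}

   class ClientE2ETestSuite
       extends QueryTest        // provides checkAnswer and a `spark` handle
       with RemoteSparkSession  // binds that handle to the remote Spark Connect session
       with SQLHelper {
     // Tests can then call the inherited checkAnswer(df, expectedRows) directly,
     // without `import org.apache.spark.sql.test.QueryTest.checkAnswer`.
   }
   ```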



##########
connector/connect/client/jvm/src/test/scala/org/apache/spark/sql/ClientE2ETestSuite.scala:
##########
@@ -1629,6 +1630,37 @@ class ClientE2ETestSuite
       .create()
     assert(sparkWithLowerMaxMessageSize.range(maxBatchSize).collect().length == maxBatchSize)
   }
+
+  test("Multiple positional parameterized nodes in the parsed logical plan") {
+    var df = spark.sql("SELECT ?", Array(0))
+    for (i <- 1 until 10) {
+      val temp = spark.sql("SELECT ?", Array(i))
+      df = df.union(temp)
+    }
+    checkAnswer(df, (0 until 10).map(i => Row(i)))

Review Comment:
   Let's follow the existing convention and extend `QueryTest`, in which Spark's session can be bound to a remote Spark session.
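
   For context, a hedged sketch of the convention being referenced: a `QueryTest`-style base declares the `spark` handle and the `checkAnswer` helper, while a session-providing mixin binds `spark` to a remote Spark Connect session. The trait names `QueryTestLike` and `RemoteSessionLike` and the connection string below are illustrative placeholders, not identifiers from the Spark codebase:

   ```scala
   import org.apache.spark.sql.{DataFrame, Row, SparkSession}

   // Illustrative only: the base declares the session handle and the row-comparison helper.
   trait QueryTestLike {
     def spark: SparkSession  // bound by a session-providing mixin

     // Compare collected rows with the expected rows, ignoring row order.
     protected def checkAnswer(df: DataFrame, expected: Seq[Row]): Unit = {
       assert(df.collect().sortBy(_.toString).toSeq == expected.sortBy(_.toString))
     }
   }

   // Illustrative only: the mixin binds `spark` to a remote Spark Connect session.
   trait RemoteSessionLike extends QueryTestLike {
     // "sc://localhost" is a placeholder connection string.
     override lazy val spark: SparkSession =
       SparkSession.builder().remote("sc://localhost").create()
   }
   ```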




