dusantism-db commented on code in PR #52173:
URL: https://github.com/apache/spark/pull/52173#discussion_r2325571346


##########
sql/core/src/main/scala/org/apache/spark/sql/classic/SparkSession.scala:
##########
@@ -509,6 +509,46 @@ class SparkSession private(
     sql(sqlText, args.asScala.toMap)
   }
 
+  /**
+   * Executes a SQL query substituting parameters by the given arguments with optional names,
+   * returning the result as a `DataFrame`. This method allows the inner query to determine
+   * whether to use positional or named parameters based on its parameter markers.
+   */
+  def sql(sqlText: String, args: Array[_], paramNames: Array[String]): DataFrame = {
+    sql(sqlText, args, paramNames, new QueryPlanningTracker)
+  }
+
+  /**
+   * Internal implementation of unified parameter API with tracker.
+   */
+  private[sql] def sql(
+      sqlText: String,
+      args: Array[_],
+      paramNames: Array[String],
+      tracker: QueryPlanningTracker): DataFrame =
+    withActive {
+      val plan = tracker.measurePhase(QueryPlanningTracker.PARSING) {
+        val parsedPlan = sessionState.sqlParser.parsePlan(sqlText)
+        if (args.nonEmpty) {
+          if (parsedPlan.isInstanceOf[CompoundBody]) {

Review Comment:
   I don't understand how this is fixed. When we call this method, we need to
check whether positional parameters are supplied to a SQL script. Here we only
check `args.nonEmpty`, which will be true whether we have positional or named
parameters. In effect, SQL scripts will regress to no longer supporting named
parameters, as this error will be thrown every time.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

