srielau commented on code in PR #52173: URL: https://github.com/apache/spark/pull/52173#discussion_r2316616320
##########
sql/api/src/main/scala/org/apache/spark/sql/SparkSession.scala:
##########
@@ -523,6 +523,24 @@ abstract class SparkSession extends Serializable with Closeable {
     sql(sqlText, args.asScala.toMap)
   }
 
+  /**
+   * Executes a SQL query substituting parameters by the given arguments with optional names,
+   * returning the result as a `DataFrame`. This API eagerly runs DDL/DML commands, but not for
+   * SELECT queries. This method allows the inner query to determine whether to use positional
+   * or named parameters based on its parameter markers.
+   *
+   * @param sqlText
+   *   A SQL statement with named or positional parameters to execute.
+   * @param args
+   *   An array of Java/Scala objects that can be converted to SQL literal expressions.
+   * @param paramNames
+   *   An optional array of parameter names corresponding to args. If provided, enables named
+   *   parameter binding where parameter names are available. If None or shorter than args,
+   *   remaining parameters are treated as positional.

Review Comment:
   The idea was to generate two parallel arrays, e.g.
   [1, <something>, 3], [NULL, 'col2', NULL]
   If the actual usage is positional, great. If the actual usage is by name => error.
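A minimal standalone Scala sketch of the binding rule described in the comment above: values and their optional names are kept as two parallel arrays; positional markers always resolve by position, while a named marker only resolves against an entry whose name is non-NULL and errors otherwise. The object and helper names (`ParamResolutionSketch`, `resolveParam`) and the error type are illustrative assumptions, not part of the PR.

```scala
object ParamResolutionSketch {

  // Resolve one parameter reference against parallel value/name arrays.
  // ref is Left(position) for a positional marker (?) or Right(name) for a named marker (:name).
  def resolveParam(
      args: Array[Any],            // e.g. Array(1, <something>, 3)
      paramNames: Array[String],   // e.g. Array(null, "col2", null)
      ref: Either[Int, String]): Any = ref match {
    // Positional usage: always fine, names are ignored.
    case Left(pos) => args(pos)
    // Named usage: only entries with a non-null name are addressable by name.
    case Right(name) =>
      val idx = paramNames.indexOf(name)
      if (idx < 0) {
        throw new IllegalArgumentException(s"No parameter named '$name' was supplied")
      }
      args(idx)
  }

  def main(argv: Array[String]): Unit = {
    val args: Array[Any] = Array(1, "v2", 3)
    val names: Array[String] = Array(null, "col2", null)

    println(resolveParam(args, names, Left(0)))        // positional: prints 1
    println(resolveParam(args, names, Right("col2")))  // named: prints v2
    // resolveParam(args, names, Right("col1"))        // would throw: unnamed entries
                                                       // cannot be referenced by name
  }
}
```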