cloud-fan commented on code in PR #52334:
URL: https://github.com/apache/spark/pull/52334#discussion_r2433609257
##########
sql/core/src/main/scala/org/apache/spark/sql/classic/SparkSession.scala:
##########
@@ -448,16 +448,24 @@ class SparkSession private(
   private[sql] def sql(sqlText: String, args: Array[_], tracker: QueryPlanningTracker): DataFrame =
     withActive {
       val plan = tracker.measurePhase(QueryPlanningTracker.PARSING) {
-        val parsedPlan = sessionState.sqlParser.parsePlan(sqlText)
-        if (args.nonEmpty) {
-          // Check for SQL scripting with positional parameters before creating parameterized query
-          if (parsedPlan.isInstanceOf[CompoundBody]) {
+        val parsedPlan = if (args.nonEmpty) {
+          // Use parameter context directly for parsing
+          val paramContext = PositionalParameterContext(args.map(lit(_).expr).toSeq)
Review Comment:
Yeah, we should at least resolve it here. I think it's fragile to pass
unresolved parameter expressions to the pre-parser. Can we use the fake plan to
resolve them? We also need to perform the same validation that the previous
parameter binding framework does:
https://github.com/apache/spark/blob/b0285f8bbf8248ca5b9d9aebea087cb5037a4655/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/parameters.scala#L176-L188
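To make the suggestion concrete, here is a rough sketch of what resolving the parameter expressions against a fake one-row plan could look like before they are handed to the pre-parser. The helper name `resolveAndValidateArgs` and the use of `require` for the error path are illustrative, not the actual framework code; the real validation lives in `parameters.scala` at the link above.

```scala
import org.apache.spark.sql.catalyst.expressions.{Alias, Expression}
import org.apache.spark.sql.catalyst.plans.logical.{LogicalPlan, OneRowRelation, Project}

// Hypothetical sketch: resolve parameter expressions by wrapping them in a
// dummy one-row Project so the analyzer can resolve them, then validate that
// each resolved expression is a constant, mirroring the old parameter-binding
// framework's checks.
def resolveAndValidateArgs(
    analyze: LogicalPlan => LogicalPlan,
    args: Seq[Expression]): Seq[Expression] = {
  // Fake plan: each arg becomes an aliased projection over OneRowRelation.
  val fakePlan = Project(
    args.zipWithIndex.map { case (e, i) => Alias(e, s"param_$i")() },
    OneRowRelation())
  // Run the analyzer and pull the resolved children back out of the aliases.
  val resolved = analyze(fakePlan).asInstanceOf[Project]
    .projectList.map(_.asInstanceOf[Alias].child)
  // Validation analogous to the previous framework: every argument must be
  // fully resolved and foldable (i.e. a constant expression).
  resolved.foreach { e =>
    require(e.resolved && e.foldable,
      s"Parameter expression ${e.sql} must be a constant")
  }
  resolved
}
```

That way the pre-parser only ever sees resolved, constant expressions, and invalid arguments fail fast with a clear error instead of surfacing later in parsing.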
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]