HyukjinKwon commented on a change in pull request #29146:
URL: https://github.com/apache/spark/pull/29146#discussion_r461281062
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/SparkSqlParser.scala
##########
@@ -65,16 +65,19 @@ class SparkSqlAstBuilder(conf: SQLConf) extends AstBuilder(conf) {
    */
   override def visitSetConfiguration(ctx: SetConfigurationContext): LogicalPlan = withOrigin(ctx) {
     // Construct the command.
-    val raw = remainder(ctx.SET.getSymbol)
-    val keyValueSeparatorIndex = raw.indexOf('=')
Review comment:
Yup, I am okay with that. I thought it'd be easier to use the same code
so we can just say `SET` expects the same syntax as specified in
`spark-defaults.conf`. But using strings looks more natural in a SQL context.
I am okay either way.
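
For context, a rough, self-contained sketch of what the removed remainder-based parsing did: take the raw text after `SET` and split it on the first `=`. The helper name `splitKeyValue` and the sample inputs are only illustrative; just the `indexOf('=')` split comes from the lines removed in the diff above.

```scala
object SetCommandSketch {
  // Hypothetical stand-in for the text that `remainder(ctx.SET.getSymbol)`
  // used to return, i.e. everything after the SET keyword.
  def splitKeyValue(raw: String): (String, Option[String]) = {
    val idx = raw.indexOf('=')
    if (idx == -1) {
      // e.g. "SET spark.sql.shuffle.partitions" -> query the current value
      (raw.trim, None)
    } else {
      // e.g. "SET spark.sql.shuffle.partitions=10" -> assign a new value
      (raw.substring(0, idx).trim, Some(raw.substring(idx + 1).trim))
    }
  }

  def main(args: Array[String]): Unit = {
    println(splitKeyValue("spark.sql.shuffle.partitions=10")) // (spark.sql.shuffle.partitions,Some(10))
    println(splitKeyValue("spark.sql.shuffle.partitions"))    // (spark.sql.shuffle.partitions,None)
  }
}
```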