[ 
https://issues.apache.org/jira/browse/SPARK-32257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Xiao Li updated SPARK-32257:
----------------------------
    Description: 
{code:java}
SET spark.sql.ansi.enabled true{code}

 The above SQL command does not change the conf value; it just tries to 
display the value of the conf "spark.sql.ansi.enabled true".

We can disallow spaces in the conf name and issue a user-friendly error 
instead. The error message should tell users the workaround of quoting the 
conf name if they still need to specify one that contains a space. 
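For illustration, a minimal sketch of the assignment form versus the ambiguous space-separated form; the backquoted key in the last statement is only an assumption about how the suggested quoting workaround could look, not syntax confirmed by this ticket.
{code:sql}
-- Assignment form: sets the conf value as intended.
SET spark.sql.ansi.enabled=true;

-- Space-separated form: currently parsed as a lookup of the single key
-- "spark.sql.ansi.enabled true", so the value is never changed.
SET spark.sql.ansi.enabled true;

-- Hypothetical workaround hinted at by the proposed error message: quote the
-- key if it really must contain a space (exact quoting syntax is an assumption).
SET `spark.sql.ansi.enabled true`=false;
{code}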

 

  was:
SET spark.sql.ansi.enabled true
The above SQL command does not change the conf value; it just tries to 
display the value of the conf "spark.sql.ansi.enabled true".

We can disallow spaces in the conf name and issue a user-friendly error 
instead. The error message should tell users the workaround of quoting the 
conf name if they still need to specify one that contains a space. 

 


> [SQL Parser] Report Error for invalid usage of SET command
> ----------------------------------------------------------
>
>                 Key: SPARK-32257
>                 URL: https://issues.apache.org/jira/browse/SPARK-32257
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Xiao Li
>            Priority: Major
>
> {code:java}
> SET spark.sql.ansi.enabled true{code}
>  The above SQL command does not change the conf value; it just tries to 
> display the value of the conf "spark.sql.ansi.enabled true".
> We can disallow spaces in the conf name and issue a user-friendly error 
> instead. The error message should tell users the workaround of quoting the 
> conf name if they still need to specify one that contains a space. 
>  


