[ 
https://issues.apache.org/jira/browse/SPARK-31532?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17091067#comment-17091067
 ] 

JinxinTang edited comment on SPARK-31532 at 4/24/20, 1:08 AM:
--------------------------------------------------------------

Thanks for reporting this issue. The following static SQL configurations are not 
allowed to be modified after SparkSession startup, by design:

[spark.sql.codegen.comments, spark.sql.queryExecutionListeners, 
spark.sql.catalogImplementation, spark.sql.subquery.maxThreadThreshold, 
spark.sql.globalTempDatabase, spark.sql.codegen.cache.maxEntries, 
spark.sql.filesourceTableRelationCacheSize, 
spark.sql.streaming.streamingQueryListeners, spark.sql.ui.retainedExecutions, 
spark.sql.hive.thriftServer.singleSession, spark.sql.extensions, 
spark.sql.debug, spark.sql.sources.schemaStringLengthThreshold, 
spark.sql.warehouse.dir] 

So this might not be a bug.
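For reference, static SQL configurations only take effect when supplied to the builder before the first SparkSession is created; afterwards they are fixed for the lifetime of the shared state, while runtime configurations remain mutable. A minimal sketch (the warehouse path and app name here are hypothetical; this needs a Spark distribution on the classpath, e.g. run inside spark-shell):

```scala
import org.apache.spark.sql.SparkSession

// Static SQL configs (e.g. spark.sql.warehouse.dir) must be set on the
// builder BEFORE the first SparkSession exists; once a session is created
// they cannot be changed for that JVM.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("static-conf-demo")                              // hypothetical name
  .config("spark.sql.warehouse.dir", "/tmp/my-warehouse")   // hypothetical path
  .getOrCreate()

// Runtime configs may still be changed after startup:
spark.conf.set("spark.sql.shuffle.partitions", "8")

// Changing a static config at runtime throws AnalysisException:
// spark.conf.set("spark.sql.warehouse.dir", "/tmp/other")  // would fail
```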



> SparkSessionBuilder should not propagate static sql configurations to the 
> existing active/default SparkSession
> -------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-31532
>                 URL: https://issues.apache.org/jira/browse/SPARK-31532
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2, 2.1.3, 2.2.3, 2.3.4, 2.4.5, 3.0.0, 3.1.0
>            Reporter: Kent Yao
>            Priority: Major
>
> Clearly, this is a bug.
> {code:scala}
> scala> spark.sql("set spark.sql.warehouse.dir").show
> +--------------------+--------------------+
> |                 key|               value|
> +--------------------+--------------------+
> |spark.sql.warehou...|file:/Users/kenty...|
> +--------------------+--------------------+
> scala> spark.sql("set spark.sql.warehouse.dir=2");
> org.apache.spark.sql.AnalysisException: Cannot modify the value of a static 
> config: spark.sql.warehouse.dir;
>   at 
> org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:154)
>   at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:42)
>   at 
> org.apache.spark.sql.execution.command.SetCommand.$anonfun$x$7$6(SetCommand.scala:100)
>   at 
> org.apache.spark.sql.execution.command.SetCommand.run(SetCommand.scala:156)
>   at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
>   at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
>   at 
> org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
>   at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
>   at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3644)
>   at 
> org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
>   at 
> org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
>   at 
> org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
>   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
>   at 
> org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
>   at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3642)
>   at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
>   at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
>   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
>   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
>   at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:607)
>   at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:764)
>   at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:602)
>   ... 47 elided
> scala> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.SparkSession
> scala> SparkSession.builder.config("spark.sql.warehouse.dir", "xyz").get
> getClass   getOrCreate
> scala> SparkSession.builder.config("spark.sql.warehouse.dir", 
> "xyz").getOrCreate
> 20/04/23 23:49:13 WARN SparkSession$Builder: Using an existing SparkSession; 
> some configuration may not take effect.
> res7: org.apache.spark.sql.SparkSession = 
> org.apache.spark.sql.SparkSession@6403d574
> scala> spark.sql("set spark.sql.warehouse.dir").show
> +--------------------+-----+
> |                 key|value|
> +--------------------+-----+
> |spark.sql.warehou...|  xyz|
> +--------------------+-----+
> scala>
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
