AngersZhuuuu opened a new pull request #34757:
URL: https://github.com/apache/spark/pull/34757
### What changes were proposed in this pull request?
In current PySpark, we have code like the below:
```python
for key, value in self._options.items():
    session._jsparkSession.sessionState().conf().setConfString(key, value)
return session
```
This passes all options to the created/existing SparkSession. In the Scala code path, Spark only passes non-static SQL confs:
```scala
private def applyModifiableSettings(session: SparkSession): Unit = {
  val (staticConfs, otherConfs) =
    options.partition(kv => SQLConf.isStaticConfigKey(kv._1))

  otherConfs.foreach { case (k, v) => session.sessionState.conf.setConfString(k, v) }

  if (staticConfs.nonEmpty) {
    logWarning("Using an existing SparkSession; the static sql configurations will not take" +
      " effect.")
  }
  if (otherConfs.nonEmpty) {
    logWarning("Using an existing SparkSession; some spark core configurations may not take" +
      " effect.")
  }
}
```
In this PR, we make the PySpark behavior consistent with the Scala one.
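The Scala logic above can be sketched on the Python side roughly as follows. This is an illustrative sketch, not the actual patch: `STATIC_CONF_KEYS` is a hypothetical stand-in for the JVM-side `SQLConf.isStaticConfigKey` check, and `set_conf` stands in for `session._jsparkSession.sessionState().conf().setConfString`.

```python
import warnings

# Hypothetical subset of static SQL conf keys, for illustration only.
# The real check is SQLConf.isStaticConfigKey on the JVM side.
STATIC_CONF_KEYS = {
    "spark.sql.warehouse.dir",
    "spark.sql.catalogImplementation",
}

def apply_modifiable_settings(options, set_conf):
    """Apply only non-static SQL confs to an existing session.

    `options` is a dict of conf key -> value; `set_conf(key, value)`
    applies one conf to the session. Returns the keys actually applied.
    """
    static = {k: v for k, v in options.items() if k in STATIC_CONF_KEYS}
    modifiable = {k: v for k, v in options.items() if k not in STATIC_CONF_KEYS}

    for k, v in modifiable.items():
        set_conf(k, v)

    if static:
        warnings.warn(
            "Using an existing SparkSession; the static sql configurations "
            "will not take effect.")
    return list(modifiable)
```

With this partitioning, static confs are skipped with a warning instead of being silently (and ineffectively) set on the existing session.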
### Why are the changes needed?
To keep the behavior consistent between PySpark and the Scala code: when initializing a SparkSession and an existing session is reused, only non-static SQL confs should be overwritten.
### Does this PR introduce _any_ user-facing change?
Yes: users can no longer overwrite static SQL confs when using PySpark with an existing SparkSession.
### How was this patch tested?
Modified unit tests.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]