Hi Rajat,

I’m guessing you are setting the configuration at runtime; correct me if
I’m wrong. Only a certain subset of properties, namely the Spark SQL ones
(prefixed with spark.sql), can be set at runtime. Please refer to SparkConf.scala:
<https://github.com/apache/spark/blob/500f3097111a6bf024acf41400660c199a150350/core/src/main/scala/org/apache/spark/SparkConf.scala#L51-L52>
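For example (a minimal sketch, assuming an existing SparkSession named spark):

  // SQL properties (spark.sql.*) can be changed at runtime
  spark.conf.set("spark.sql.shuffle.partitions", "400")

  // Core properties such as spark.serializer cannot be changed this way;
  // the call below is what throws the AnalysisException in your stack trace
  // spark.conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")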

Once a SparkConf object is passed to Spark, it is cloned and can no longer
be modified by the user. Spark does not support modifying the configuration
at runtime.

So the remaining options have to be set before the SparkContext is initialized, e.g.:

val spark = SparkSession.builder
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .getOrCreate()


rajat kumar <kumar.rajat20...@gmail.com> wrote on Fri, Sep 23, 2022 at 05:58:

> Hello Users,
>
> While using the below setting, I am getting an exception:
>   spark.conf.set("spark.serializer",
> "org.apache.spark.serializer.KryoSerializer")
>
> User class threw exception: org.apache.spark.sql.AnalysisException: Cannot
> modify the value of a Spark config: spark.serializer at
> org.apache.spark.sql.errors.QueryCompilationErrors$.cannotModifyValueOfSparkConfigError(QueryCompilationErrors.scala:2322)
>
> Can we safely skip setting it or is there any changed way?
>
> Thanks
> Rajat
>
>
>

-- 
Best!
Qian SUN
