Hi all, 
I would like to configure the following setting at runtime, as below:

from pyspark.sql import SparkSession

spark = (SparkSession
    .builder
    .appName("ElasticSearchIndex")
    .config("spark.kryoserializer.buffer.max", "1g")  # raise the Kryo buffer limit to 1g
    .getOrCreate())
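For what it's worth, this is how I am checking which value the session actually picked up (a minimal sketch; `spark` is the session from the snippet above):

    # Ask the running session for the effective value; the second argument is a
    # fallback used when the key is not present in the runtime conf.
    print(spark.conf.get("spark.kryoserializer.buffer.max", "not set"))
    print(spark.sparkContext.getConf().get("spark.kryoserializer.buffer.max", "not set"))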

But I still hit this error:
Caused by: org.apache.spark.SparkException: Kryo serialization failed:
Buffer overflow. Available: 0, required: 1614707. To avoid this, increase
spark.kryoserializer.buffer.max value.

It works when I configure it along with the spark-submit command, as below:
spark-submit pyspark-shell-main --name "PySparkShell" --conf
spark.kryoserializer.buffer.max=1g
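
In case it is relevant: my guess is that getOrCreate() may be returning an already-running session (as it would inside the pyspark shell), in which case the builder config would be ignored. A minimal sketch of what I would try in that situation, stopping the existing session before rebuilding with the setting:

    from pyspark.sql import SparkSession

    # If a SparkContext/SparkSession already exists (e.g. in the pyspark shell),
    # getOrCreate() returns it and the buffer setting may not be applied,
    # so stop the existing session first.
    SparkSession.builder.getOrCreate().stop()

    spark = (SparkSession
        .builder
        .appName("ElasticSearchIndex")
        .config("spark.kryoserializer.buffer.max", "1g")
        .getOrCreate())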

Any idea what I have done wrong?

Thank you.
