HeartSaVioR commented on a change in pull request #23532: [SPARK-26466][CORE] Use ConfigEntry for hardcoded configs for submit categories.
URL: https://github.com/apache/spark/pull/23532#discussion_r247415987
##########
File path: mllib/src/test/scala/org/apache/spark/mllib/feature/Word2VecSuite.scala
##########
@@ -109,12 +111,16 @@ class Word2VecSuite extends SparkFunSuite with MLlibTestSparkContext {
test("big model load / save") {
// backupping old values
- val oldBufferConfValue = spark.conf.get("spark.kryoserializer.buffer.max", "64m")
- val oldBufferMaxConfValue = spark.conf.get("spark.kryoserializer.buffer", "64k")
+ val oldBufferConfValue = spark.conf.get(KRYO_SERIALIZER_BUFFER_SIZE.key, "64m")
+ val oldBufferMaxConfValue = spark.conf.get(KRYO_SERIALIZER_MAX_BUFFER_SIZE.key, "64k")
+ val oldSetCommandRejectsSparkCoreConfs = spark.conf.get(
+ SET_COMMAND_REJECTS_SPARK_CORE_CONFS.key, "true")
// setting test values to trigger partitioning
- spark.conf.set("spark.kryoserializer.buffer", "50b")
- spark.conf.set("spark.kryoserializer.buffer.max", "50b")
+
+ // this is needed to set configurations which are also defined to SparkConf
+ spark.conf.set(SET_COMMAND_REJECTS_SPARK_CORE_CONFS.key, "false")
Review comment:
I had to set this to `false` because, by default, it doesn't allow setting configurations that are also defined in SparkConf. I'm not fully sure which option is right: 1) make the change as I just did, or 2) don't add `spark.kryoserializer.buffer` and `spark.kryoserializer.buffer.max` to Kryo.
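
For reference, here is a minimal sketch of the save/set/restore pattern this test is using, assuming a local `SparkSession` and the ConfigEntry constants proposed in this PR (`KRYO_SERIALIZER_BUFFER_SIZE` and `KRYO_SERIALIZER_MAX_BUFFER_SIZE` under `org.apache.spark.internal.config.Kryo`, plus `SQLConf.SET_COMMAND_REJECTS_SPARK_CORE_CONFS`); the exact import locations may differ depending on where the entries end up:

```scala
// Sketch only: assumes the ConfigEntry objects live at these locations.
import org.apache.spark.internal.config.Kryo.{KRYO_SERIALIZER_BUFFER_SIZE, KRYO_SERIALIZER_MAX_BUFFER_SIZE}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.internal.SQLConf.SET_COMMAND_REJECTS_SPARK_CORE_CONFS

object KryoBufferConfigSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]").getOrCreate()

    // Back up the current values so the session is left as we found it.
    val oldBuffer = spark.conf.get(KRYO_SERIALIZER_BUFFER_SIZE.key, "64k")
    val oldMaxBuffer = spark.conf.get(KRYO_SERIALIZER_MAX_BUFFER_SIZE.key, "64m")
    val oldRejectsCoreConfs = spark.conf.get(SET_COMMAND_REJECTS_SPARK_CORE_CONFS.key, "true")

    try {
      // By default, setting configs that are also defined in SparkConf is rejected,
      // so relax that check first.
      spark.conf.set(SET_COMMAND_REJECTS_SPARK_CORE_CONFS.key, "false")
      // Tiny Kryo buffers to trigger the partitioned (big model) save path.
      spark.conf.set(KRYO_SERIALIZER_BUFFER_SIZE.key, "50b")
      spark.conf.set(KRYO_SERIALIZER_MAX_BUFFER_SIZE.key, "50b")
      // ... run the save / load under test here ...
    } finally {
      // Restore the original values.
      spark.conf.set(KRYO_SERIALIZER_BUFFER_SIZE.key, oldBuffer)
      spark.conf.set(KRYO_SERIALIZER_MAX_BUFFER_SIZE.key, oldMaxBuffer)
      spark.conf.set(SET_COMMAND_REJECTS_SPARK_CORE_CONFS.key, oldRejectsCoreConfs)
    }
  }
}
```

The try/finally around the flag is the trade-off the comment describes: once the buffer keys exist as core ConfigEntries, `spark.conf.set` rejects them unless `SET_COMMAND_REJECTS_SPARK_CORE_CONFS` is flipped to `false` first (option 1 above); option 2 avoids this by keeping the buffer keys out of the core `Kryo` config object.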