Github user ilganeli commented on a diff in the pull request:
https://github.com/apache/spark/pull/5574#discussion_r28821264
--- Diff: core/src/main/scala/org/apache/spark/serializer/KryoSerializer.scala ---
@@ -49,16 +49,17 @@ class KryoSerializer(conf: SparkConf)
with Logging
with Serializable {
-  private val bufferSizeMb = conf.getDouble("spark.kryoserializer.buffer.mb", 0.064)
-  if (bufferSizeMb >= 2048) {
-    throw new IllegalArgumentException("spark.kryoserializer.buffer.mb must be less than " +
-      s"2048 mb, got: + $bufferSizeMb mb.")
+  private val bufferSizeKb = conf.getSizeAsKb("spark.kryoserializer.buffer", "64k")
--- End diff ---
All - I don't know what the right solution is here. The old value simply
can't work with the new framework: fractional values are no longer supported,
and as near as I can tell this is the only instance of such usage. The only way
to truly maintain backwards compatibility (short of throwing an exception) is
to leave this as conf.getDouble, but then this becomes an exception to the rule
for how we handle size variables.
This needs to be ```getSizeAsKb()```, and the value must be specified in the
right format; otherwise an exception is thrown.
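To illustrate the incompatibility: a minimal sketch (not Spark's actual implementation) of a size parser in the shape of the new framework. `byteStringAsKb` here is a hypothetical stand-in that accepts integral values with a `k`/`m`/`g` suffix, so the old fractional default `0.064` no longer parses and raises an exception, while the new-style `"64k"` works:

```scala
// Hypothetical sketch of integral size-string parsing, mimicking the
// shape of the new size-config framework. Not Spark's real code.
object SizeConfSketch {
  // Parses "64", "64k", "2m", "1g" into kilobytes; bare numbers are
  // assumed to be KB here. Integral values only: "0.064" throws.
  def byteStringAsKb(s: String): Long = {
    val trimmed = s.trim.toLowerCase
    val (num, multiplier) = trimmed.last match {
      case 'k'                => (trimmed.dropRight(1), 1L)
      case 'm'                => (trimmed.dropRight(1), 1024L)
      case 'g'                => (trimmed.dropRight(1), 1024L * 1024L)
      case d if d.isDigit     => (trimmed, 1L)
      case _ =>
        throw new NumberFormatException(s"Unrecognized size suffix in: $s")
    }
    // toLong rejects fractional strings like "0.064" with a
    // NumberFormatException, which is the backwards-compat break.
    num.toLong * multiplier
  }

  def main(args: Array[String]): Unit = {
    println(byteStringAsKb("64k")) // new-style default: 64 KB
    println(byteStringAsKb("2m"))  // 2048 KB
    try {
      byteStringAsKb("0.064")      // old-style fractional MB value
    } catch {
      case _: NumberFormatException => println("fractional value rejected")
    }
  }
}
```

Under this sketch, the only way to keep accepting `0.064` would be to keep reading the key with `getDouble`, which is exactly the inconsistency described above.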