[ https://issues.apache.org/jira/browse/SPARK-19006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-19006.
-------------------------------
    Resolution: Fixed
 Fix Version/s: 2.2.0

Issue resolved by pull request 16412
[https://github.com/apache/spark/pull/16412]

> should mentioned the max value allowed for spark.kryoserializer.buffer.max in doc
> ---------------------------------------------------------------------------------
>
>            Key: SPARK-19006
>            URL: https://issues.apache.org/jira/browse/SPARK-19006
>        Project: Spark
>     Issue Type: Documentation
>       Reporter: Yuexin Zhang
>       Priority: Trivial
>        Fix For: 2.2.0
>
> On the configuration doc page (https://spark.apache.org/docs/latest/configuration.html)
> we describe spark.kryoserializer.buffer.max as: "Maximum allowable size of Kryo
> serialization buffer. This must be larger than any object you attempt to
> serialize. Increase this if you get a 'buffer limit exceeded' exception
> inside Kryo."
> The source code, however, hard-codes an upper limit:
> {code}
> val maxBufferSizeMb = conf.getSizeAsMb("spark.kryoserializer.buffer.max", "64m").toInt
> if (maxBufferSizeMb >= ByteUnit.GiB.toMiB(2)) {
>   throw new IllegalArgumentException("spark.kryoserializer.buffer.max must be less than " +
>     s"2048 mb, got: + $maxBufferSizeMb mb.")
> }
> {code}
> We should mention "this value must be less than 2048 mb" on the config page as well.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
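For reference, a user who hits "buffer limit exceeded" can raise the buffer while staying under the hard-coded 2048 mb cap, e.g. in spark-defaults.conf. This is only an illustrative sketch: the 1024m value is an arbitrary example, not a recommendation from the issue.

{code}
# spark-defaults.conf (values are illustrative)
spark.serializer                 org.apache.spark.serializer.KryoSerializer
# initial Kryo buffer; grows on demand up to buffer.max
spark.kryoserializer.buffer      64k
# must be less than 2048m, or KryoSerializer throws IllegalArgumentException at startup
spark.kryoserializer.buffer.max  1024m
{code}

The same properties can be passed per job with spark-submit --conf.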