maropu commented on a change in pull request #27246:
URL: https://github.com/apache/spark/pull/27246#discussion_r441909716
##########
File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
##########
@@ -1238,6 +1239,18 @@ package object config {
s"The value must be in allowed range [1,048,576,
${MAX_BUFFER_SIZE_BYTES}].")
.createWithDefault(1024 * 1024)
+ private[spark] val UNSAFE_SORTER_SPILL_READER_BUFFER_SIZE_RATIO =
+ ConfigBuilder("spark.unsafe.sorter.spill.reader.buffer.size.ratio")
Review comment:
I originally meant:
```
// package.scala
private[spark] val UNSAFE_SORTER_SPILL_READER_BUFFER_DEFAULT_SIZE_BYTES =
  ConfigBuilder("spark.unsafe.sorter.spill.reader.buffer.defaultSize")
    .internal()
    .bytesConf(ByteUnit.BYTE)
    .checkValue(...)
    .createWithDefault(1024 * 1024)

// UnsafeSorterSpillReader.java
private byte[] arr = null;

public UnsafeSorterSpillReader(
    SerializerManager serializerManager,
    File file,
    BlockId blockId) throws IOException {
  assert (file.length() > 0);
  // bytesConf values are returned boxed as Long; unbox and narrow to int for the buffer size.
  int defaultBufferSize = (int) (long) SparkEnv.get().conf().get(
      package$.MODULE$.UNSAFE_SORTER_SPILL_READER_BUFFER_DEFAULT_SIZE_BYTES());
  this.arr = new byte[defaultBufferSize];
  // ... rest of the constructor unchanged
}
```
Am I missing something?
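A minimal usage sketch, assuming the `spark.unsafe.sorter.spill.reader.buffer.defaultSize` entry sketched above (the key name comes from the suggested ConfigBuilder string and is not part of the current PR); an entry created with `bytesConf(ByteUnit.BYTE)` accepts either a raw byte count or a suffixed size string when set through `SparkConf`:
```scala
// Hypothetical usage sketch for the proposed byte-size entry.
import org.apache.spark.SparkConf

val conf = new SparkConf()
  // Either a plain byte count ("2097152") or a size suffix ("2m") is accepted.
  .set("spark.unsafe.sorter.spill.reader.buffer.defaultSize", "2m")
```
Since the entry would be marked `.internal()`, overriding it like this would normally only happen in tests rather than user configs.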
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]