sunhaibotb commented on a change in pull request #7797: [FLINK-11379] Fix
OutOfMemoryError caused by Files.readAllBytes() when TM loads a large size TDD
URL: https://github.com/apache/flink/pull/7797#discussion_r259784291
##########
File path: flink-core/src/main/java/org/apache/flink/util/FileUtils.java
##########
@@ -56,6 +60,15 @@
/** The length of the random part of the filename. */
private static final int RANDOM_FILE_NAME_LENGTH = 12;
+ /**
+ * The maximum size of array to allocate for reading. See
+ * {@link java.nio.file.Files#MAX_BUFFER_SIZE} for more.
+ */
+ private static final int MAX_BUFFER_SIZE = Integer.MAX_VALUE - 8;
+
+ /** The size of the buffer used for reading. */
+ private static final int BUFFER_SIZE = 4096;
Review comment:
The sun.misc package is Sun's internal API, and its use should be avoided as
much as possible. Besides, in most cases the direct memory configured today is
larger than 4K, so a 4K read buffer is not a concern.
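
For reference, here is a minimal sketch of how the two constants in the diff are
typically used together. This is only an illustration, not the actual change in
this PR; the class and method names (`ChunkedReadSketch`, `readAllBytesChunked`)
are hypothetical. The idea is to read in `BUFFER_SIZE` increments and grow a heap
array on demand, capped at `MAX_BUFFER_SIZE`, instead of allocating one array
sized to the whole file up front:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public final class ChunkedReadSketch {

    // Mirrors the constants in the diff above.
    private static final int MAX_BUFFER_SIZE = Integer.MAX_VALUE - 8;
    private static final int BUFFER_SIZE = 4096;

    /** Reads a file incrementally instead of allocating one file-sized array up front. */
    static byte[] readAllBytesChunked(Path path) throws IOException {
        try (InputStream in = Files.newInputStream(path)) {
            byte[] buf = new byte[BUFFER_SIZE];
            int total = 0;
            while (true) {
                // Fill the remaining free space in the buffer.
                int n = in.read(buf, total, buf.length - total);
                if (n < 0) {
                    break; // end of stream
                }
                total += n;
                if (total == buf.length) {
                    // Buffer full: double it, capped at the max safe array size.
                    if (buf.length == MAX_BUFFER_SIZE) {
                        throw new OutOfMemoryError("Required array size too large");
                    }
                    buf = Arrays.copyOf(buf, (int) Math.min((long) buf.length << 1, MAX_BUFFER_SIZE));
                }
            }
            return (total == buf.length) ? buf : Arrays.copyOf(buf, total);
        }
    }
}
```

Starting at 4K and doubling keeps the heap footprint proportional to the actual
file size, and the cap avoids the VM-dependent maximum array size.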