012huang commented on a change in pull request #27064: 
[SPARK-30246]OneForOneStreamManager might leak memory in connectionTerminated
URL: https://github.com/apache/spark/pull/27064#discussion_r362721699
 
 

 ##########
 File path: 
common/network-common/src/main/java/org/apache/spark/network/server/OneForOneStreamManager.java
 ##########
 @@ -117,18 +118,21 @@ public static String genStreamChunkId(long streamId, int 
chunkId) {
 
   @Override
   public void connectionTerminated(Channel channel) {
+    LinkedList<StreamState> removedStates = new LinkedList<>();
     // Close all streams which have been associated with the channel.
     for (Map.Entry<Long, StreamState> entry: streams.entrySet()) {
       StreamState state = entry.getValue();
       if (state.associatedChannel == channel) {
-        streams.remove(entry.getKey());
-
-        // Release all remaining buffers.
-        while (state.buffers.hasNext()) {
-          ManagedBuffer buffer = state.buffers.next();
-          if (buffer != null) {
-            buffer.release();
-          }
+        removedStates.add(streams.remove(entry.getKey()));
+      }
+    }
+
+    for (StreamState state: removedStates) {
+      // Release all remaining buffers.
 
 Review comment:
  There is a case where this still leaks: if the application has already finished (or failed), the executors' info for that app has been cleaned up, but `connectionTerminated` is invoked late. When this code then runs, the buffer lookup by `appId` and `execId` fails, so the buffer cannot be obtained and released, and it remains in memory. This happened in Spark 2.4.3, as I reported; I also opened a PR, please help review (https://github.com/apache/spark/pull/27064). Thanks.
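
  To make the concern concrete, here is a minimal, self-contained sketch (all class and field names here are hypothetical stand-ins, not Spark's actual internals) of the two-phase pattern the diff uses: first remove every state tied to the channel from the map, then release buffers in a separate pass, guarding each stream so one failing buffer iterator (e.g. executor info already cleaned up) cannot strand other entries in the map.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class LeakSketch {
    interface ManagedBuffer { void release(); }

    // Simplified stand-in for OneForOneStreamManager.StreamState.
    static class StreamState {
        final Object channel;
        final Iterator<ManagedBuffer> buffers;
        StreamState(Object channel, Iterator<ManagedBuffer> buffers) {
            this.channel = channel;
            this.buffers = buffers;
        }
    }

    static final Map<Long, StreamState> streams = new ConcurrentHashMap<>();

    static void connectionTerminated(Object channel) {
        // Phase 1: unconditionally remove every matching entry from the map,
        // so a later failure while releasing buffers cannot cause a map leak.
        List<StreamState> removed = new ArrayList<>();
        for (Map.Entry<Long, StreamState> e : streams.entrySet()) {
            if (e.getValue().channel == channel) {
                removed.add(streams.remove(e.getKey()));
            }
        }
        // Phase 2: release buffers; guard each stream individually so one
        // bad lookup (app already torn down) does not abort the rest.
        for (StreamState state : removed) {
            try {
                while (state.buffers.hasNext()) {
                    ManagedBuffer b = state.buffers.next();
                    if (b != null) b.release();
                }
            } catch (RuntimeException ex) {
                // Buffer lookup failed after app cleanup; the entry is
                // already out of the map, so no memory is retained.
            }
        }
    }

    public static void main(String[] args) {
        Object ch = new Object();
        // An iterator that throws, simulating the delayed-termination case
        // where executor info for the app is already gone.
        Iterator<ManagedBuffer> failing = new Iterator<ManagedBuffer>() {
            public boolean hasNext() { throw new IllegalStateException("executor info cleaned"); }
            public ManagedBuffer next() { throw new IllegalStateException(); }
        };
        streams.put(1L, new StreamState(ch, failing));
        streams.put(2L, new StreamState(ch, Collections.<ManagedBuffer>emptyIterator()));
        connectionTerminated(ch);
        System.out.println("remaining=" + streams.size());
    }
}
```

  With a single-pass remove-and-release loop, the exception from the first stream's iterator would have skipped removal of the remaining streams; with the two-phase version both entries leave the map regardless.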

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 