Github user rajeshbalamohan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19184#discussion_r137973976
  
    --- Diff: core/src/main/java/org/apache/spark/util/collection/unsafe/sort/UnsafeSorterSpillReader.java ---
    @@ -104,6 +124,10 @@ public void loadNext() throws IOException {
         if (taskContext != null) {
           taskContext.killTaskIfInterrupted();
         }
    +    if (this.din == null) {
    +      // Good time to init (if all files are opened, we can get Too Many files exception)
    +      initStreams();
    +    }
    --- End diff ---
    
    Good point. This PR has been tested with queries involving window functions (e.g. Q67), for which it worked fine.
    
    During spill merges (especially in `getSortedIterator`), it is possible to encounter a too-many-open-files issue, since every spill reader would otherwise open its file eagerly.
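
    To illustrate the idea behind the change: the sketch below is a simplified, hypothetical reader (class and method names are illustrative, not Spark's actual `UnsafeSorterSpillReader` API). It defers opening the underlying stream until the first `loadNext()` call, so a merge over many spill files does not hold one file descriptor per reader from construction time.
    
    ```java
    import java.io.BufferedInputStream;
    import java.io.Closeable;
    import java.io.DataInputStream;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.IOException;
    
    // Hypothetical sketch of lazy stream initialization for a spill reader.
    class LazySpillReader implements Closeable {
      private final File spillFile;
      private DataInputStream din;  // stays null until the first loadNext()
    
      LazySpillReader(File spillFile) {
        this.spillFile = spillFile;
        // Deliberately no stream is opened here: constructing many readers
        // (as a merge does) should not consume many file descriptors.
      }
    
      private void initStreams() throws IOException {
        din = new DataInputStream(
            new BufferedInputStream(new FileInputStream(spillFile)));
      }
    
      int loadNext() throws IOException {
        if (din == null) {
          // Good time to init: opening eagerly in the constructor would hold
          // one descriptor per spill file for the lifetime of the merge.
          initStreams();
        }
        return din.readInt();
      }
    
      @Override
      public void close() throws IOException {
        if (din != null) {
          din.close();
        }
      }
    }
    ```
    
    The trade-off is that the first record read pays the open cost, but only readers that are actually consumed ever open a descriptor.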

