viirya commented on pull request #29772:
URL: https://github.com/apache/spark/pull/29772#issuecomment-693863796


   > Currently, Spark determines whether UnsafeExternalSorter.SpillableIterator has spilled already by checking whether upstream is an instance of UnsafeInMemorySorter.SortedIterator
   
   Can we update the description a bit?
   
   This reads as if Spark considers `UnsafeExternalSorter.SpillableIterator` to have spilled already when `upstream` is an `UnsafeInMemorySorter.SortedIterator`. But actually Spark considers it to have spilled already when `upstream` is *not* an `UnsafeInMemorySorter.SortedIterator`, right?
   
   ```java
   if (!(upstream instanceof UnsafeInMemorySorter.SortedIterator && nextUpstream == null
     && numRecords > 0)) {
     return 0L;
   }
   ```
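   
   For readability, the same guard can be read in positive form (a minimal sketch for illustration only; the local variable name `stillInMemory` is mine, not from the PR):
   
   ```java
   // Illustration of the guard above, rewritten via De Morgan's law:
   // spilling only proceeds when all three conditions hold, i.e. the data is
   // still held by the in-memory sorter and there is something to spill.
   boolean stillInMemory = upstream instanceof UnsafeInMemorySorter.SortedIterator
     && nextUpstream == null
     && numRecords > 0;
   if (!stillInMemory) {
     return 0L;  // already spilled (or nothing to spill): report 0 bytes freed
   }
   ```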
   
   

