Github user Ngone51 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21369#discussion_r189939603
  
    --- Diff: core/src/main/scala/org/apache/spark/util/collection/ExternalAppendOnlyMap.scala ---
    @@ -267,7 +273,7 @@ class ExternalAppendOnlyMap[K, V, C](
        */
       def destructiveIterator(inMemoryIterator: Iterator[(K, C)]): Iterator[(K, C)] = {
         readingIterator = new SpillableIterator(inMemoryIterator)
    -    readingIterator
    +    readingIterator.toCompletionIterator
    --- End diff --
    
    `destructiveIterator` should just return a destructive iterator (especially for the map buffer), as its name implies; wrapping the result in a `CompletionIterator` is not its business. Developers should also be free to define the completion function for the returned destructive iterator themselves, in case we want a different one somewhere else in the future.
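
    To make it concrete, here is a rough sketch of the shape I have in mind (assuming the existing `CompletionIterator` from `org.apache.spark.util`; the call-site shape, `currentMap` and `freeCurrentMap()` are only illustrative, not a drop-in patch):
    
    ```scala
    // Sketch: keep destructiveIterator focused on returning the destructive iterator itself.
    def destructiveIterator(inMemoryIterator: Iterator[(K, C)]): Iterator[(K, C)] = {
      readingIterator = new SpillableIterator(inMemoryIterator)
      readingIterator
    }
    
    // Each call site then picks its own completion action, e.g. (call site,
    // currentMap and freeCurrentMap() assumed here just for illustration):
    CompletionIterator[(K, C), Iterator[(K, C)]](
      destructiveIterator(currentMap.iterator), freeCurrentMap())
    ```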
    
     > Your suggested codes does exactly the same but is less streamlined 
    
    I don't think this little change makes the code noticeably less streamlined.
    
    > and relies on an intermediate value (fortunately it's already a member 
variable)
    
    The dependence on that intermediate value comes from the current fix, not from my suggestion. And even if it were not a member variable, we could use a temporary local variable instead; it's not a big deal.
    