squito commented on a change in pull request #25342: 
[SPARK-28571][CORE][SHUFFLE] Use the shuffle writer plugin for the 
SortShuffleWriter
URL: https://github.com/apache/spark/pull/25342#discussion_r317881200
 
 

 ##########
 File path: 
core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala
 ##########
 @@ -670,11 +672,9 @@ private[spark] class ExternalSorter[K, V, C](
   }
 
   /**
-   * Write all the data added into this ExternalSorter into a file in the disk store. This is
-   * called by the SortShuffleWriter.
-   *
-   * @param blockId block ID to write to. The index file will be blockId.name + ".index".
-   * @return array of lengths, in bytes, of each partition of the file (used by map output tracker)
+   * TODO(SPARK-28764): remove this, as this is only used by UnsafeRowSerializerSuite in the SQL
 
 Review comment:
   can't that test just call `sorter.writePartitionedMapOutput(..., new LocalDiskMapOutputWriter(...))`? Anyway, fine to leave it for the follow-up JIRA.
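
   For illustration, a rough sketch of what that suggested test change might look like. The writer class name, constructor arguments, and method signatures below are assumptions (not confirmed against the plugin API added in this PR), so treat it as a sketch rather than the actual change:

   ```scala
   // Hypothetical sketch for UnsafeRowSerializerSuite: write the sorter's output
   // through a local-disk map output writer instead of writePartitionedFile.
   // All names and signatures below are assumptions and may need adjusting.
   import org.apache.spark.shuffle.IndexShuffleBlockResolver
   import org.apache.spark.shuffle.sort.io.LocalDiskShuffleMapOutputWriter

   val blockResolver = new IndexShuffleBlockResolver(conf)  // conf: the test's SparkConf
   val mapOutputWriter = new LocalDiskShuffleMapOutputWriter(
     shuffleId,      // shuffle id registered by the test (assumed)
     mapId,          // map task id (assumed)
     numPartitions,  // number of reduce partitions (assumed)
     blockResolver,
     conf)

   sorter.insertAll(records)  // sorter and records come from the existing test setup
   sorter.writePartitionedMapOutput(shuffleId, mapId, mapOutputWriter)
   mapOutputWriter.commitAllPartitions()  // commit; partition lengths would come from the commit step (assumed)
   ```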
