Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/20179#discussion_r160481843
  
    --- Diff: core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala ---
    @@ -196,11 +196,24 @@ private[spark] class IndexShuffleBlockResolver(
         // find out the consolidated file, then the offset within that from our index
         val indexFile = getIndexFile(blockId.shuffleId, blockId.mapId)
     
    -    val in = new DataInputStream(new FileInputStream(indexFile))
    +    // SPARK-22982: if this FileInputStream's position is seeked forward by another piece of code
    +    // which is incorrectly using our file descriptor then this code will fetch the wrong offsets
    +    // (which may cause a reducer to be sent a different reducer's data). The explicit position
    +    // checks added here were a useful debugging aid during SPARK-22982 and may help prevent this
    +    // class of issue from re-occurring in the future which is why they are left here even though
    +    // SPARK-22982 is fixed.
    +    val channel = Files.newByteChannel(indexFile.toPath)
    +    channel.position(blockId.reduceId * 8)
    +    val in = new DataInputStream(Channels.newInputStream(channel))
         try {
    -      ByteStreams.skipFully(in, blockId.reduceId * 8)
           val offset = in.readLong()
           val nextOffset = in.readLong()
    +      val actualPosition = channel.position()
    +      val expectedPosition = blockId.reduceId * 8 + 16
    +      if (actualPosition != expectedPosition) {
    +        throw new Exception(s"SPARK-22982: Incorrect channel position after index file reads: " +
    --- End diff --
    
    Any suggestions for a better exception subtype? I don't expect this to be a recoverable error, and I wanted to avoid the possibility that downstream code catches and handles it. Maybe I should go further and make it a RuntimeException to make it even more fatal?
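For context, the pattern in the diff can be sketched in plain Java (the diff is Scala, but `Files.newByteChannel`, `Channels.newInputStream`, and `DataInputStream` are all `java.nio`/`java.io` APIs): seek the channel to `reduceId * 8`, read two longs, then verify the channel advanced exactly 16 bytes. The class and file names below are illustrative, not part of the actual patch; the index-file layout (one long offset per partition, plus a leading zero) follows the `reduceId * 8` arithmetic in the diff.

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.channels.Channels;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class IndexReadCheck {

    // Seek to reduceId * 8, read the partition's [offset, nextOffset] pair,
    // then assert the channel position is exactly where it should be.
    // A position mismatch would indicate the file descriptor was seeked by
    // other code (the SPARK-22982 failure mode).
    static long[] readOffsets(Path indexFile, int reduceId) throws IOException {
        try (SeekableByteChannel channel = Files.newByteChannel(indexFile)) {
            channel.position(reduceId * 8L);
            DataInputStream in = new DataInputStream(Channels.newInputStream(channel));
            long offset = in.readLong();
            long nextOffset = in.readLong();
            long actualPosition = channel.position();
            long expectedPosition = reduceId * 8L + 16;
            if (actualPosition != expectedPosition) {
                throw new IllegalStateException(
                    "Incorrect channel position after index file reads: expected "
                    + expectedPosition + " but actual position was " + actualPosition);
            }
            return new long[] { offset, nextOffset };
        }
    }

    public static void main(String[] args) throws IOException {
        // Write a tiny fake index file (3 longs: 0, 100, 250), then read
        // the offsets for reduce partition 1.
        Path tmp = Files.createTempFile("shuffle-index", ".idx");
        try (DataOutputStream out = new DataOutputStream(
                Files.newOutputStream(tmp, StandardOpenOption.WRITE))) {
            out.writeLong(0L);
            out.writeLong(100L);
            out.writeLong(250L);
        }
        long[] offsets = readOffsets(tmp, 1);
        System.out.println(offsets[0] + "," + offsets[1]); // prints "100,250"
        Files.delete(tmp);
    }
}
```

The check is cheap because `SeekableByteChannel.position()` is just a query on the already-open channel; no extra I/O is needed beyond the two reads.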


---
