GitHub user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/15036#discussion_r78261243
  
    --- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala ---
    @@ -1316,21 +1310,28 @@ private[spark] class BlockManager(
             // The block has already been removed; do nothing.
             logWarning(s"Asked to remove block $blockId, which does not exist")
           case Some(info) =>
    -        // Removals are idempotent in disk store and memory store. At worst, we get a warning.
    -        val removedFromMemory = memoryStore.remove(blockId)
    -        val removedFromDisk = diskStore.remove(blockId)
    -        if (!removedFromMemory && !removedFromDisk) {
    -          logWarning(s"Block $blockId could not be removed as it was not found in either " +
    -            "the disk, memory, or external block store")
    -        }
    -        blockInfoManager.removeBlock(blockId)
    -        val removeBlockStatus = getCurrentBlockStatus(blockId, info)
    -        if (tellMaster && info.tellMaster) {
    -          reportBlockStatus(blockId, info, removeBlockStatus)
    -        }
    -        Option(TaskContext.get()).foreach { c =>
    -          c.taskMetrics().incUpdatedBlockStatuses(blockId -> removeBlockStatus)
    -        }
    +        removeBlockInternal(blockId, info, tellMaster)
    +    }
    +  }
    +
    +  /**
    +   * Internal version of [[removeBlock()]] which assumes that the caller already holds a write
    +   * lock on the block.
    +   */
    +  private def removeBlockInternal(blockId: BlockId, info: BlockInfo, tellMaster: Boolean): Unit = {
    +    // Removals are idempotent in disk store and memory store. At worst, we get a warning.
    +    val removedFromMemory = memoryStore.remove(blockId)
    +    val removedFromDisk = diskStore.remove(blockId)
    +    if (!removedFromMemory && !removedFromDisk) {
    +      logWarning(s"Block $blockId could not be removed as it was not found in either " +
    --- End diff --
    
    Aha, good catch: the "external block store" part is outdated as of #10752, the patch which removed the block manager's "external block store API." This API was used by Spark 1.x's deeper Tachyon integration, which was removed in Spark 2.x in favor of interfacing through more standard filesystem APIs.
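    
    For reference, a minimal sketch of how the warning could read with the outdated wording dropped, using only the two stores this code actually checks (the exact phrasing here is just a suggestion, not necessarily what this patch will end up with):
    
    ```scala
    if (!removedFromMemory && !removedFromDisk) {
      // Only memory and disk are consulted above, so only mention those two.
      logWarning(s"Block $blockId could not be removed as it was not found on disk or in memory")
    }
    ```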

