Douglas Colkitt created SPARK-28009:
---------------------------------------

             Summary: PipedRDD: Block not locked for reading failure
                 Key: SPARK-28009
                 URL: https://issues.apache.org/jira/browse/SPARK-28009
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.4.0
         Environment: Running in a Docker container with Spark 2.4.0 on Linux kernel 4.9.0
            Reporter: Douglas Colkitt


A PipedRDD operation fails with the stack trace below. The failure primarily occurs when the STDOUT of the piped Unix process is small and the STDIN fed into the process is comparatively much larger.

 

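For illustration, here is a minimal sketch of the kind of job that tends to hit this failure mode. The dataset, command, and use of cache() are assumptions for the example (any pipe whose STDOUT is much smaller than its STDIN should exercise the same pattern):

import org.apache.spark.{SparkConf, SparkContext}

object PipeReproSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("PipedRDD repro sketch"))

    // Large input: many lines will be written to the child process's STDIN.
    val bigInput = sc.parallelize(1 to 10000000, 8).map(i => s"record-$i")

    // Caching is an assumption here; it makes the pipe read from cached blocks,
    // which is where the block read-lock bookkeeping in the stack trace comes in.
    bigInput.cache().count()

    // "wc -l" drains all of STDIN but emits a single line on STDOUT, i.e.
    // STDOUT is far smaller than STDIN -- the condition described above.
    bigInput.pipe("wc -l").collect().foreach(println)

    sc.stop()
  }
}
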
Given the similarity to SPARK-18406, this appears to be due to a race condition in accessing the block's reader lock. The PipedRDD implementation drains the parent iterator (which feeds the child process's STDIN) on a separate thread, which corroborates the race condition hypothesis; a rough sketch of that threading pattern is given below.

 

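For context, the threading pattern being described looks roughly like the sketch below. This is a simplification for discussion, not the actual PipedRDD.scala code: a dedicated thread drains the parent iterator to feed the child process's STDIN while the task thread consumes the child's STDOUT, so iterator-completion callbacks (including block lock release) can run on a different thread than the one that acquired the lock.

import java.io.PrintWriter
import scala.io.Source

// Simplified illustration of the PipedRDD threading pattern (not the real code).
def pipePattern(parent: Iterator[String], command: Seq[String]): Iterator[String] = {
  val proc = new ProcessBuilder(command: _*).start()

  // STDIN writer runs on its own thread, analogous to PipedRDD's stdin-writer thread.
  new Thread(s"stdin writer for ${command.mkString(" ")}") {
    override def run(): Unit = {
      val out = new PrintWriter(proc.getOutputStream)
      try parent.foreach(line => out.println(line)) // completion hooks on `parent` fire here
      finally out.close()
    }
  }.start()

  // The calling (task) thread consumes STDOUT; with a small output it finishes
  // quickly, while the writer thread may still be draining the parent iterator.
  Source.fromInputStream(proc.getInputStream).getLines()
}
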
    at scala.Predef$.assert(Predef.scala:170)
    at org.apache.spark.storage.BlockInfoManager.unlock(BlockInfoManager.scala:299)
    at org.apache.spark.storage.BlockManager.releaseLock(BlockManager.scala:842)
    at org.apache.spark.storage.BlockManager.releaseLockAndDispose(BlockManager.scala:1610)
    at org.apache.spark.storage.BlockManager$$anonfun$2.apply$mcV$sp(BlockManager.scala:621)
    at org.apache.spark.util.CompletionIterator$$anon$1.completion(CompletionIterator.scala:44)
    at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:33)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:462)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at scala.collection.Iterator$class.foreach(Iterator.scala:891)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
    at org.apache.spark.rdd.PipedRDD$$anon$3.run(PipedRDD.scala:145)
    Suppressed: java.lang.AssertionError: assertion failed
        at scala.Predef$.assert(Predef.scala:156)
        at org.apache.spark.storage.BlockInfo.checkInvariants(BlockInfoManager.scala:84)
        at org.apache.spark.storage.BlockInfo.readerCount_$eq(BlockInfoManager.scala:66)
        at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$2$$anonfun$apply$2.apply(BlockInfoManager.scala:363)
        at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$2$$anonfun$apply$2.apply(BlockInfoManager.scala:362)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$2.apply(BlockInfoManager.scala:362)
        at org.apache.spark.storage.BlockInfoManager$$anonfun$releaseAllLocksForTask$2.apply(BlockInfoManager.scala:358)
        at scala.collection.Iterator$class.foreach(Iterator.scala:891)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
        at org.apache.spark.storage.BlockInfoManager.releaseAllLocksForTask(BlockInfoManager.scala:358)
        at org.apache.spark.storage.BlockManager.releaseAllLocksForTask(BlockManager.scala:858)
        at org.apache.spark.executor.Executor$TaskRunner$$anonfun$1.apply$mcV$sp(Executor.scala:409)
        at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1369)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)


