[ https://issues.apache.org/jira/browse/SPARK-42834?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Li Ying closed SPARK-42834.
---------------------------

> Divided by zero occurs in PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-42834
>                 URL: https://issues.apache.org/jira/browse/SPARK-42834
>             Project: Spark
>          Issue Type: Bug
>          Components: Shuffle
>    Affects Versions: 3.2.0
>            Reporter: Li Ying
>            Priority: Major
>
> Sometimes when running a SQL job with push-based shuffle, an exception occurs as below. It seems that there is no element in the bitmaps array, which stores merged-chunk metadata.
> Is it a bug, i.e. should we not call createChunkBlockInfos when bitmaps is empty, or should bitmaps never be empty here?
>
> {code:java}
> Caused by: java.lang.ArithmeticException: / by zero
> 	at org.apache.spark.storage.PushBasedFetchHelper.createChunkBlockInfosFromMetaResponse(PushBasedFetchHelper.scala:117)
> 	at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:980)
> 	at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:84)
> {code}
> Related code:
> {code:java}
>   def createChunkBlockInfosFromMetaResponse(
>       shuffleId: Int,
>       shuffleMergeId: Int,
>       reduceId: Int,
>       blockSize: Long,
>       bitmaps: Array[RoaringBitmap]): ArrayBuffer[(BlockId, Long, Int)] = {
>     val approxChunkSize = blockSize / bitmaps.length
>     val blocksToFetch = new ArrayBuffer[(BlockId, Long, Int)]()
>     for (i <- bitmaps.indices) {
>       val blockChunkId = ShuffleBlockChunkId(shuffleId, shuffleMergeId, reduceId, i)
>       chunksMetaMap.put(blockChunkId, bitmaps(i))
>       logDebug(s"adding block chunk $blockChunkId of size $approxChunkSize")
>       blocksToFetch += ((blockChunkId, approxChunkSize, SHUFFLE_PUSH_MAP_ID))
>     }
>     blocksToFetch
>   }
> {code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
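The failure mode and the guard the reporter asks about can be sketched in isolation. The following is a minimal, hypothetical Scala sketch, not Spark source: `approxChunkSizes` is an illustrative helper that stands in for the division in `createChunkBlockInfosFromMetaResponse`, under the assumption that an empty meta response can be treated as "no chunks to fetch".

```scala
// Hypothetical sketch (not Spark code): isolates the blockSize / bitmaps.length
// division and adds the empty-bitmaps guard the reporter suggests.
object ChunkSizeSketch {
  def approxChunkSizes(blockSize: Long, numBitmaps: Int): Seq[Long] = {
    if (numBitmaps == 0) {
      // Empty meta response: return no chunks instead of dividing by zero.
      Seq.empty
    } else {
      val approxChunkSize = blockSize / numBitmaps  // safe: numBitmaps > 0
      Seq.fill(numBitmaps)(approxChunkSize)
    }
  }

  def main(args: Array[String]): Unit = {
    // blockSize = 120 split across 3 merged chunks: 40 each
    println(approxChunkSizes(120L, 3))  // List(40, 40, 40)
    println(approxChunkSizes(120L, 0))  // List(), no ArithmeticException
  }
}
```

Whether the right fix is this guard or an invariant ensuring `bitmaps` is never empty when the method is called is exactly the question the issue raises; the sketch only shows that the division itself is the unprotected step.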