kazuyukitanimura commented on code in PR #407:
URL: https://github.com/apache/datafusion-comet/pull/407#discussion_r1602251565
##########
spark/src/main/scala/org/apache/spark/sql/comet/execution/shuffle/CometShuffleExchangeExec.scala:
##########
@@ -482,8 +483,10 @@ class CometShuffleWriteProcessor(
         "elapsed_compute" -> metrics("shuffleReadElapsedCompute"))
       val nativeMetrics = CometMetricNode(nativeSQLMetrics)
-      val rawIter = cometRDD.iterator(partition, context)
-      val cometIter = CometExec.getCometIterator(Seq(rawIter), nativePlan, nativeMetrics)
+      val cometIter = CometExec.getCometIterator(
+        Seq(inputs.asInstanceOf[Iterator[ColumnarBatch]]),

Review Comment:
   We have `ShimCometShuffleWriteProcessor.write()` for Spark 3.x. Yes, currently this may not be correct for Spark 4.0; that will be followed up along with the test fixes. I will ask for your help on how to work around the interface change (or change Spark upstream).
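For anyone following along, below is a minimal, self-contained sketch of the shim pattern being discussed. The names (`WriteApi3x`, `WriteApi4x`, `CometWriteCore`, `ShimWriteProcessor3x`, `ShimWriteProcessor4x`) and their simplified signatures are hypothetical stand-ins, not the actual Spark `ShuffleWriteProcessor` or Comet APIs; they only illustrate how a per-version shim like `ShimCometShuffleWriteProcessor` can adapt one shared write path to two different `write()` interfaces.

```scala
// Illustrative sketch only: these traits stand in for Spark's per-version
// ShuffleWriteProcessor.write() signatures, which differ between 3.x and 4.0.
// All names and signatures here are hypothetical.
trait WriteApi3x {
  // "Old" style: the processor is handed partition data it must iterate itself.
  def write(partitionData: Iterable[Any], mapId: Long): Unit
}

trait WriteApi4x {
  // "New" style: the caller hands the input iterator over directly.
  def write(inputs: Iterator[Any], mapId: Long): Unit
}

// The version-agnostic core logic lives in one place...
abstract class CometWriteCore {
  protected def doWrite(inputs: Iterator[Any], mapId: Long): Unit = {
    // ...this is where the native plan / Comet iterator work would go.
    inputs.foreach(batch => println(s"map $mapId wrote $batch"))
  }
}

// ...and thin per-version shims adapt it to whichever signature Spark expects.
class ShimWriteProcessor3x extends CometWriteCore with WriteApi3x {
  override def write(partitionData: Iterable[Any], mapId: Long): Unit =
    doWrite(partitionData.iterator, mapId)
}

class ShimWriteProcessor4x extends CometWriteCore with WriteApi4x {
  override def write(inputs: Iterator[Any], mapId: Long): Unit =
    doWrite(inputs, mapId)
}

object ShimSketch {
  def main(args: Array[String]): Unit = {
    new ShimWriteProcessor3x().write(Seq("batch-0", "batch-1"), mapId = 7L)
    new ShimWriteProcessor4x().write(Iterator("batch-2"), mapId = 8L)
  }
}
```

The point of the pattern is that the shared core never compiles against either Spark signature directly, so only the thin shim classes need to live in per-version source directories when the upstream interface changes.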