akpatnam25 commented on PR #36601: URL: https://github.com/apache/spark/pull/36601#issuecomment-1139832538
@mridulm this is the test failure:

```
[error] /home/runner/work/spark/spark/core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala:831:46: type mismatch;
[error]  found   : Object
[error]  required: java.io.InputStream
[error]         input = streamWrapper(blockId, in)
[error]                                        ^
[error] /home/runner/work/spark/spark/core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala:892:20: value close is not a member of Object
[error]         in.close()
[error]            ^
[error] /home/runner/work/spark/spark/core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala:1321:43: type mismatch;
[error]  found   : Throwable
[error]  required: T
[error]         iterator.throwFetchFailedException(blockId, mapIndex, address, e, diagnosisResponse)
[error]                                            ^
[info] Throwable <: T?
[info] false
[error] three errors found
[error] (core / Compile / compileIncremental) Compilation failed
[error] Total time: 235 s (03:55), completed May 26, 2022 9:15:55 PM
[error] running /home/runner/work/spark/spark/build/sbt -Phadoop-3 -Pspark-ganglia-lgpl -Pmesos -Pkinesis-asl -Phive -Pdocker-integration-tests -Phive-thriftserver -Pkubernetes -Phadoop-cloud -Pyarn test:package streaming-kinesis-asl-assembly/assembly ; received return code 1
Error: Process completed with exit code 16.
```

I did not touch that code in this pull request, and the failure appears to predate my changes. I am not sure whether something is wrong with my GitHub Actions setup, though.

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org