wForget opened a new issue, #11539:
URL: https://github.com/apache/incubator-gluten/issues/11539
### Backend
VL (Velox)
### Bug description
When adding `spark.io.compression.codec=snappy` to the Spark job
configuration, the job fails with:
```
26/02/01 22:48:05 ERROR TaskResources: Task 2 failed by error:
java.lang.IllegalArgumentException: The value of spark.io.compression.codec should be one of lz4, zstd, but was snappy
	at org.apache.spark.shuffle.GlutenShuffleUtils$.checkCodecValues$1(GlutenShuffleUtils.scala:56)
	at org.apache.spark.shuffle.GlutenShuffleUtils$.getCompressionCodec(GlutenShuffleUtils.scala:81)
	at org.apache.spark.shuffle.GlutenShuffleUtils.getCompressionCodec(GlutenShuffleUtils.scala)
	at org.apache.spark.shuffle.writer.VeloxUniffleColumnarShuffleWriter.<init>(VeloxUniffleColumnarShuffleWriter.java:131)
	at org.apache.spark.shuffle.gluten.uniffle.UniffleShuffleManager.getWriter(UniffleShuffleManager.java:78)
	at org.apache.spark.shuffle.QiyiRssShuffleManager.getWriter(QiyiRssShuffleManager.java:249)
	at org.apache.spark.shuffle.ShuffleWriteProcessor.write(ShuffleWriteProcessor.scala:57)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:104)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:54)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:161)
	at org.apache.spark.scheduler.Task.run(Task.scala:141)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
```
Setting `spark.gluten.sql.columnar.shuffle.codec=zstd` works around the
problem, but I think Gluten should not fail the job; instead, it should fall
back the shuffle exchange to vanilla Spark when the codec is unsupported.
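To illustrate the proposed behavior (this is a hedged sketch, not the actual Gluten API; the class and method names below are hypothetical): instead of `checkCodecValues` throwing an `IllegalArgumentException` at writer-construction time, a resolver could report an unsupported codec so the planner can keep the shuffle exchange on the row-based path:

```java
import java.util.Locale;
import java.util.Optional;
import java.util.Set;

public class CodecFallbackSketch {
    // Codecs the columnar shuffle writer accepts, per the error message above.
    private static final Set<String> SUPPORTED = Set.of("lz4", "zstd");

    /**
     * Returns the codec if the columnar shuffle supports it, or empty to
     * signal that the caller should fall back to vanilla Spark shuffle
     * rather than failing the task.
     */
    public static Optional<String> resolveColumnarCodec(String sparkIoCodec) {
        String codec = sparkIoCodec.toLowerCase(Locale.ROOT);
        return SUPPORTED.contains(codec) ? Optional.of(codec) : Optional.empty();
    }

    public static void main(String[] args) {
        // snappy is unsupported: signal fallback instead of throwing.
        System.out.println(resolveColumnarCodec("snappy").isEmpty());
        // zstd is supported: the columnar shuffle can be used.
        System.out.println(resolveColumnarCodec("zstd").orElse("fallback"));
    }
}
```

The key design point is that the check runs during planning (where a fallback is still possible) rather than inside the writer constructor, where the only option left is to throw.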
### Gluten version
Gluten-1.5
### Spark version
Spark-3.5.x
### Spark configurations
_No response_
### System information
_No response_
### Relevant logs
_No response_
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]