ChenRussell opened a new issue, #2082:
URL: https://github.com/apache/incubator-uniffle/issues/2082

   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://www.apache.org/foundation/policies/conduct)
   
   
   ### Search before asking
   
   - [X] I have searched in the 
[issues](https://github.com/apache/incubator-uniffle/issues?q=is%3Aissue) and 
found no similar issues.
   
   
   ### Describe the bug
   
   I am using OpenJDK 11 in the Spark image, and Spark tasks fail when reading 
shuffle data from the Uniffle server. Here is the executor task error log:
   
![image](https://github.com/user-attachments/assets/a9d49e20-50a4-4c8f-8873-e9329b6fa744)
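
   For context, the `duplicate or slice` message at the bottom of the trace comes from the JDK itself: since JDK 9, `sun.misc.Unsafe.invokeCleaner` rejects any direct buffer that is a view (duplicate or slice) of another buffer, because only the origin buffer owns the cleaner. A minimal standalone sketch of that JDK behavior (this is not Uniffle code; `InvokeCleanerRepro` and `tryFree` are names made up for illustration):

   ```java
   import java.lang.reflect.Field;
   import java.nio.ByteBuffer;

   import sun.misc.Unsafe;

   public class InvokeCleanerRepro {
       // Attempts to free a direct buffer's native memory eagerly;
       // returns the JDK's complaint message if the buffer is a view.
       static String tryFree(ByteBuffer buf) throws Exception {
           Field f = Unsafe.class.getDeclaredField("theUnsafe");
           f.setAccessible(true);
           Unsafe unsafe = (Unsafe) f.get(null);
           try {
               unsafe.invokeCleaner(buf); // only works on the origin buffer
               return "freed";
           } catch (IllegalArgumentException e) {
               return e.getMessage(); // "duplicate or slice" for buffer views
           }
       }

       public static void main(String[] args) throws Exception {
           ByteBuffer origin = ByteBuffer.allocateDirect(16);
           System.out.println(tryFree(origin.slice())); // duplicate or slice
           System.out.println(tryFree(origin));         // freed
       }
   }
   ```

   This suggests `RssUtils.releaseByteBuffer` is being handed a sliced/duplicated buffer rather than the origin direct buffer on this path.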
   
   
   ### Affects Version(s)
   
   0.9.0
   
   ### Uniffle Server Log Output
   
   ```
   no error message
   ```
   
   
   ### Uniffle Engine Log Output
   
   ```
   java.lang.reflect.InvocationTargetException
       at java.base/jdk.internal.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
       at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
       at java.base/java.lang.reflect.Method.invoke(Unknown Source)
       at org.apache.uniffle.io.netty.util.internal.CleanerJava9.freeDirectBuffer(CleanerJava9.java:88)
       at org.apache.uniffle.io.netty.util.internal.PlatformDependent.freeDirectBuffer(PlatformDependent.java:521)
       at org.apache.uniffle.common.util.RssUtils.releaseByteBuffer(RssUtils.java:414)
       at org.apache.uniffle.client.impl.ShuffleReadClientImpl.read(ShuffleReadClientImpl.java:314)
       at org.apache.uniffle.client.impl.ShuffleReadClientImpl.readShuffleBlockData(ShuffleReadClientImpl.java:216)
       at org.apache.spark.shuffle.reader.RssShuffleDataIterator.hasNext(RssShuffleDataIterator.java:115)
       at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
       at org.apache.spark.shuffle.reader.RssShuffleReader$MultiPartitionIterator.hasNext(RssShuffleReader.java:307)
       at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
       at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.hashAgg_doAggregateWithKeys_0$(Unknown Source)
       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(Unknown Source)
       at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
       at org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
       at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
       at scala.collection.Iterator.isEmpty(Iterator.scala:387)
       at scala.collection.Iterator.isEmpty$(Iterator.scala:387)
       at scala.collection.AbstractIterator.isEmpty(Iterator.scala:1431)
       at scala.collection.TraversableOnce.nonEmpty(TraversableOnce.scala:143)
       at scala.collection.TraversableOnce.nonEmpty$(TraversableOnce.scala:143)
       at scala.collection.AbstractIterator.nonEmpty(Iterator.scala:1431)
       at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2(RDD.scala:1559)
       at org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2$adapted(RDD.scala:1558)
       at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:910)
       at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:910)
       at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
       at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
       at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
       at org.apache.spark.scheduler.Task.run(Task.scala:141)
       at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
       at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
       at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
       at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
       at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
       at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
       at java.base/java.lang.Thread.run(Unknown Source)
       Suppressed: org.apache.spark.util.TaskCompletionListenerException: null

   Previous exception in task: null
       java.base/jdk.internal.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
       java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
       java.base/java.lang.reflect.Method.invoke(Unknown Source)
       org.apache.uniffle.io.netty.util.internal.CleanerJava9.freeDirectBuffer(CleanerJava9.java:88)
       org.apache.uniffle.io.netty.util.internal.PlatformDependent.freeDirectBuffer(PlatformDependent.java:521)
       org.apache.uniffle.common.util.RssUtils.releaseByteBuffer(RssUtils.java:414)
       org.apache.uniffle.client.impl.ShuffleReadClientImpl.read(ShuffleReadClientImpl.java:314)
       org.apache.uniffle.client.impl.ShuffleReadClientImpl.readShuffleBlockData(ShuffleReadClientImpl.java:216)
       org.apache.spark.shuffle.reader.RssShuffleDataIterator.hasNext(RssShuffleDataIterator.java:115)
       org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
       org.apache.spark.shuffle.reader.RssShuffleReader$MultiPartitionIterator.hasNext(RssShuffleReader.java:307)
       org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
       scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
       org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.hashAgg_doAggregateWithKeys_0$(Unknown Source)
       org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage2.processNext(Unknown Source)
       org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
       org.apache.spark.sql.execution.WholeStageCodegenEvaluatorFactory$WholeStageCodegenPartitionEvaluator$$anon$1.hasNext(WholeStageCodegenEvaluatorFactory.scala:43)
       scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
       scala.collection.Iterator.isEmpty(Iterator.scala:387)
       scala.collection.Iterator.isEmpty$(Iterator.scala:387)
       scala.collection.AbstractIterator.isEmpty(Iterator.scala:1431)
       scala.collection.TraversableOnce.nonEmpty(TraversableOnce.scala:143)
       scala.collection.TraversableOnce.nonEmpty$(TraversableOnce.scala:143)
       scala.collection.AbstractIterator.nonEmpty(Iterator.scala:1431)
       org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2(RDD.scala:1559)
       org.apache.spark.rdd.RDD.$anonfun$takeOrdered$2$adapted(RDD.scala:1558)
       org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:910)
       org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:910)
       org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
       org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
       org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
       org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
       org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
       org.apache.spark.scheduler.Task.run(Task.scala:141)
       org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:620)
       org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
       org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
       org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
       org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:623)
       java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
       java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
       java.base/java.lang.Thread.run(Unknown Source)
           at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:254)
           at org.apache.spark.TaskContextImpl.invokeTaskCompletionListeners(TaskContextImpl.scala:144)
           at org.apache.spark.TaskContextImpl.markTaskCompleted(TaskContextImpl.scala:137)
           at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:177)
           ... 9 more
           Suppressed: java.lang.reflect.InvocationTargetException
               at java.base/jdk.internal.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
               at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
               at java.base/java.lang.reflect.Method.invoke(Unknown Source)
               at org.apache.uniffle.io.netty.util.internal.CleanerJava9.freeDirectBuffer(CleanerJava9.java:88)
               at org.apache.uniffle.io.netty.util.internal.PlatformDependent.freeDirectBuffer(PlatformDependent.java:521)
               at org.apache.uniffle.common.util.RssUtils.releaseByteBuffer(RssUtils.java:414)
               at org.apache.uniffle.client.impl.ShuffleReadClientImpl.close(ShuffleReadClientImpl.java:335)
               at org.apache.spark.shuffle.reader.RssShuffleDataIterator.cleanup(RssShuffleDataIterator.java:217)
               at org.apache.spark.shuffle.reader.RssShuffleReader$MultiPartitionIterator.lambda$new$0(RssShuffleReader.java:284)
               at scala.Function0.apply$mcV$sp(Function0.scala:39)
               at org.apache.spark.util.CompletionIterator$$anon$1.completion(CompletionIterator.scala:47)
               at org.apache.spark.shuffle.reader.RssShuffleReader$MultiPartitionIterator.lambda$new$1(RssShuffleReader.java:296)
               at org.apache.spark.TaskContextImpl.$anonfun$invokeTaskCompletionListeners$1(TaskContextImpl.scala:144)
               at org.apache.spark.TaskContextImpl.$anonfun$invokeTaskCompletionListeners$1$adapted(TaskContextImpl.scala:144)
               at org.apache.spark.TaskContextImpl.invokeListeners(TaskContextImpl.scala:199)
               ... 12 more
           Caused by: java.lang.IllegalArgumentException: duplicate or slice
               at jdk.unsupported/sun.misc.Unsafe.invokeCleaner(Unknown Source)
               ... 27 more
   Caused by: java.lang.IllegalArgumentException: duplicate or slice
       at jdk.unsupported/sun.misc.Unsafe.invokeCleaner(Unknown Source)
       ... 42 more
   ```
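
   The trace shows the `IllegalArgumentException` from `Unsafe.invokeCleaner` surfacing through Netty's reflective `CleanerJava9.freeDirectBuffer` call as an `InvocationTargetException`, and it fires both on the read path and again in the task-completion cleanup. A possible defensive pattern (purely a sketch; `SafeBufferRelease` and `release` are hypothetical names, not the actual Uniffle fix, and Uniffle really goes through Netty's `PlatformDependent.freeDirectBuffer` rather than calling `Unsafe` directly) would be to skip eager freeing for buffer views instead of letting the exception fail the task:

   ```java
   import java.lang.reflect.Field;
   import java.nio.ByteBuffer;

   import sun.misc.Unsafe;

   // Hypothetical helper illustrating a guard around the cleaner call;
   // the real change would belong in RssUtils.releaseByteBuffer or its caller.
   public final class SafeBufferRelease {
       private static final Unsafe UNSAFE;

       static {
           try {
               Field f = Unsafe.class.getDeclaredField("theUnsafe");
               f.setAccessible(true);
               UNSAFE = (Unsafe) f.get(null);
           } catch (ReflectiveOperationException e) {
               throw new ExceptionInInitializerError(e);
           }
       }

       /**
        * Returns true if the buffer's native memory was freed eagerly.
        * Duplicates/slices return false: only their origin buffer owns the
        * cleaner, so their memory is reclaimed when the origin is GC'd.
        */
       public static boolean release(ByteBuffer buf) {
           if (buf == null || !buf.isDirect()) {
               return false; // nothing to free eagerly
           }
           try {
               UNSAFE.invokeCleaner(buf);
               return true;
           } catch (IllegalArgumentException duplicateOrSlice) {
               return false; // "duplicate or slice": skip instead of failing the task
           }
       }
   }
   ```

   Alternatively, the caller could keep a reference to the origin buffer and release only that, which avoids the exception entirely.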
   
   
   ### Uniffle Server Configurations
   
   ```yaml
   rss.rpc.server.type GRPC_NETTY
   ...
   ```
   
   
   ### Uniffle Engine Configurations
   
   ```yaml
   spark.rss.client.type=GRPC_NETTY
   spark.rss.client.netty.io.mode=EPOLL
   spark.rss.storage.type=MEMORY_LOCALFILE
   ...
   ```
   
   
   ### Additional context
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

