LuciferYang commented on PR #48401:
URL: https://github.com/apache/spark/pull/48401#issuecomment-2504353098

   After this PR was merged, the Maven daily tests started to fail:
   - Java 17: 
https://github.com/apache/spark/actions/runs/12031263481/job/33540406301
   - Java 21: 
https://github.com/apache/spark/actions/runs/12032230969/job/33543580253
   
   
![image](https://github.com/user-attachments/assets/992bacb5-1601-489c-81f7-ce17d0911a40)
   
   
   The following steps confirm that this PR caused the test failures:
   
   - Before this PR:
   
   ```
   git reset --hard f7122137006e941393c8be619fb51b3b713a24cb # the commit just before this PR: [SPARK-50415][BUILD] Upgrade `zstd-jni` to 1.5.6-8
   build/mvn clean install -DskipTests -pl sql/core -am
   build/mvn test -pl sql/core -DwildcardSuites=none 
-Dtest=test.org.apache.spark.sql.JavaDatasetSuite
   
   [INFO] Running test.org.apache.spark.sql.JavaDatasetSuite
   00:29:17.513 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
   
   [INFO] Tests run: 47, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 
11.54 s -- in test.org.apache.spark.sql.JavaDatasetSuite
   [INFO] 
   [INFO] Results:
   [INFO] 
   [INFO] Tests run: 47, Failures: 0, Errors: 0, Skipped: 0
   ```
   
   - After this PR:
   
   ```
   git reset --hard 69d433bcfd5a2d69f3cd7f8c4e310a3b5854fc74 # this PR: [SPARK-50387][SS] Update condition for timer expiry and relevant test
   build/mvn clean install -DskipTests -pl sql/core -am
   build/mvn test -pl sql/core -DwildcardSuites=none 
-Dtest=test.org.apache.spark.sql.JavaDatasetSuite
   
   [INFO] Running test.org.apache.spark.sql.JavaDatasetSuite
   00:40:09.702 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
   
   00:40:16.384 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 
in stage 4.0 (TID 3)
   java.lang.NoSuchMethodError: 'void 
org.apache.spark.util.NonFateSharingCache.<init>(com.google.common.cache.Cache)'
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$.<clinit>(RocksDBStateStoreProvider.scala:623)
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndProcessData(TransformWithStateExec.scala:636)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$1(TransformWithStateExec.scala:571)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$1$adapted(TransformWithStateExec.scala:549)
        at 
org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper$StateStoreAwareZipPartitionsRDD.compute(StreamingSymmetricHashJoinHelper.scala:295)
   00:40:16.384 ERROR org.apache.spark.executor.Executor: Exception in task 3.0 
in stage 4.0 (TID 4)
   java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndProcessData(TransformWithStateExec.scala:636)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$1(TransformWithStateExec.scala:571)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$1$adapted(TransformWithStateExec.scala:549)
        at 
org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper$StateStoreAwareZipPartitionsRDD.compute(StreamingSymmetricHashJoinHelper.scala:295)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:374)
   00:40:16.393 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 1.0 
in stage 4.0 (TID 3) (localhost executor driver): java.lang.NoSuchMethodError: 
'void 
org.apache.spark.util.NonFateSharingCache.<init>(com.google.common.cache.Cache)'
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$.<clinit>(RocksDBStateStoreProvider.scala:623)
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(Stat...
   
   00:40:16.394 ERROR org.apache.spark.scheduler.TaskSetManager: Task 1 in 
stage 4.0 failed 1 times; aborting job
   
   00:40:16.394 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 3.0 
in stage 4.0 (TID 4) (localhost executor driver): 
java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndP...
   
   00:40:16.399 ERROR org.apache.spark.executor.Executor: Exception in task 4.0 
in stage 4.0 (TID 5)
   java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndProcessData(TransformWithStateExec.scala:636)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$1(TransformWithStateExec.scala:571)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$1$adapted(TransformWithStateExec.scala:549)
        at 
org.apache.spark.sql.execution.streaming.StreamingSymmetricHashJoinHelper$StateStoreAwareZipPartitionsRDD.compute(StreamingSymmetricHashJoinHelper.scala:295)
        at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:374)
   00:40:20.421 ERROR org.apache.spark.executor.Executor: Exception in task 3.0 
in stage 2.0 (TID 3)
   java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndProcessData(TransformWithStateExec.scala:636)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$5(TransformWithStateExec.scala:597)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$5$adapted(TransformWithStateExec.scala:596)
        at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:918)
        at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:918)
   00:40:20.421 ERROR org.apache.spark.executor.Executor: Exception in task 1.0 
in stage 2.0 (TID 2)
   java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndProcessData(TransformWithStateExec.scala:636)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$5(TransformWithStateExec.scala:597)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.$anonfun$doExecute$5$adapted(TransformWithStateExec.scala:596)
        at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:918)
        at 
org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:918)
   00:40:20.422 WARN org.apache.spark.scheduler.TaskSetManager: Lost task 3.0 
in stage 2.0 (TID 3) (localhost executor driver): 
java.lang.NoClassDefFoundError: Could not initialize class 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider$
        at 
org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider.init(RocksDBStateStoreProvider.scala:393)
        at 
org.apache.spark.sql.execution.streaming.state.StateStoreProvider$.createAndInit(StateStore.scala:499)
        at 
org.apache.spark.sql.execution.streaming.TransformWithStateExec.initNewStateStoreAndP...
   
   00:40:20.422 ERROR org.apache.spark.scheduler.TaskSetManager: Task 3 in 
stage 2.0 failed 1 times; aborting job
   
   00:40:20.423 ERROR org.apache.spark.scheduler.TaskSchedulerImpl: Exception 
in statusUpdate
   java.util.concurrent.RejectedExecutionException: Task 
org.apache.spark.scheduler.TaskResultGetter$$Lambda$4365/0x000000700221fb70@74f74897
 rejected from java.util.concurrent.ThreadPoolExecutor@36d4df95[Terminated, 
pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 4]
        at 
java.base/java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2065)
        at 
java.base/java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:833)
        at 
java.base/java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1365)
        at 
org.apache.spark.scheduler.TaskResultGetter.enqueueFailedTask(TaskResultGetter.scala:140)
        at 
org.apache.spark.scheduler.TaskSchedulerImpl.liftedTree2$1(TaskSchedulerImpl.scala:813)
        at 
org.apache.spark.scheduler.TaskSchedulerImpl.statusUpdate(TaskSchedulerImpl.scala:786)
        at 
org.apache.spark.scheduler.local.LocalEndpoint$$anonfun$receive$1.applyOrElse(LocalSchedulerBackend.scala:73)
   [ERROR] Tests run: 47, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 
11.15 s <<< FAILURE! -- in test.org.apache.spark.sql.JavaDatasetSuite
   ```
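   A note on the log pattern above: only the first failing task shows the root `NoSuchMethodError` (thrown from `RocksDBStateStoreProvider$.<clinit>`); every later task shows `NoClassDefFoundError: Could not initialize class ...` instead. This is standard JVM behavior when a static initializer throws. A minimal sketch (not Spark code; the simulated error message is invented for illustration):
   
   ```java
   // When a class's static initializer throws an Error, the JVM rethrows it
   // as-is to the first user and marks the class as erroneous; every later
   // use gets NoClassDefFoundError "Could not initialize class ...".
   public class ClinitDemo {
       static class Broken {
           static {
               // Simulates the binary-incompatibility error seen in the log.
               if (true) throw new NoSuchMethodError(
                   "simulated: NonFateSharingCache.<init>(Cache)");
           }
           static void use() {}
       }
   
       static String trigger() {
           try {
               Broken.use();
               return "ok";
           } catch (Throwable t) {
               return t.getClass().getSimpleName();
           }
       }
   
       // Evaluated once, in order: first use vs. every subsequent use.
       static final String FIRST = trigger();   // NoSuchMethodError
       static final String SECOND = trigger();  // NoClassDefFoundError
   
       public static void main(String[] args) {
           System.out.println(FIRST);
           System.out.println(SECOND);
       }
   }
   ```
   
   So the `NoClassDefFoundError`s are all downstream symptoms; only the first `NoSuchMethodError` identifies the real problem, which here looks like `sql/core` being compiled against a `NonFateSharingCache` constructor signature that no longer exists at runtime.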
   
   
   @ericm-db Could you help fix the issue mentioned above?
   Also cc @HeartSaVioR 
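   As an aside, the manual reset-and-retest comparison above can be automated with `git bisect run`. A toy sketch on a throwaway repo (in the real case the test script would run the `build/mvn test -pl sql/core -DwildcardSuites=none -Dtest=test.org.apache.spark.sql.JavaDatasetSuite` command shown above, and the good/bad endpoints would be the two commits compared):
   
   ```shell
   set -e
   # Throwaway repo standing in for the Spark tree.
   repo=$(mktemp -d); check=$(mktemp)
   cd "$repo"
   git init -q
   git config user.email demo@example.com
   git config user.name demo
   echo ok > state
   git add state
   git commit -qm "good: last passing commit"
   good=$(git rev-parse HEAD)
   git commit --allow-empty -qm "unrelated change"
   echo broken > state
   git commit -qam "bad: regresses the suite"
   git commit --allow-empty -qm "later commit"
   # Stand-in for the real test command (exit 0 = pass, exit 1 = fail):
   printf '#!/bin/sh\ngrep -q ok state\n' > "$check"
   git bisect start HEAD "$good"
   out=$(git bisect run sh "$check")
   echo "$out" | grep "is the first bad commit"
   ```
   
   `git bisect run` retests midpoints automatically and reports the first commit whose test fails, which is handy when the regression window spans more than two commits.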


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
