zhztheplayer opened a new issue, #6411:
URL: https://github.com/apache/incubator-gluten/issues/6411
### Problem description
Dynamic build against the Velox backend is currently broken due to several issues.
1. `ld` failure
https://github.com/apache/incubator-gluten/pull/6231#issuecomment-2205284650
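   The linked comment carries the actual linker output. As a generic, hedged illustration of this class of failure (not the Gluten-specific one), an unresolved symbol in a shared-library build can be reproduced like this; all file and symbol names below are hypothetical:

   ```shell
   # Generic demo of an `ld` undefined-reference failure in a shared build.
   # The real failure and its symbols are in the linked PR comment.
   cat > demo_lib.c <<'EOF'
   extern int missing_dependency(void);  /* declared, never defined anywhere */
   int entry(void) { return missing_dependency(); }
   EOF
   # --no-undefined makes the shared link fail eagerly at link time
   # instead of deferring the error to dlopen/runtime:
   if ! gcc -shared -fPIC -Wl,--no-undefined demo_lib.c -o libdemo.so 2> ld_err.txt; then
     grep "missing_dependency" ld_err.txt  # ld names the unresolved symbol
   fi
   ```

   Without `--no-undefined`, the same library would link fine and fail only when loaded, which is one reason dynamic builds surface linker problems that static builds mask.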
2. Crash when using Parquet write
```
Stack: [0x00007f02a1c6e000,0x00007f02a1d6f000], sp=0x00007f02a1d6b5a0, free space=1013k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
C  [libc.so.6+0x9a6f0]  cfree+0x20
Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
j  org.apache.gluten.vectorized.ColumnarBatchOutIterator.nativeHasNext(J)Z+0
j  org.apache.gluten.vectorized.ColumnarBatchOutIterator.hasNextInternal()Z+5
j  org.apache.gluten.vectorized.GeneralOutIterator.hasNext()Z+1
J 9874 C2 org.apache.gluten.utils.iterator.IteratorsV1$IteratorCompleter.hasNext()Z (23 bytes) @ 0x00007f04e2a40310 [0x00007f04e2a40140+0x1d0]
j  org.apache.gluten.utils.iterator.IteratorsV1$PayloadCloser.hasNext()Z+8
j  org.apache.gluten.utils.iterator.IteratorsV1$LifeTimeAccumulator.hasNext()Z+4
j  org.apache.spark.sql.execution.VeloxColumnarWriteFilesRDD.$anonfun$compute$2(Lorg/apache/spark/sql/execution/VeloxColumnarWriteFilesRDD;Ljava/lang/String;Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;Lscala/runtime/ObjectRef;Lorg/apache/spark/sql/execution/SparkWriteFilesCommitProtocol;)V+36
j  org.apache.spark.sql.execution.VeloxColumnarWriteFilesRDD$$Lambda$3743.apply$mcV$sp()V+24
J 6450 C1 scala.runtime.java8.JFunction0$mcV$sp.apply()Ljava/lang/Object; (10 bytes) @ 0x00007f04e193264c [0x00007f04e1932540+0x10c]
j  org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Lscala/Function0;Lscala/Function0;Lscala/Function0;)Ljava/lang/Object;+4
j  org.apache.spark.sql.execution.VeloxColumnarWriteFilesRDD.compute(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;+94
j  org.apache.spark.rdd.RDD.computeOrReadCheckpoint(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;+24
j  org.apache.spark.rdd.RDD.iterator(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;+40
j  org.apache.spark.scheduler.ResultTask.runTask(Lorg/apache/spark/TaskContext;)Ljava/lang/Object;+201
j  org.apache.spark.TaskContext.runTaskWithListeners(Lorg/apache/spark/scheduler/Task;)Ljava/lang/Object;+2
j  org.apache.spark.scheduler.Task.run(JILorg/apache/spark/metrics/MetricsSystem;ILscala/collection/immutable/Map;Lscala/Option;)Ljava/lang/Object;+254
j  org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Lorg/apache/spark/executor/Executor$TaskRunner;Lscala/runtime/BooleanRef;)Ljava/lang/Object;+43
j  org.apache.spark.executor.Executor$TaskRunner$$Lambda$1599.apply()Ljava/lang/Object;+8
j  org.apache.spark.util.Utils$.tryWithSafeFinally(Lscala/Function0;Lscala/Function0;)Ljava/lang/Object;+4
j  org.apache.spark.executor.Executor$TaskRunner.run()V+457
j  java.util.concurrent.ThreadPoolExecutor.runWorker(Ljava/util/concurrent/ThreadPoolExecutor$Worker;)V+95
j  java.util.concurrent.ThreadPoolExecutor$Worker.run()V+5
j  java.lang.Thread.run()V+11
v  ~StubRoutines::call_stub
```
### System information
```
Velox System Info v0.0.2
Commit: e2ad9148b7dfa114a1bbb72d94db0d2f8e5cb26e
CMake Version: 3.29.2
System: Linux-5.4.0-156-generic
Arch: x86_64
CPU Name: Model name: Intel(R) Xeon(R) Platinum 8280 CPU @ 2.70GHz
C++ Compiler: /usr/bin/c++
C++ Compiler Version: 9.4.0
C Compiler: /usr/bin/cc
C Compiler Version: 12.0.0
CMake Prefix Path: /usr/local;/usr;/;/opt/cmake;/usr/local;/usr/X11R6;/usr/pkg;/opt
```
### CMake log
_No response_
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at: [email protected]