SonixLegend opened a new issue, #5531:
URL: https://github.com/apache/incubator-gluten/issues/5531

   ### Backend
   
   VL (Velox)
   
   ### Bug description
   
   ## Expected behavior
   The Spark job starts successfully with the Gluten plugin enabled.
   ## Actual behavior
   While loading the plugin's native library, the JVM crashes inside libgluten.so.
   [Uploading hs_err_pid2751.log…]()
   
   
   ### Spark version
   
   3.5.1 (see the configurations below)
   
   ### Spark configurations
   
   # hadoop version 3.4.0
   # spark version 3.5.1
   spark.master spark://legend1:7077
   spark.submit.deployMode client
   spark.eventLog.enabled true
   spark.eventLog.dir hdfs://legend1:8020/spark
   spark.serializer org.apache.spark.serializer.KryoSerializer
   spark.kryoserializer.buffer.max 512M
   spark.rpc.message.maxSize 500
   spark.shuffle.service.enabled true
   spark.dynamicAllocation.enabled true
   spark.dynamicAllocation.shuffleTracking.enabled true
   spark.dynamicAllocation.initialExecutors 2
   spark.dynamicAllocation.minExecutors 2
   spark.dynamicAllocation.maxExecutors 2
   spark.default.parallelism 4
   spark.driver.cores 2
   spark.driver.memory 4G
   spark.driver.memoryOverhead 1G
   spark.driver.maxResultSize 2G
   spark.driver.extraJavaOptions -XX:+UseG1GC -Djava.library.path=/home/sonix/hadoop-3.4.0/lib/native --add-exports java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
   spark.driver.extraClassPath /home/sonix/gluten-velox-bundle-spark3.5_2.12-debian_12_x86_64-1.2.0-SNAPSHOT.jar
   spark.executor.cores 2
   spark.executor.memory 4G
   spark.executor.memoryOverhead 1G
   spark.executor.extraJavaOptions -XX:+UseG1GC -Djava.library.path=/home/sonix/hadoop-3.4.0/lib/native --add-exports java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
   spark.executor.extraClassPath /home/sonix/gluten-velox-bundle-spark3.5_2.12-debian_12_x86_64-1.2.0-SNAPSHOT.jar
   spark.memory.offHeap.enabled true
   spark.memory.offHeap.size 4G
   spark.shuffle.manager org.apache.spark.shuffle.sort.ColumnarShuffleManager
   spark.plugins org.apache.gluten.GlutenPlugin
   spark.sql.execution.arrow.pyspark.enabled true
   spark.sql.execution.arrow.pyspark.fallback.enabled true
   spark.sql.warehouse.dir hdfs://legend1:8020/spark
   spark.history.fs.logDirectory hdfs://legend1:8020/spark
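
   For reference, the configuration above would be exercised by a `spark-submit` invocation along the following lines. This is a sketch, not the reporter's exact command: the SparkPi class comes from the stack trace, the master URL and jar path from the config above, while `SPARK_HOME`, the examples jar name, and the choice of `--driver-class-path` are assumptions. The command is printed rather than executed so it can be reviewed first.

   ```shell
   # Sketch of a spark-submit matching the spark-defaults.conf above.
   # Assumptions: SPARK_HOME location and examples jar name.
   GLUTEN_JAR=/home/sonix/gluten-velox-bundle-spark3.5_2.12-debian_12_x86_64-1.2.0-SNAPSHOT.jar
   SPARK_HOME=${SPARK_HOME:-/opt/spark}   # adjust to your install
   CMD="$SPARK_HOME/bin/spark-submit \
   --master spark://legend1:7077 \
   --deploy-mode client \
   --class org.apache.spark.examples.SparkPi \
   --conf spark.plugins=org.apache.gluten.GlutenPlugin \
   --conf spark.memory.offHeap.enabled=true \
   --conf spark.memory.offHeap.size=4G \
   --driver-class-path $GLUTEN_JAR \
   --conf spark.executor.extraClassPath=$GLUTEN_JAR \
   $SPARK_HOME/examples/jars/spark-examples_2.12-3.5.1.jar 100"
   # Print the assembled command instead of running it.
   echo "$CMD"
   ```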
   
   ### System information
   
   Velox System Info v0.0.2
   Commit: 75fcad5aeedff6dce77d4759aed1d22332b2a72d
   CMake Version: 3.25.1
   System: Linux-5.15.90.1-microsoft-standard-WSL2
   Arch: x86_64
   C++ Compiler: /usr/bin/c++
   C++ Compiler Version: 12.2.0
   C Compiler: /usr/bin/cc
   C Compiler Version: 12.2.0
   CMake Prefix Path: /usr/local;/usr;/;/usr;/usr/local;/usr/X11R6;/usr/pkg;/opt
   
   ### Relevant logs
   
   ```bash
   Current thread (0x00007f405802f2a0):  JavaThread "main" [_thread_in_native, id=2780, stack(0x00007f405fc21000,0x00007f405fd21000) (1024K)]

   Stack: [0x00007f405fc21000,0x00007f405fd21000],  sp=0x00007f405fd1c840,  free space=1006k
   Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
   C  [libgluten.so+0x40a946]  google::protobuf::internal::OnShutdownRun(void (*)(void const*), void const*)+0x96
   Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
   j  jdk.internal.loader.NativeLibraries.load(Ljdk/internal/loader/NativeLibraries$NativeLibraryImpl;Ljava/lang/String;ZZ)Z+0 [email protected]
   j  jdk.internal.loader.NativeLibraries$NativeLibraryImpl.open()Z+57 [email protected]
   j  jdk.internal.loader.NativeLibraries.loadLibrary(Ljava/lang/Class;Ljava/lang/String;Z)Ljdk/internal/loader/NativeLibrary;+254 [email protected]
   j  jdk.internal.loader.NativeLibraries.loadLibrary(Ljava/lang/Class;Ljava/io/File;)Ljdk/internal/loader/NativeLibrary;+51 [email protected]
   j  java.lang.ClassLoader.loadLibrary(Ljava/lang/Class;Ljava/io/File;)Ljdk/internal/loader/NativeLibrary;+31 [email protected]
   j  java.lang.Runtime.load0(Ljava/lang/Class;Ljava/lang/String;)V+61 [email protected]
   j  java.lang.System.load(Ljava/lang/String;)V+7 [email protected]
   j  org.apache.gluten.vectorized.JniLibLoader.loadFromPath0(Ljava/lang/String;Z)V+27
   j  org.apache.gluten.vectorized.JniLibLoader$JniLoadTransaction.loadWithLink(Ljava/lang/String;Lorg/apache/gluten/vectorized/JniLibLoader$LoadAction;)V+13
   j  org.apache.gluten.vectorized.JniLibLoader$JniLoadTransaction.lambda$commit$1(Lorg/apache/gluten/vectorized/JniLibLoader$LoadAction;)V+23
   j  org.apache.gluten.vectorized.JniLibLoader$JniLoadTransaction$$Lambda+0x00007f3fe8548650.accept(Ljava/lang/Object;)V+8
   j  java.util.ArrayList.forEach(Ljava/util/function/Consumer;)V+46 [email protected]
   j  org.apache.gluten.vectorized.JniLibLoader$JniLoadTransaction.commit()V+46
   j  org.apache.gluten.vectorized.JniLibLoader.mapAndLoad(Ljava/lang/String;Z)V+9
   j  org.apache.gluten.backendsapi.velox.VeloxListenerApi.initialize(Lorg/apache/spark/SparkConf;)V+170
   j  org.apache.gluten.backendsapi.velox.VeloxListenerApi.onDriverStart(Lorg/apache/spark/SparkConf;)V+49
   j  org.apache.gluten.GlutenDriverPlugin.init(Lorg/apache/spark/SparkContext;Lorg/apache/spark/api/plugin/PluginContext;)Ljava/util/Map;+74
   j  org.apache.spark.internal.plugin.DriverPluginContainer.$anonfun$driverPlugins$1(Lorg/apache/spark/internal/plugin/DriverPluginContainer;Lorg/apache/spark/api/plugin/SparkPlugin;)Lscala/collection/Iterable;+77
   j  org.apache.spark.internal.plugin.DriverPluginContainer$$Lambda+0x00007f3fe84e24d0.apply(Ljava/lang/Object;)Ljava/lang/Object;+8
   j  scala.collection.TraversableLike.$anonfun$flatMap$1(Lscala/collection/mutable/Builder;Lscala/Function1;Ljava/lang/Object;)Lscala/collection/mutable/Builder;+3
   j  scala.collection.TraversableLike$$Lambda+0x00007f3fe828dfd0.apply(Ljava/lang/Object;)Ljava/lang/Object;+9
   j  scala.collection.mutable.ResizableArray.foreach(Lscala/Function1;)V+23
   j  scala.collection.mutable.ResizableArray.foreach$(Lscala/collection/mutable/ResizableArray;Lscala/Function1;)V+2
   j  scala.collection.mutable.ArrayBuffer.foreach(Lscala/Function1;)V+2
   j  scala.collection.TraversableLike.flatMap(Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;+14
   j  scala.collection.TraversableLike.flatMap$(Lscala/collection/TraversableLike;Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;+3
   j  scala.collection.AbstractTraversable.flatMap(Lscala/Function1;Lscala/collection/generic/CanBuildFrom;)Ljava/lang/Object;+3
   j  org.apache.spark.internal.plugin.DriverPluginContainer.<init>(Lorg/apache/spark/SparkContext;Ljava/util/Map;Lscala/collection/Seq;)V+32
   j  org.apache.spark.internal.plugin.PluginContainer$.apply(Lscala/util/Either;Ljava/util/Map;)Lscala/Option;+104
   j  org.apache.spark.internal.plugin.PluginContainer$.apply(Lorg/apache/spark/SparkContext;Ljava/util/Map;)Lscala/Option;+10
   j  org.apache.spark.SparkContext.<init>(Lorg/apache/spark/SparkConf;)V+1775
   j  org.apache.spark.SparkContext$.getOrCreate(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;+23
   j  org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(Lorg/apache/spark/SparkConf;)Lorg/apache/spark/SparkContext;+30
   j  org.apache.spark.sql.SparkSession$Builder$$Lambda+0x00007f3fe82fd520.apply()Ljava/lang/Object;+4
   j  scala.Option.getOrElse(Lscala/Function0;)Ljava/lang/Object;+8
   j  org.apache.spark.sql.SparkSession$Builder.getOrCreate()Lorg/apache/spark/sql/SparkSession;+184
   j  org.apache.spark.examples.SparkPi$.main([Ljava/lang/String;)V+11
   j  org.apache.spark.examples.SparkPi.main([Ljava/lang/String;)V+4
   v  ~StubRoutines::call_stub 0x00007f4047f37cc4
   j  jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Ljava/lang/reflect/Method;Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+0 [email protected]
   j  jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+126 [email protected]
   j  jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+6 [email protected]
   j  java.lang.reflect.Method.invoke(Ljava/lang/Object;[Ljava/lang/Object;)Ljava/lang/Object;+102 [email protected]
   j  org.apache.spark.deploy.JavaMainApplication.start([Ljava/lang/String;Lorg/apache/spark/SparkConf;)V+97
   j  org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(Lorg/apache/spark/deploy/SparkSubmitArguments;Z)V+561
   j  org.apache.spark.deploy.SparkSubmit.doRunMain$1(Lorg/apache/spark/deploy/SparkSubmitArguments;Z)V+207
   j  org.apache.spark.deploy.SparkSubmit.submit(Lorg/apache/spark/deploy/SparkSubmitArguments;Z)V+65
   j  org.apache.spark.deploy.SparkSubmit.doSubmit([Ljava/lang/String;)V+78
   j  org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit([Ljava/lang/String;)V+2
   j  org.apache.spark.deploy.SparkSubmit$.main([Ljava/lang/String;)V+29
   j  org.apache.spark.deploy.SparkSubmit.main([Ljava/lang/String;)V+4
   v  ~StubRoutines::call_stub 0x00007f4047f37cc4
   ```
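
   The crashing native frame is inside protobuf's shutdown machinery (`google::protobuf::internal::OnShutdownRun`) during `System.load`. One common cause of crashes in protobuf internals at library-load time is two incompatible protobuf copies in the same process, e.g. the one statically linked into libgluten.so and another one reachable via `java.library.path` (here, Hadoop's native directory). A hedged diagnostic sketch, not a fix: the jar and Hadoop paths are taken from the configuration above, while the assumption that libgluten.so sits at the root of the bundle jar, and the availability of `unzip`, `ldd`, and `nm`, are environment-specific.

   ```shell
   # Diagnostic sketch for a suspected protobuf symbol clash (all paths are
   # assumptions taken from the report; adjust to your environment).
   JAR=/home/sonix/gluten-velox-bundle-spark3.5_2.12-debian_12_x86_64-1.2.0-SNAPSHOT.jar
   WORKDIR=$(mktemp -d)
   # Extract the native library from the bundle jar, if the jar is present.
   unzip -o -q "$JAR" 'libgluten.so' -d "$WORKDIR" 2>/dev/null || true
   if [ -f "$WORKDIR/libgluten.so" ]; then
     # Shared libraries libgluten.so would pull in at runtime.
     ldd "$WORKDIR/libgluten.so"
     # Does it reference the protobuf shutdown symbol from the crashing frame?
     nm -D --demangle "$WORKDIR/libgluten.so" | grep -i OnShutdownRun || true
   fi
   # Hadoop's native dir is on java.library.path; look for a second protobuf copy there.
   ls /home/sonix/hadoop-3.4.0/lib/native 2>/dev/null || true
   ```

   If a second protobuf shows up (for example via a Hadoop native dependency), rebuilding against matching versions or removing the conflicting directory from `java.library.path` is the usual direction to investigate.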
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

