gaurav7261 opened a new issue, #9415:
URL: https://github.com/apache/incubator-gluten/issues/9415

   ### Backend
   
   VL (Velox)
   
   ### Bug description
   
   ```
   25/04/24 09:51:01 INFO GlutenDriverPlugin: Gluten SQL Tab has attached.
   25/04/24 09:51:02 INFO DatabricksILoop$: Successfully registered spark metrics in Prometheus registry
   25/04/24 09:51:02 INFO DatabricksILoop$: Successfully initialized SparkContext
   25/04/24 09:51:02 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20250424095101-0000/0 is now RUNNING
   25/04/24 09:51:03 ERROR DriverDaemon$: XXX Fatal uncaught exception. Terminating driver.
   java.lang.VerifyError: class org.apache.spark.sql.catalyst.expressions.PreComputeRangeFrameBound overrides final method genCode.(Lorg/apache/spark/sql/catalyst/expressions/codegen/CodegenContext;)Lorg/apache/spark/sql/catalyst/expressions/codegen/ExprCode;
        at java.lang.ClassLoader.defineClass1(Native Method)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
        at org.apache.gluten.backendsapi.velox.VeloxRuleApi$.org$apache$gluten$backendsapi$velox$VeloxRuleApi$$injectLegacy(VeloxRuleApi.scala:83)
        at org.apache.gluten.backendsapi.velox.VeloxRuleApi.injectRules(VeloxRuleApi.scala:51)
        at org.apache.gluten.backendsapi.SubstraitBackend.injectRules(SubstraitBackend.scala:39)
        at org.apache.gluten.backendsapi.SubstraitBackend.injectRules$(SubstraitBackend.scala:38)
        at org.apache.gluten.backendsapi.velox.VeloxBackend.injectRules(VeloxBackend.scala:58)
        at org.apache.gluten.extension.GlutenSessionExtensions.$anonfun$apply$6(GlutenSessionExtensions.scala:51)
        at org.apache.gluten.extension.GlutenSessionExtensions.$anonfun$apply$6$adapted(GlutenSessionExtensions.scala:51)
        at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
        at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
        at org.apache.gluten.extension.GlutenSessionExtensions.apply(GlutenSessionExtensions.scala:51)
        at org.apache.gluten.extension.GlutenSessionExtensions.apply(GlutenSessionExtensions.scala:26)
        at org.apache.spark.sql.SparkSession$.$anonfun$applyExtensions$2(SparkSession.scala:1695)
        at org.apache.spark.sql.SparkSession$.$anonfun$applyExtensions$2$adapted(SparkSession.scala:1690)
        at scala.collection.Iterator.foreach(Iterator.scala:943)
        at scala.collection.Iterator.foreach$(Iterator.scala:943)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
        at scala.collection.IterableLike.foreach(IterableLike.scala:74)
        at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
        at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$applyExtensions(SparkSession.scala:1690)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1420)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:38)
        at com.databricks.backend.daemon.driver.DatabricksILoop$.initializeSharedDriverContext(DatabricksILoop.scala:517)
        at com.databricks.backend.daemon.driver.DatabricksILoop$.getOrCreateSharedDriverContext(DatabricksILoop.scala:310)
        at com.databricks.backend.daemon.driver.DriverCorral.driverContext(DriverCorral.scala:381)
        at com.databricks.backend.daemon.driver.DriverCorral.<init>(DriverCorral.scala:221)
        at com.databricks.backend.daemon.driver.DriverDaemon.<init>(DriverDaemon.scala:75)
        at com.databricks.backend.daemon.driver.DriverDaemon$.create(DriverDaemon.scala:631)
        at com.databricks.backend.daemon.driver.DriverDaemon$.initialize(DriverDaemon.scala:779)
        at com.databricks.backend.daemon.driver.DriverDaemon$.wrappedMain(DriverDaemon.scala:744)
        at com.databricks.DatabricksMain.$anonfun$main$4(DatabricksMain.scala:230)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at com.databricks.DatabricksMain.$anonfun$withStartupProfilingData$1(DatabricksMain.scala:655)
        at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:528)
        at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:633)
        at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:656)
        at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48)
        at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:276)
        at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
        at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:272)
        at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46)
        at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43)
        at com.databricks.DatabricksMain.withAttributionContext(DatabricksMain.scala:110)
        at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95)
        at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76)
        at com.databricks.DatabricksMain.withAttributionTags(DatabricksMain.scala:110)
        at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:628)
        at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:537)
        at com.databricks.DatabricksMain.recordOperationWithResultTags(DatabricksMain.scala:110)
        at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:529)
        at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:495)
        at com.databricks.DatabricksMain.recordOperation(DatabricksMain.scala:110)
        at com.databricks.DatabricksMain.withStartupProfilingData(DatabricksMain.scala:654)
        at com.databricks.DatabricksMain.$anonfun$main$3(DatabricksMain.scala:229)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at com.databricks.context.integrity.IntegrityCheckContext$ThreadLocalStorage$.withValue(IntegrityCheckContext.scala:73)
        at com.databricks.DatabricksMain.main(DatabricksMain.scala:229)
        at com.databricks.backend.daemon.driver.DriverDaemon.main(DriverDaemon.scala)
   25/04/24 09:51:03 INFO DrainingState: Started draining: min wait 10000, grace period 5000, max wait 15000.
   25/04/24 09:51:08 INFO DrainingState: Grace period finished
   25/04/24 09:51:08 INFO DrainingState: Drain complete, exiting now.
   ```
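   The `VerifyError` suggests that the Spark build shipped in the Databricks runtime declares `genCode` as `final`, whereas the open-source Spark 3.5 build that this Gluten bundle was compiled against does not, so Gluten's `PreComputeRangeFrameBound` fails bytecode verification at class-load time. As a hedged diagnostic (the `check_final` helper and the `/databricks/jars` classpath below are illustrative, not from this report), `javap` can show how the method is declared in whatever Spark build is actually on the driver's classpath:

   ```shell
   # Print the declaration of a method as javap sees it on a given classpath.
   # A declaration containing "final" would explain the VerifyError above.
   check_final() {
       # $1 = fully qualified class, $2 = method name; remaining args are
       # passed through to javap (e.g. -cp '/databricks/jars/*').
       local cls="$1" method="$2"
       shift 2
       javap -p "$@" "$cls" | grep -w "$method"
   }

   # On the Databricks driver, something like:
   # check_final org.apache.spark.sql.catalyst.expressions.Expression genCode -cp '/databricks/jars/*'
   ```

   Comparing that output against the same command run on the OSS Spark 3.5 jars would confirm whether the Databricks fork is the source of the mismatch.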
   
   ### Gluten version
   
   Gluten-1.3
   
   ### Spark version
   
   Spark-3.5.x
   
   ### Spark configurations
   
   DBR Runtime: 15.4 LTS (Scala 2.12, Spark 3.5.0)
   Used JAR: gluten-velox-bundle-spark3.5_2.12-centos_7_x86_64-1.3.0.jar, installed via an init script (`gluten_script.sh`) in Databricks:
   ```bash
   #!/bin/bash

   STAGE_DIR="/dbfs/FileStore/databricks/plugins"
   echo "BEGIN: Upload Spark Plugins"

   # Copy to Databricks jars directory directly (works across all cloud providers)
   cp -f $STAGE_DIR/*.jar /databricks/jars/ || { echo "Error copying Spark Plugin library file"; exit 1;}
   ```
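   For what it's worth, if the copy fails silently (for example, an empty staging directory leaves the glob unexpanded), the plugin simply never loads and the failure mode is harder to spot. A minimal defensive sketch of the same copy step, assuming the same staging and destination paths (the `stage_plugins` name and messages are illustrative):

   ```shell
   #!/bin/bash
   # Sketch of a hardened variant of the init script above; the function name
   # and error messages are illustrative, not part of the original report.

   stage_plugins() {
       local stage_dir="$1" dest_dir="$2"
       local jars=("$stage_dir"/*.jar)
       # An unmatched glob stays literal, so check the first entry exists.
       if [ ! -e "${jars[0]}" ]; then
           echo "Error: no jars found in $stage_dir" >&2
           return 1
       fi
       cp -f "${jars[@]}" "$dest_dir"/ || {
           echo "Error copying Spark Plugin library file" >&2
           return 1
       }
       echo "Copied ${#jars[@]} jar(s) to $dest_dir"
   }

   # On the cluster this would be invoked as:
   # stage_plugins /dbfs/FileStore/databricks/plugins /databricks/jars
   ```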
   
   ### System information
   
   25/04/24 09:50:53 INFO GlutenDriverPlugin: Gluten build info:
   ==============================================================
   Components: Velox
   Component Velox Branch: HEAD
   Component Velox Revision: 6af292c63804fb79e8b0da6d69ca265833148e8a
   Component Velox Revision Time: 2025-01-07 08:54:33 +0800
   Gluten Version: 1.3.0
   GCC Version: GCC: (GNU) 11.2.1 20220127 (Red Hat 11.2.1-9)
   Java Version: 1.8
   Scala Version: 2.12.15
   Spark Version: 3.5.2
   Hadoop Version: 3.3.4
   Gluten Branch: preparing_1.3.0
   Gluten Revision: 0e85219691c3e5737846d079d3b5f3852e104123
   Gluten Revision Time: 2025-01-16 14:00:15 +0000
   Gluten Build Time: 2025-01-16T14:35:50Z
   Gluten Repo URL: https://REDACTED_CREDENTIALS(60a35f9e)@github.com/weiting-chen/gluten.git
   ==============================================================
   
   ### Relevant logs
   
   ```bash
   
   ```


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

