jatin510 commented on issue #1158:
URL: https://github.com/apache/datafusion-comet/issues/1158#issuecomment-2532384775

   Thanks @andygrove for your help.
   
   Running this command: `make release PROFILES="-Pspark-3.5 -Pscala-2.13"`
   gave this output:
   ```
   [INFO] ------------------------------------------------------------------------
   [INFO] Reactor Summary for Comet Project Parent POM 0.5.0-SNAPSHOT:
   [INFO]
   [INFO] Comet Project Parent POM ........................... SUCCESS [  4.826 s]
   [INFO] comet-common ....................................... SUCCESS [ 11.075 s]
   [INFO] comet-spark ........................................ SUCCESS [ 13.057 s]
   [INFO] comet-spark-integration ............................ SUCCESS [  1.471 s]
   [INFO] comet-fuzz ......................................... SUCCESS [  2.937 s]
   [INFO] ------------------------------------------------------------------------
   [INFO] BUILD SUCCESS
   [INFO] ------------------------------------------------------------------------
   [INFO] Total time:  33.446 s
   [INFO] Finished at: 2024-12-10T23:01:48+05:30
   ```
   
   
   and I am using this configuration:
   
   ```
   export SPARK_HOME=/Users/jatin/Documents/open-source/spark-3.5.3-bin-hadoop3-scala2.13
   export COMET_HOME=/Users/jatin/Documents/open-source/datafusion-comet
   export COMET_JAR=$COMET_HOME/spark/target/comet-spark-spark3.5_2.13-0.5.0-SNAPSHOT.jar
   export JAVA_HOME=$(/usr/libexec/java_home)

   spark() {
       MODE=$1         # Accept 'shell' or 'sql' as the first argument
       ENABLE_COMET=$2 # Optional second argument for enabling the Comet config

       if [[ "$MODE" != "shell" && "$MODE" != "sql" ]]; then
           echo "Usage: spark <shell|sql> [comet]"
           return 1
       fi

       # Base command
       CMD="$SPARK_HOME/bin/spark-$MODE"

       if [[ "$ENABLE_COMET" == "comet" ]]; then
           CMD+=" \
               --jars $COMET_JAR \
               --conf spark.driver.extraClassPath=$COMET_JAR \
               --conf spark.executor.extraClassPath=$COMET_JAR \
               --conf spark.plugins=org.apache.spark.CometPlugin \
               --conf spark.shuffle.manager=org.apache.spark.sql.comet.execution.shuffle.CometShuffleManager \
               --conf spark.comet.explainFallback.enabled=true \
               --conf spark.memory.offHeap.enabled=true \
               --conf spark.memory.offHeap.size=16g"
       fi
       # Execute the command
       eval $CMD
   #   echo $CMD
   }
   
   ```
   
   When I run the Comet version:

   `spark sql comet`

   and then run:

   `select 1 + 1;`

   it gives this error:
   
   ```
   24/12/10 23:06:06 WARN NativeBase: Failed to load comet library
   java.lang.UnsatisfiedLinkError: /private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib: dlopen(/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib, 0x0001): tried: '/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib' (no such file), '/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64'))
        at java.base/jdk.internal.loader.NativeLibraries.load(Native Method)
        at java.base/jdk.internal.loader.NativeLibraries$NativeLibraryImpl.open(NativeLibraries.java:388)
        at java.base/jdk.internal.loader.NativeLibraries.loadLibrary(NativeLibraries.java:232)
        at java.base/jdk.internal.loader.NativeLibraries.loadLibrary(NativeLibraries.java:174)
        at java.base/java.lang.ClassLoader.loadLibrary(ClassLoader.java:2394)
        at java.base/java.lang.Runtime.load0(Runtime.java:755)
        at java.base/java.lang.System.load(System.java:1957)
        at org.apache.comet.NativeBase.bundleLoadLibrary(NativeBase.java:129)
        at org.apache.comet.NativeBase.load(NativeBase.java:97)
        at org.apache.comet.NativeBase.<clinit>(NativeBase.java:54)
        at org.apache.comet.CometSparkSessionExtensions$.isCometEnabled(CometSparkSessionExtensions.scala:1141)
        at org.apache.comet.CometSparkSessionExtensions$CometScanRule.apply(CometSparkSessionExtensions.scala:94)
        at org.apache.comet.CometSparkSessionExtensions$CometScanRule.apply(CometSparkSessionExtensions.scala:92)
        at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$1(Columnar.scala:530)
        at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$1$adapted(Columnar.scala:530)
        at scala.collection.immutable.List.foreach(List.scala:333)
        at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:530)
        at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:482)
        at org.apache.spark.sql.execution.QueryExecution$.$anonfun$prepareForExecution$1(QueryExecution.scala:477)
        at scala.collection.LinearSeqOps.foldLeft(LinearSeq.scala:169)
        at scala.collection.LinearSeqOps.foldLeft$(LinearSeq.scala:165)
        at scala.collection.immutable.List.foldLeft(List.scala:79)
        at org.apache.spark.sql.execution.QueryExecution$.prepareForExecution(QueryExecution.scala:476)
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:186)
        at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:138)
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:219)
        at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:546)
        at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:219)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
        at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:218)
        at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:186)
        at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:179)
        at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:238)
        at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:284)
        at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:252)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:117)
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:201)
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:108)
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:900)
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:66)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:76)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:501)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1(SparkSQLCLIDriver.scala:619)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1$adapted(SparkSQLCLIDriver.scala:613)
        at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:563)
        at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:561)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:926)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processLine(SparkSQLCLIDriver.scala:613)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:310)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:568)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1029)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Failed to load comet library: /private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib: dlopen(/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib, 0x0001): tried: '/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib' (no such file), '/private/var/folders/w_/tskg2p0s0mq50q5whsb53nn80000gn/T/libcomet-809447701100035122.dylib' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64e' or 'arm64'))
   24/12/10 23:06:06 WARN CometSparkSessionExtensions: Comet extension is disabled because of error when loading native lib. Falling back to Spark
   [The same java.lang.UnsatisfiedLinkError and stack trace as above are then repeated, three more times in total, each preceded by the same "WARN CometSparkSessionExtensions" message.]
   ```
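   The dlopen message above says the bundled `libcomet` dylib was built for x86_64, while the JVM on this machine needs arm64. A quick way to confirm the mismatch on macOS is sketched below; this is a hedged example, and the jar-internal path of the dylib is an assumption that may differ by Comet version:

   ```shell
   # Sketch: compare the host architecture with the architecture of the
   # native library bundled in the Comet jar.
   echo "Host arch: $(uname -m)"   # arm64 on Apple Silicon, x86_64 on Intel

   # List any dylibs bundled in the jar, then inspect one with `file`
   # (the extraction path below is hypothetical):
   # unzip -l "$COMET_JAR" | grep dylib
   # unzip -p "$COMET_JAR" <path/to/libcomet.dylib> > /tmp/libcomet.dylib
   # file /tmp/libcomet.dylib        # should report arm64, not x86_64
   ```

   If `file` reports x86_64 on an Apple Silicon host, the jar was built under an x86_64 toolchain (for example, an Intel JDK or Rust toolchain running under Rosetta) and would need to be rebuilt natively for arm64.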
   
   

