zelda89 opened a new issue, #773:
URL: https://github.com/apache/datafusion-comet/issues/773

   ### Describe the bug
   
   After building from source, when I run the example from the installation page to verify that the library initializes, I get the error below. I've sketched what I ran under "Steps to reproduce". Environment: Spark 3.4, Scala 2.12, Java 11.
   ```
   WARN CometSparkSessionExtensions: Comet extension is disabled because of error when loading native lib. Falling back to Spark
   java.lang.ExceptionInInitializerError
           at org.apache.comet.CometSparkSessionExtensions$.isCometEnabled(CometSparkSessionExtensions.scala:1013)
           at org.apache.comet.CometSparkSessionExtensions$CometScanRule.apply(CometSparkSessionExtensions.scala:86)
           at org.apache.comet.CometSparkSessionExtensions$CometScanRule.apply(CometSparkSessionExtensions.scala:84)
           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$1(Columnar.scala:564)
           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$apply$1$adapted(Columnar.scala:564)
           at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
           at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
           at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:564)
           at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:516)
           at org.apache.spark.sql.execution.QueryExecution$.$anonfun$prepareForExecution$1(QueryExecution.scala:457)
           at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
           at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
           at scala.collection.immutable.List.foldLeft(List.scala:91)
           at org.apache.spark.sql.execution.QueryExecution$.prepareForExecution(QueryExecution.scala:456)
           at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:175)
           at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
           at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:202)
           at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:526)
           at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:202)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
           at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:201)
           at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:175)
           at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:168)
           at org.apache.spark.sql.execution.QueryExecution.simpleString(QueryExecution.scala:221)
           at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$explainString(QueryExecution.scala:266)
           at org.apache.spark.sql.execution.QueryExecution.explainString(QueryExecution.scala:235)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:112)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:195)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:103)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:65)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
           at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:94)
           at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:512)
           at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:104)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:512)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:31)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
           at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
           at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:31)
           at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:488)
           at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:94)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:81)
           at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:79)
           at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:133)
           at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:856)
           at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:387)
           at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:360)
           at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
           at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:789)
           at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:23)
           at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:27)
           at $line14.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(<console>:29)
           at $line14.$read$$iw$$iw$$iw$$iw$$iw.<init>(<console>:31)
           at $line14.$read$$iw$$iw$$iw$$iw.<init>(<console>:33)
           at $line14.$read$$iw$$iw$$iw.<init>(<console>:35)
           at $line14.$read$$iw$$iw.<init>(<console>:37)
           at $line14.$read$$iw.<init>(<console>:39)
           at $line14.$read.<init>(<console>:41)
           at $line14.$read$.<init>(<console>:45)
           at $line14.$read$.<clinit>(<console>)
           at $line14.$eval$.$print$lzycompute(<console>:7)
           at $line14.$eval$.$print(<console>:6)
           at $line14.$eval.$print(<console>)
           at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.base/java.lang.reflect.Method.invoke(Method.java:566)
           at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:747)
           at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1020)
           at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:568)
           at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:36)
           at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:116)
           at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
           at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:567)
           at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:594)
           at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:564)
           at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:865)
           at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:733)
           at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:435)
           at scala.tools.nsc.interpreter.ILoop.loop(ILoop.scala:456)
           at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:239)
           at org.apache.spark.repl.Main$.doMain(Main.scala:78)
           at org.apache.spark.repl.Main$.main(Main.scala:58)
           at org.apache.spark.repl.Main.main(Main.scala)
           at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.base/java.lang.reflect.Method.invoke(Method.java:566)
           at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
           at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:1020)
           at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:192)
           at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:215)
           at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
           at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1111)
           at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1120)
           at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
   Caused by: java.lang.UnsupportedOperationException: Unsupported OS/arch, cannot find /org/apache/comet/linux/amd64/libcomet.so. Please try building from source.
           at org.apache.comet.NativeBase.bundleLoadLibrary(NativeBase.java:105)
           at org.apache.comet.NativeBase.load(NativeBase.java:88)
           at org.apache.comet.NativeBase.<clinit>(NativeBase.java:53)
           ... 99 more
   ```
   
   ### Steps to reproduce
   
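   Since I no longer have the exact session, here is roughly what I ran, following the installation page (a reconstruction, not verbatim: the jar name is a placeholder for whatever the build produced, and the `--conf` set is the one I remember from the docs):

   ```
   $SPARK_HOME/bin/spark-shell \
     --jars spark/target/comet-spark-spark3.4_2.12-<version>.jar \
     --conf spark.sql.extensions=org.apache.comet.CometSparkSessionExtensions \
     --conf spark.comet.enabled=true \
     --conf spark.comet.exec.enabled=true

   scala> (0 until 10).toDF("a").write.mode("overwrite").parquet("/tmp/test")
   ```

   The `parquet` write is what applies the columnar rules (`CometScanRule` in the trace above), which call `isCometEnabled` and so attempt the native library load.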
   
   ### Expected behavior
   
   _No response_
   
   ### Additional context
   
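   Judging from the `Caused by` message, the loader probes a bundled resource path built from the OS and architecture the JVM reports; that is my reading of `/org/apache/comet/linux/amd64/libcomet.so`, not something I verified in `NativeBase`. A quick way to check what those values are on the affected machine, using only standard JVM system properties:

   ```
   // Plain Java system properties, nothing Comet-specific; printed only to see
   // which <os>/<arch> segment the loader would presumably probe.
   println(System.getProperty("os.name"))  // e.g. "Linux", "Mac OS X"
   println(System.getProperty("os.arch"))  // e.g. "amd64", "aarch64"
   ```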

