gaoyangxiaozhu opened a new issue, #5116:
URL: https://github.com/apache/incubator-gluten/issues/5116
### Problem description
If Spark 3.5 is built with UT (unit tests) enabled, for example:
```
```
the build fails with two errors.
The first error is:
```
VeloxPartitionedTableTPCHSuite:
*** RUN ABORTED ***
java.lang.NoClassDefFoundError: org/antlr/v4/runtime/misc/ParseCancellationException
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser$lzycompute(BaseSessionStateBuilder.scala:138)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser(BaseSessionStateBuilder.scala:137)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:374)
at org.apache.spark.sql.test.TestSparkSession.sessionState$lzycompute(TestSQLContext.scala:42)
at org.apache.spark.sql.test.TestSparkSession.sessionState(TestSQLContext.scala:41)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:699)
at org.apache.spark.sql.SparkSession.read(SparkSession.scala:783)
at io.glutenproject.execution.VeloxPartitionedTableTPCHSuite.$anonfun$createTPCHNotNullTables$1(VeloxTPCHSuite.scala:330)
at scala.collection.immutable.List.map(List.scala:293)
at io.glutenproject.execution.VeloxPartitionedTableTPCHSuite.createTPCHNotNullTables(VeloxTPCHSuite.scala:326)
...
Cause: java.lang.ClassNotFoundException: org.antlr.v4.runtime.misc.ParseCancellationException
at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser$lzycompute(BaseSessionStateBuilder.scala:138)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.sqlParser(BaseSessionStateBuilder.scala:137)
at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:374)
at org.apache.spark.sql.test.TestSparkSession.sessionState$lzycompute(TestSQLContext.scala:42)
at org.apache.spark.sql.test.TestSparkSession.sessionState(TestSQLContext.scala:41)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:699)
...
```
The second error, which shows up once the first one is resolved, is:
```
VeloxPartitionedTableTPCHSuite:
SLF4J: Failed to load class "org.slf4j.impl.StaticMDCBinder".
SLF4J: Defaulting to no-operation MDCAdapter implementation.
SLF4J: See http://www.slf4j.org/codes.html#no_static_mdc_binder for further details.
E0325 21:58:09.050992 3516659 Exceptions.h:69] Line: /home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/ep/build-velox/build/velox_ep/velox/exec/Task.cpp:1813, Function:terminate, Expression: Cancelled, Source: RUNTIME, ErrorCode: INVALID_STATE
*** RUN ABORTED ***
org.apache.spark.SparkException: Job aborted due to stage failure: Failed to serialize task 2, not attempting to retry it. Exception during serialization: java.io.NotSerializableException: org.apache.hadoop.fs.Path
Serialization stack:
- object not serializable (class: org.apache.hadoop.fs.Path, value: file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2311)
- element of array (index: 0)
- array (class [Lorg.apache.hadoop.fs.Path;, size 25)
- field (class: scala.collection.mutable.WrappedArray$ofRef, name: array, type: class [Ljava.lang.Object;)
- object (class scala.collection.mutable.WrappedArray$ofRef,
WrappedArray(file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2311,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2332,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2334,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2325,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2333,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2341,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2324,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2314,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2315,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2352,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2353,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2355,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2345,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2323,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2354,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2335,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2351,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2342,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2312,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2321,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2331,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2344,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2322,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2343,
file:/home/gayangya/Work/gaoyangxiaozhu/spark35/incubator-gluten/backends-velox/spark-warehouse/io.glutenproject.execution.VeloxPartitionedTableTPCHSuite/part/p_brand=Brand%2313))
- writeObject data (class: org.apache.spark.rdd.ParallelCollectionPartition)
- object (class org.apache.spark.rdd.ParallelCollectionPartition, org.apache.spark.rdd.ParallelCollectionPartition@735)
- field (class: org.apache.spark.scheduler.ResultTask, name: partition, type: interface org.apache.spark.Partition)
- object (class org.apache.spark.scheduler.ResultTask, ResultTask(2, 0))
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2856)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2792)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2791)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2791)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1247)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1247)
at scala.Option.foreach(Option.scala:407)
at scala.Option.foreach(Option.scala:407)
...
```
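The `NotSerializableException` above is the classic pitfall of shipping a non-serializable object (here `org.apache.hadoop.fs.Path`, which older Hadoop releases do not mark `Serializable`) inside a Spark task. A minimal, self-contained sketch of the failure mode and the usual workaround, using a hypothetical `FakePath` stand-in instead of the real Hadoop class:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.UncheckedIOException;

public class SerializationDemo {
    // Hypothetical stand-in for org.apache.hadoop.fs.Path, which (in older
    // Hadoop releases) does not implement java.io.Serializable.
    static class FakePath {
        final String uri;
        FakePath(String uri) { this.uri = uri; }
    }

    // Mimics what Spark's task serializer does: Java-serialize the object graph.
    static boolean trySerialize(Object obj) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(obj);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        FakePath[] paths = { new FakePath("file:/warehouse/part/p_brand=Brand%2311") };
        // Serializing the raw array fails, just as the task serializer reports above.
        System.out.println(trySerialize(paths));           // false
        // Workaround: ship plain strings and rebuild the Path objects on the executor.
        String[] uris = { paths[0].uri };
        System.out.println(trySerialize(uris));            // true
    }
}
```

This suggests the suite (or Spark's test helpers) is parallelizing partition `Path` objects directly; carrying their string URIs instead would sidestep the failure.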
### System information
PRETTY_NAME="Ubuntu 22.04.3 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.3 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
### CMake log
_No response_
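Note on the first error: a `NoClassDefFoundError` for `org.antlr.v4.runtime.misc.ParseCancellationException` usually means the ANTLR runtime jar is missing from the test classpath. A hedged sketch of declaring it in the relevant module's `pom.xml`; the version shown is an assumption and must match the ANTLR runtime that Spark 3.5 itself depends on:

```xml
<!-- Hypothetical test-scope dependency; align the version with Spark's own antlr4-runtime -->
<dependency>
  <groupId>org.antlr</groupId>
  <artifactId>antlr4-runtime</artifactId>
  <version>4.9.3</version>
  <scope>test</scope>
</dependency>
```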