lucienoz opened a new issue, #8791: URL: https://github.com/apache/hudi/issues/8791
**_Tips before filing an issue_**

- Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)?
- Join the mailing list to engage in conversations and get faster support at [email protected].
- If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly.

**Describe the problem you faced**

Writing the quickstart sample data to a Hudi table from spark-shell fails: every write task dies with `java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3`, and the job is aborted with `HoodieUpsertException: Failed to upsert for commit time ...`.

**To Reproduce**

Steps to reproduce the behavior:

1. Start spark-shell:

   ```
   spark-shell \
     --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
     --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
     --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
   ```

2. Import the quickstart helpers:

   ```scala
   import org.apache.hudi.QuickstartUtils._
   import scala.collection.JavaConversions._
   import org.apache.spark.sql.SaveMode._
   import org.apache.hudi.DataSourceReadOptions._
   import org.apache.hudi.DataSourceWriteOptions._
   import org.apache.hudi.config.HoodieWriteConfig._
   ```

3. Set up the table name, base path, and data generator:

   ```scala
   val tableName = "hudi_trips_cow"
   val basePath = "hdfs:///ods/hudi_trips_cow"
   val dataGen = new DataGenerator
   ```

4. Generate sample records and write them to the table:

   ```scala
   val inserts = convertToStringList(dataGen.generateInserts(10))
   val df = spark.read.json(spark.sparkContext.parallelize(inserts, 2))
   df.write.format("hudi").
     options(getQuickstartWriteConfigs).
     option(PRECOMBINE_FIELD_OPT_KEY, "ts").
     option(RECORDKEY_FIELD_OPT_KEY, "uuid").
     option(PARTITIONPATH_FIELD_OPT_KEY, "partitionpath").
     option(TABLE_NAME, tableName).
     mode(Overwrite).
     save(basePath)
   ```

**Expected behavior**

The write succeeds and the table is created under `basePath`.

**Environment Description**

* Base OS : CentOS 7.9
* Hudi version : 0.13.0
* Spark version : 3.3.2
* Hive version : 3.1.2
* Hadoop version : 3.3.5
* Storage (HDFS/S3/GCS..) : HDFS
* Running on Docker? (yes/no) : no

**Additional context**

Add any other context about the problem here.
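A note on the failure mode, hedged: a `SerializedLambda` to `scala.Function3` ClassCastException during task deserialization on executors usually indicates a classpath mismatch rather than a Hudi bug in itself, e.g. a Hudi bundle built for a different Spark/Scala line, two conflicting Hudi jars visible to the cluster, or a jar present on the driver but not shipped to executors. A minimal sketch of relaunching the shell with a bundle matching Spark 3.3 / Scala 2.12, assuming the Maven coordinates `org.apache.hudi:hudi-spark3.3-bundle_2.12:0.13.0` (verify them against the Hudi downloads page for your versions):

```shell
# Sketch: relaunch spark-shell with the Spark-3.3-matched Hudi bundle.
# --packages distributes the jar to executors as well as the driver,
# which rules out a driver-only classpath as the cause.
spark-shell \
  --packages org.apache.hudi:hudi-spark3.3-bundle_2.12:0.13.0 \
  --conf 'spark.serializer=org.apache.spark.serializer.KryoSerializer' \
  --conf 'spark.sql.catalog.spark_catalog=org.apache.spark.sql.hudi.catalog.HoodieCatalog' \
  --conf 'spark.sql.extensions=org.apache.spark.sql.hudi.HoodieSparkSessionExtension'
```

If the bundle was instead placed manually (e.g. copied into the Spark `jars/` directory), the same mismatch can arise from a stale copy on one of the nodes.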
**Stacktrace**

```
23/05/24 10:54:22 WARN DFSPropertiesConfiguration: Cannot find HUDI_CONF_DIR, please set it as the dir of hudi-defaults.conf
23/05/24 10:54:22 WARN DFSPropertiesConfiguration: Properties file file:/etc/hudi/conf/hudi-defaults.conf not found. Ignoring to load props file
23/05/24 10:54:22 INFO HiveConf: Found configuration file file:/opt/modules/spark-3.3.2-bin-hadoop3/conf/hive-site.xml
23/05/24 10:54:22 WARN HiveConf: HiveConf of name hive.metastore.event.db.notification.api.auth does not exist
23/05/24 10:54:22 WARN HiveConf: HiveConf of name hive.server2.active.passive.ha.enable does not exist
23/05/24 10:54:22 INFO metastore: Trying to connect to metastore with URI thrift://idc-bigdata-03:9083
23/05/24 10:54:22 INFO metastore: Opened a connection to metastore, current connections: 1
23/05/24 10:54:22 INFO metastore: Connected to metastore.
23/05/24 10:54:23 WARN HoodieBackedTableMetadata: Metadata table was not found at path hdfs:/ods/hudi_trips_cow/.hoodie/metadata
23/05/24 10:54:23 WARN HoodieWriteConfig: Embedded timeline server is disabled, fallback to use direct marker type for spark
23/05/24 10:54:25 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 2) (idc-bigdata-01 executor 2): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
	at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2302)
	at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1432)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2308)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:85)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
	at org.apache.spark.scheduler.Task.run(Task.scala:136)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
23/05/24 10:54:25 ERROR TaskSetManager: Task 0 in stage 1.0 failed 4 times; aborting job
org.apache.hudi.exception.HoodieUpsertException: Failed to upsert for commit time 20230524105422232
	at org.apache.hudi.table.action.commit.BaseWriteHelper.write(BaseWriteHelper.java:75)
	at org.apache.hudi.table.action.commit.SparkUpsertCommitActionExecutor.execute(SparkUpsertCommitActionExecutor.java:44)
	at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.upsert(HoodieSparkCopyOnWriteTable.java:107)
	at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.upsert(HoodieSparkCopyOnWriteTable.java:96)
	at org.apache.hudi.client.SparkRDDWriteClient.upsert(SparkRDDWriteClient.java:140)
	at org.apache.hudi.DataSourceUtils.doWriteOperation(DataSourceUtils.java:206)
	at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:363)
	at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:150)
	at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:47)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
	at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:98)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:109)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:95)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:98)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:94)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:584)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:584)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:560)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:94)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:81)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:79)
	at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:116)
	at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:860)
	at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:390)
	at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:363)
	at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
	... 66 elided
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most recent failure: Lost task 0.3 in stage 1.0 (TID 8) (idc-bigdata-02 executor 1): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
	at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2302)
	at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1432)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2308)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:85)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
	at org.apache.spark.scheduler.Task.run(Task.scala:136)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2672)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2608)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2607)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2607)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1182)
	at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1182)
	at scala.Option.foreach(Option.scala:407)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1182)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2860)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2802)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2791)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:952)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2238)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2259)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2278)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2303)
	at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:1020)
	at org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:362)
	at org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:361)
	at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
	at org.apache.hudi.data.HoodieJavaRDD.collectAsList(HoodieJavaRDD.java:163)
	at org.apache.hudi.index.simple.HoodieSimpleIndex.fetchRecordLocationsForAffectedPartitions(HoodieSimpleIndex.java:142)
	at org.apache.hudi.index.simple.HoodieSimpleIndex.tagLocationInternal(HoodieSimpleIndex.java:113)
	at org.apache.hudi.index.simple.HoodieSimpleIndex.tagLocation(HoodieSimpleIndex.java:91)
	at org.apache.hudi.table.action.commit.HoodieWriteHelper.tag(HoodieWriteHelper.java:54)
	at org.apache.hudi.table.action.commit.HoodieWriteHelper.tag(HoodieWriteHelper.java:36)
	at org.apache.hudi.table.action.commit.BaseWriteHelper.write(BaseWriteHelper.java:64)
	... 102 more
Caused by: java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
	at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2302)
	at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1432)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2434)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2308)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
	at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1185)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2319)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2428)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2352)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2210)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1690)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:508)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:466)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:129)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:85)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:52)
	at org.apache.spark.scheduler.Task.run(Task.scala:136)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
```
--
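One quick check worth running before digging further (a suggestion, not a confirmed diagnosis): look for stale or duplicate Hudi bundles baked into the Spark install on each node, since the log above shows Spark running from `/opt/modules/spark-3.3.2-bin-hadoop3`. The path is taken from the log; adjust it if your layout differs.

```shell
# Sketch: list any Hudi jars shipped with the Spark distribution itself.
# Run on every driver/executor host; an old or mismatched bundle here
# would shadow whatever is passed via --jars or --packages.
ls /opt/modules/spark-3.3.2-bin-hadoop3/jars/ | grep -i hudi
```

If different hosts report different jars (or different versions), aligning them is a reasonable first step before filing this as a Hudi bug.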
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: [email protected] For queries about this service, please contact Infrastructure at: [email protected]
