szehon-ho opened a new issue #4383:
URL: https://github.com/apache/iceberg/issues/4383


On Spark 3, we hit this error in CI:
   
```
org.apache.iceberg.spark.extensions.TestCopyOnWriteMerge > testMergeWithSnapshotIsolation[catalogName = testhive, implementation = org.apache.iceberg.spark.SparkCatalog, config = {type=hive, default-namespace=default}, format = orc, vectorized = true, distributionMode = none] FAILED
    java.util.concurrent.ExecutionException: java.util.ConcurrentModificationException
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at org.apache.iceberg.spark.extensions.TestMerge.testMergeWithSnapshotIsolation(TestMerge.java:720)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
        at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
        at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
        at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
        at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
        at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
        at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
        at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
        at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
        at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
        at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
        at org.junit.runners.Suite.runChild(Suite.java:128)
        at org.junit.runners.Suite.runChild(Suite.java:27)
        at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
        at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
        at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
        at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
        at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
        at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
        at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
        at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
        at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
        at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.runTestClass(JUnitTestClassExecutor.java:110)
        at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:58)
        at org.gradle.api.internal.tasks.testing.junit.JUnitTestClassExecutor.execute(JUnitTestClassExecutor.java:38)
        at org.gradle.api.internal.tasks.testing.junit.AbstractJUnitTestClassProcessor.processTestClass(AbstractJUnitTestClassProcessor.java:62)
        at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.processTestClass(SuiteTestClassProcessor.java:51)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
        at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
        at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
        at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
        at com.sun.proxy.$Proxy2.processTestClass(Unknown Source)
        at org.gradle.api.internal.tasks.testing.worker.TestWorker$2.run(TestWorker.java:176)
        at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
        at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
        at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
        at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
        at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:133)
        at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:71)
        at worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
        at worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)

        Caused by:
        java.util.ConcurrentModificationException
            at java.util.Hashtable$Enumerator.next(Hashtable.java:1408)
            at org.apache.hadoop.conf.Configuration.iterator(Configuration.java:2453)
            at java.lang.Iterable.forEach(Iterable.java:74)
            at org.apache.iceberg.SerializableTable$SerializableConfSupplier.<init>(SerializableTable.java:373)
            at org.apache.iceberg.hadoop.HadoopFileIO.serializeConfWith(HadoopFileIO.java:90)
            at org.apache.iceberg.SerializableTable.fileIO(SerializableTable.java:111)
            at org.apache.iceberg.SerializableTable.<init>(SerializableTable.java:81)
            at org.apache.iceberg.SerializableTable.copyOf(SerializableTable.java:96)
            at org.apache.iceberg.spark.source.SparkBatchScan.planInputPartitions(SparkBatchScan.java:138)
            at org.apache.spark.sql.execution.datasources.v2.ExtendedBatchScanExec.partitions(ExtendedBatchScanExec.scala:49)
            at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar(DataSourceV2ScanExecBase.scala:61)
            at org.apache.spark.sql.execution.datasources.v2.DataSourceV2ScanExecBase.supportsColumnar$(DataSourceV2ScanExecBase.scala:60)
            at org.apache.spark.sql.execution.datasources.v2.ExtendedBatchScanExec.supportsColumnar(ExtendedBatchScanExec.scala:35)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.insertTransitions(Columnar.scala:502)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$insertTransitions$1(Columnar.scala:507)
            at scala.collection.immutable.List.map(List.scala:293)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.insertTransitions(Columnar.scala:507)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$insertTransitions$1(Columnar.scala:507)
            at scala.collection.immutable.List.map(List.scala:293)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.insertTransitions(Columnar.scala:507)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$insertTransitions$1(Columnar.scala:507)
            at scala.collection.immutable.List.map(List.scala:297)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.insertTransitions(Columnar.scala:507)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$insertTransitions$1(Columnar.scala:507)
            at scala.collection.immutable.List.map(List.scala:293)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.insertTransitions(Columnar.scala:507)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.$anonfun$insertTransitions$1(Columnar.scala:507)
            at scala.collection.immutable.List.map(List.scala:293)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.insertTransitions(Columnar.scala:507)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:515)
            at org.apache.spark.sql.execution.ApplyColumnarRulesAndInsertTransitions.apply(Columnar.scala:482)
            at org.apache.spark.sql.execution.QueryExecution$.$anonfun$prepareForExecution$1(QueryExecution.scala:324)
            at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
            at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
            at scala.collection.immutable.List.foldLeft(List.scala:91)
            at org.apache.spark.sql.execution.QueryExecution$.prepareForExecution(QueryExecution.scala:324)
            at org.apache.spark.sql.execution.QueryExecution.$anonfun$executedPlan$1(QueryExecution.scala:112)
            at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
            at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:138)
            at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
            at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:138)
            at org.apache.spark.sql.execution.QueryExecution.executedPlan$lzycompute(QueryExecution.scala:112)
            at org.apache.spark.sql.execution.QueryExecution.executedPlan(QueryExecution.scala:105)
            at org.apache.spark.sql.execution.QueryExecution.$anonfun$writePlans$5(QueryExecution.scala:204)
            at org.apache.spark.sql.catalyst.plans.QueryPlan$.append(QueryPlan.scala:478)
            at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$writePlans(QueryExecution.scala:204)
            at org.apache.spark.sql.execution.QueryExecution.toString(QueryExecution.scala:212)
            at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:95)
            at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
            at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
            at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
            at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
            at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3616)
            at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
            at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
            at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
            at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
            at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:610)
            at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
            at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:605)
            at org.apache.iceberg.spark.SparkTestBase.sql(SparkTestBase.java:101)
            at org.apache.iceberg.spark.extensions.TestMerge.lambda$testMergeWithSnapshotIsolation$4(TestMerge.java:700)
            at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
            at java.util.concurrent.FutureTask.run(FutureTask.java:266)
            at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
            at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
            at java.lang.Thread.run(Thread.java:750)
```
   
Ref: https://github.com/apache/iceberg/runs/5669494241
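
Judging from the nested cause, `HadoopFileIO.serializeConfWith` was copying the Hadoop `Configuration` inside `SerializableTable$SerializableConfSupplier` while another thread was mutating that same `Configuration` (the `FutureTask` frames show the test issuing SQL from a background executor thread). Hadoop's `Configuration` is backed by a `java.util.Properties` (a `Hashtable`), whose iterator fails fast on concurrent structural changes. Below is a minimal, standalone sketch of that kind of race for context; it is not the Iceberg code itself, and the class name `ConfIterationRace` and the key names are made up for illustration:

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.conf.Configuration;

public class ConfIterationRace {
  public static void main(String[] args) throws Exception {
    // Shared Hadoop Configuration, analogous to the SparkSession's Hadoop conf.
    Configuration conf = new Configuration(false);

    // Writer thread keeps adding new keys, structurally modifying the underlying
    // java.util.Properties (a Hashtable).
    Thread writer = new Thread(() -> {
      for (int i = 0; i < 1_000_000; i++) {
        conf.set("race.key." + i, "v");
      }
    });
    writer.start();

    // Reader copies the entries, roughly what SerializableConfSupplier's
    // constructor does. Configuration.iterator() walks the backing Hashtable,
    // so this can throw java.util.ConcurrentModificationException while the
    // writer is still adding keys.
    Map<String, String> copy = new HashMap<>();
    conf.forEach(entry -> copy.put(entry.getKey(), entry.getValue()));

    writer.join();
    System.out.println("copied " + copy.size() + " entries without hitting the race");
  }
}
```

Being a race, a sketch like this may or may not throw on any given run, which lines up with this showing up as an intermittent CI failure rather than a deterministic one.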

