Hans-Raintree commented on issue #9962:
URL: https://github.com/apache/hudi/issues/9962#issuecomment-1822738325

   Hey @CTTY 
   
   I've started using EMR 6.15.0 with the official jars, but I'm still getting these errors:
   
   ```
    23/11/22 11:58:24 INFO AmazonDynamoDBLockClient: Heartbeat thread recieved interrupt, exiting run() (possibly exiting thread)
    java.lang.InterruptedException: sleep interrupted
        at java.lang.Thread.sleep(Native Method) ~[?:1.8.0_392]
        at com.amazonaws.services.dynamodbv2.AmazonDynamoDBLockClient.run(AmazonDynamoDBLockClient.java:1248) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_392]
   ```
   
   ```
    23/11/22 11:58:30 ERROR NettyNioAsyncHttpClient: Unable to close channel pools
    java.util.concurrent.RejectedExecutionException: event executor terminated
        at io.netty.util.concurrent.SingleThreadEventExecutor.reject(SingleThreadEventExecutor.java:934) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor.offerTask(SingleThreadEventExecutor.java:351) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor.addTask(SingleThreadEventExecutor.java:344) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:836) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor.execute0(SingleThreadEventExecutor.java:827) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor.execute(SingleThreadEventExecutor.java:817) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:112) ~[?:1.8.0_392]
        at io.netty.util.concurrent.AbstractEventExecutor.submit(AbstractEventExecutor.java:118) ~[netty-common-4.1.87.Final.jar:4.1.87.Final]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.utils.NettyUtils.doInEventLoop(NettyUtils.java:254) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.http2.HttpOrHttp2ChannelPool.close(HttpOrHttp2ChannelPool.java:215) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.ListenerInvokingChannelPool.close(ListenerInvokingChannelPool.java:132) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.ReleaseOnceChannelPool.close(ReleaseOnceChannelPool.java:95) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.HealthCheckedChannelPool.close(HealthCheckedChannelPool.java:150) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.CancellableAcquireChannelPool.close(CancellableAcquireChannelPool.java:76) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.SimpleChannelPoolAwareChannelPool.close(SimpleChannelPoolAwareChannelPool.java:57) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.SdkChannelPoolMap.remove(SdkChannelPoolMap.java:56) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at java.util.concurrent.ConcurrentHashMap$KeySetView.forEach(ConcurrentHashMap.java:4647) ~[?:1.8.0_392]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.SdkChannelPoolMap.close(SdkChannelPoolMap.java:93) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.AwaitCloseChannelPoolMap.close(AwaitCloseChannelPoolMap.java:166) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.utils.NettyUtils.runAndLogError(NettyUtils.java:386) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.http.nio.netty.NettyNioAsyncHttpClient.close(NettyNioAsyncHttpClient.java:198) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.utils.IoUtils.closeQuietly(IoUtils.java:70) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.utils.IoUtils.closeIfCloseable(IoUtils.java:87) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.utils.AttributeMap.lambda$close$0(AttributeMap.java:87) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at java.util.HashMap$Values.forEach(HashMap.java:982) ~[?:1.8.0_392]
        at org.apache.hudi.software.amazon.awssdk.utils.AttributeMap.close(AttributeMap.java:87) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.core.client.config.SdkClientConfiguration.close(SdkClientConfiguration.java:79) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.core.internal.http.HttpClientDependencies.close(HttpClientDependencies.java:80) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.core.internal.http.AmazonAsyncHttpClient.close(AmazonAsyncHttpClient.java:73) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.core.internal.handler.BaseAsyncClientHandler.close(BaseAsyncClientHandler.java:253) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.software.amazon.awssdk.services.glue.DefaultGlueAsyncClient.close(DefaultGlueAsyncClient.java:13562) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.aws.sync.AWSGlueCatalogSyncClient.close(AWSGlueCatalogSyncClient.java:524) ~[hudi-aws-bundle-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.hive.HiveSyncTool.close(HiveSyncTool.java:210) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:80) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:993) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at scala.collection.mutable.HashSet.foreach(HashSet.scala:79) ~[scala-library-2.12.15.jar:?]
        at org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:991) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:1089) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.HoodieSparkSqlWriter$.writeInternal(HoodieSparkSqlWriter.scala:441) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:132) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:150) ~[hudi-spark3-bundle_2.12-0.14.0-amzn-0.jar:0.14.0-amzn-0]
        at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:47) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:104) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:250) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:123) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$9(SQLExecution.scala:160) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:250) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$8(SQLExecution.scala:160) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:271) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:159) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:69) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:101) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:554) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:107) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:554) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:530) ~[spark-catalyst_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:97) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:84) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:82) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:142) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:856) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:387) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:360) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239) ~[spark-sql_2.12-3.4.1-amzn-2.jar:3.4.1-amzn-2]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_392]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_392]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_392]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_392]
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244) ~[py4j-0.10.9.7.jar:?]
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374) ~[py4j-0.10.9.7.jar:?]
        at py4j.Gateway.invoke(Gateway.java:282) ~[py4j-0.10.9.7.jar:?]
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132) ~[py4j-0.10.9.7.jar:?]
        at py4j.commands.CallCommand.execute(CallCommand.java:79) ~[py4j-0.10.9.7.jar:?]
        at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182) ~[py4j-0.10.9.7.jar:?]
        at py4j.ClientServerConnection.run(ClientServerConnection.java:106) ~[py4j-0.10.9.7.jar:?]
        at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_392]
   ```
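
   For context, the write path in the traces (PySpark via py4j → `HoodieSparkSqlWriter` → Glue meta sync → DynamoDB lock client) corresponds roughly to a write configured like the sketch below. This is a hypothetical reproduction sketch only: the table name, lock table, region, and S3 path are placeholders I made up, not values from this issue.

   ```python
   # Hypothetical sketch of a Hudi write that exercises the components seen in the
   # traces: the DynamoDB-based lock provider (whose heartbeat thread logs the
   # InterruptedException on shutdown) and Glue catalog sync (whose async client
   # close triggers the Netty RejectedExecutionException).
   hudi_options = {
       "hoodie.table.name": "my_table",  # placeholder
       # Optimistic concurrency control backed by DynamoDB.
       "hoodie.write.concurrency.mode": "optimistic_concurrency_control",
       "hoodie.write.lock.provider": "org.apache.hudi.aws.transaction.lock.DynamoDBBasedLockProvider",
       "hoodie.write.lock.dynamodb.table": "hudi-locks",   # placeholder
       "hoodie.write.lock.dynamodb.region": "eu-west-1",   # placeholder
       # Meta sync to Glue (AWSGlueCatalogSyncClient.close appears in the trace).
       "hoodie.datasource.hive_sync.enable": "true",
       "hoodie.datasource.meta.sync.classes": "org.apache.hudi.aws.sync.AwsGlueCatalogSyncTool",
   }

   # With a SparkSession `spark` and DataFrame `df`, the write would look like:
   # df.write.format("hudi").options(**hudi_options).mode("append").save("s3://my-bucket/my_table")  # placeholder path
   ```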

