CTTY opened a new pull request, #9595:
URL: https://github.com/apache/hudi/pull/9595

   
   ### Change Logs
   
When a table or database is not found while syncing a table to Glue, the Glue service 
returns `EntityNotFoundException`.
   After upgrading to AWS SDK V2, Hudi uses `GlueAsyncClient`, which returns a 
`CompletableFuture`; when the table or database doesn't exist, calling `get()` on that 
future throws an `ExecutionException` with the `EntityNotFoundException` nested as its 
cause. However, the existing Hudi code doesn't handle `ExecutionException` and fails the job.
   
   Sample exception:
   ```
   org.apache.hudi.exception.HoodieMetaSyncException: Could not sync using the meta sync class org.apache.hudi.aws.sync.AwsGlueCatalogSyncTool
     at org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:81)
     at org.apache.hudi.HoodieSparkSqlWriter$.$anonfun$metaSync$2(HoodieSparkSqlWriter.scala:959)
     at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
     at org.apache.hudi.HoodieSparkSqlWriter$.metaSync(HoodieSparkSqlWriter.scala:957)
     at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:1055)
     at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:409)
     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:150)
     at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:47)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:75)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:73)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:84)
     at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.$anonfun$applyOrElse$1(QueryExecution.scala:104)
     at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
     at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:250)
     at org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:123)
     at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$9(SQLExecution.scala:160)
     at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
     at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:250)
     at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$8(SQLExecution.scala:160)
     at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:271)
     at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:159)
     at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
     at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:69)
     at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:101)
     at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$1.applyOrElse(QueryExecution.scala:97)
     at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:554)
     at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:107)
     at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:554)
     at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:32)
     at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:267)
     at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:263)
     at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
     at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:32)
     at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:530)
     at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:97)
     at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:84)
     at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:82)
     at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala:142)
     at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:856)
     at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:387)
     at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:360)
     at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:239)
     ... 47 elided
   Caused by: org.apache.hudi.exception.HoodieException: Got runtime exception when hive syncing mdt_athena_12904
     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:168)
     at org.apache.hudi.sync.common.util.SyncUtilHelpers.runHoodieMetaSync(SyncUtilHelpers.java:79)
     ... 88 more
   Caused by: org.apache.hudi.aws.sync.HoodieGlueSyncException: Fail to get table: default.mdt_athena_12904
     at org.apache.hudi.aws.sync.AWSGlueCatalogSyncClient.tableExists(AWSGlueCatalogSyncClient.java:472)
     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:241)
     at org.apache.hudi.hive.HiveSyncTool.doSync(HiveSyncTool.java:177)
     at org.apache.hudi.hive.HiveSyncTool.syncHoodieTable(HiveSyncTool.java:165)
     ... 89 more
   Caused by: java.util.concurrent.ExecutionException: org.apache.hudi.software.amazon.awssdk.services.glue.model.EntityNotFoundException: Entity Not Found (Service: Glue, Status Code: 400, Request ID: eb86d3b6-012f-464c-a62e-91a385e48b0c)
     at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
     at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
     at org.apache.hudi.aws.sync.AWSGlueCatalogSyncClient.tableExists(AWSGlueCatalogSyncClient.java:458)
     ... 92 more
   Caused by: org.apache.hudi.software.amazon.awssdk.services.glue.model.EntityNotFoundException: Entity Not Found (Service: Glue, Status Code: 400, Request ID: eb86d3b6-012f-464c-a62e-91a385e48b0c)
     at org.apache.hudi.software.amazon.awssdk.services.glue.model.EntityNotFoundException$BuilderImpl.build(EntityNotFoundException.java:126)
     at org.apache.hudi.software.amazon.awssdk.services.glue.model.EntityNotFoundException$BuilderImpl.build(EntityNotFoundException.java:80)
     at org.apache.hudi.software.amazon.awssdk.protocols.json.internal.unmarshall.AwsJsonProtocolErrorUnmarshaller.unmarshall(AwsJsonProtocolErrorUnmarshaller.java:92)
     at org.apache.hudi.software.amazon.awssdk.protocols.json.internal.unmarshall.AwsJsonProtocolErrorUnmarshaller.handle(AwsJsonProtocolErrorUnmarshaller.java:66)
     at org.apache.hudi.software.amazon.awssdk.protocols.json.internal.unmarshall.AwsJsonProtocolErrorUnmarshaller.handle(AwsJsonProtocolErrorUnmarshaller.java:41)
     at org.apache.hudi.software.amazon.awssdk.core.http.MetricCollectingHttpResponseHandler.lambda$handle$0(MetricCollectingHttpResponseHandler.java:52)
     at org.apache.hudi.software.amazon.awssdk.core.internal.util.MetricUtils.measureDurationUnsafe(MetricUtils.java:63)
     at org.apache.hudi.software.amazon.awssdk.core.http.MetricCollectingHttpResponseHandler.handle(MetricCollectingHttpResponseHandler.java:52)
     at org.apache.hudi.software.amazon.awssdk.core.internal.http.async.AsyncResponseHandler.lambda$prepare$0(AsyncResponseHandler.java:89)
     at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
     at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
     at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
     at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
     at org.apache.hudi.software.amazon.awssdk.core.internal.http.async.AsyncResponseHandler$BaosSubscriber.onComplete(AsyncResponseHandler.java:132)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.ResponseHandler$DataCountingPublisher$1.onComplete(ResponseHandler.java:513)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.ResponseHandler.runAndLogError(ResponseHandler.java:250)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.ResponseHandler.access$600(ResponseHandler.java:75)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.ResponseHandler$PublisherAdapter$1.onComplete(ResponseHandler.java:371)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.nrs.HandlerPublisher.publishMessage(HandlerPublisher.java:402)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.nrs.HandlerPublisher.flushBuffer(HandlerPublisher.java:338)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.nrs.HandlerPublisher.receivedDemand(HandlerPublisher.java:291)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.nrs.HandlerPublisher.access$200(HandlerPublisher.java:61)
     at org.apache.hudi.software.amazon.awssdk.http.nio.netty.internal.nrs.HandlerPublisher$ChannelSubscription$1.run(HandlerPublisher.java:495)
     at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174)
     at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167)
     at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:566)
     at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
     at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
     at java.lang.Thread.run(Thread.java:750)
   ```
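   The handling pattern can be sketched as follows. This is a minimal illustration, not the actual Hudi change: `GlueEntityNotFound` and `tableExists` are hypothetical stand-ins for the AWS SDK's `EntityNotFoundException` and `AWSGlueCatalogSyncClient.tableExists`, showing how the async client's `ExecutionException` must be unwrapped to treat a missing table as "absent" rather than as a sync failure.
   
   ```java
   import java.util.concurrent.CompletableFuture;
   import java.util.concurrent.ExecutionException;
   
   public class GlueSyncSketch {
     // Stand-in for the SDK's EntityNotFoundException (hypothetical).
     static class GlueEntityNotFound extends RuntimeException {}
   
     // With the async client, a missing table surfaces as an
     // ExecutionException whose cause is the not-found error, so the
     // cause must be inspected instead of failing on the wrapper.
     static boolean tableExists(CompletableFuture<String> getTableCall) {
       try {
         getTableCall.get();
         return true;
       } catch (ExecutionException e) {
         if (e.getCause() instanceof GlueEntityNotFound) {
           return false; // table/database absent: expected, not an error
         }
         throw new RuntimeException("Fail to get table", e);
       } catch (InterruptedException e) {
         Thread.currentThread().interrupt();
         throw new RuntimeException(e);
       }
     }
   
     public static void main(String[] args) {
       // Simulate the not-found path without a real Glue endpoint.
       CompletableFuture<String> missing = new CompletableFuture<>();
       missing.completeExceptionally(new GlueEntityNotFound());
       System.out.println(tableExists(missing)); // prints "false"
   
       System.out.println(tableExists(
           CompletableFuture.completedFuture("table"))); // prints "true"
     }
   }
   ```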
   
   ### Impact
   
   None
   
   ### Risk level (write none, low medium or high below)
   
   Low
   
   ### Documentation Update
   
   None
   
   ### Contributor's checklist
   
   - [ ] Read through [contributor's 
guide](https://hudi.apache.org/contribute/how-to-contribute)
   - [ ] Change Logs and Impact were stated clearly
   - [ ] Adequate tests were added if applicable
   - [ ] CI passed
   

