ajantha-bhat opened a new issue, #5986:
URL: https://github.com/apache/iceberg/issues/5986

   The test case below appears to be flaky:
   
   org.apache.iceberg.spark.TestFileRewriteCoordinator > testBinPackRewrite[catalogName = testhive
   
   
https://github.com/apache/iceberg/actions/runs/3249433325/jobs/5331831494#step:6:293
   
   ```
   org.apache.iceberg.spark.TestFileRewriteCoordinator > testBinPackRewrite[catalogName = testhive, implementation = org.apache.iceberg.spark.SparkCatalog, config = {type=hive, default-namespace=default}] FAILED
       org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x;
           at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:113)
           at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:225)
           at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:137)
           at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:127)
           at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:157)
           at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:155)
           at org.apache.spark.sql.hive.HiveSessionStateBuilder.$anonfun$catalog$2(HiveSessionStateBuilder.scala:60)
           at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:93)
           at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:93)
           at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:206)
           at org.apache.spark.sql.execution.command.CreateDatabaseCommand.run(ddl.scala:81)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
           at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:229)
           at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3618)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:100)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:160)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:87)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3616)
           at org.apache.spark.sql.Dataset.<init>(Dataset.scala:229)
           at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:100)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
           at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
           at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:610)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:767)
           at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:605)
           at org.apache.iceberg.spark.SparkTestBase.sql(SparkTestBase.java:103)
           at org.apache.iceberg.spark.SparkTestBaseWithCatalog.<init>(SparkTestBaseWithCatalog.java:87)
           at org.apache.iceberg.spark.SparkCatalogTestBase.<init>(SparkCatalogTestBase.java:60)
           at org.apache.iceberg.spark.TestFileRewriteCoordinator.<init>(TestFileRewriteCoordinator.java:48)

           Caused by:
           java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwxr-xr-x
               at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:724)
               at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:654)
               at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:586)
               at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:548)
               at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:176)
               at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:129)
               at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
               at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
               at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
               at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
               at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:301)
               at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:431)
               at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:324)
               at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:72)
               at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:71)
               at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$databaseExists$1(HiveExternalCatalog.scala:225)
               at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23)
               at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:103)
               ... 32 more
   ```
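
   For context, the failure happens in test setup, before the test body runs: Hive's `SessionState.createRootHDFSDir` checks that the root scratch dir (`/tmp/hive` by default, controlled by the `hive.exec.scratchdir` property) is writable by all, and a `/tmp/hive` left behind on the CI host with `rwxr-xr-x` permissions fails that check. One possible mitigation, sketched below purely for illustration (not the project's actual fix, and the class/method names are hypothetical), is to point each test JVM at its own scratch directory when building the `SparkSession`, so the shared `/tmp/hive` never comes into play:

   ```java
   // Illustrative sketch only: give each test JVM a fresh Hive scratch
   // directory so permissions left on /tmp/hive by other processes
   // cannot fail SessionState's writability check.
   import java.nio.file.Files;
   import java.nio.file.Path;

   import org.apache.spark.sql.SparkSession;

   public class HiveScratchDirExample {

     public static SparkSession createTestSession() throws Exception {
       // Fresh, writable directory owned by this JVM (prefix is arbitrary).
       Path scratchDir = Files.createTempDirectory("hive-scratch-");

       return SparkSession.builder()
           .master("local[2]")
           .enableHiveSupport()
           // hive.exec.scratchdir is the property behind the /tmp/hive
           // default; overriding it sidesteps the permission check there.
           .config("hive.exec.scratchdir", scratchDir.toString())
           .getOrCreate();
     }
   }
   ```

   Alternatively, the CI workflow could pre-create `/tmp/hive` with `777` permissions before the tests run.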

