allenfan2 opened a new issue #1163:
URL: https://github.com/apache/iceberg/issues/1163


   So I'm just following the [Getting Started](https://iceberg.apache.org/getting-started/) guide.
   
   All of this is being tested in spark-shell (Spark 2.3.0) with iceberg-spark-runtime-0.8.0.jar.
   
   And when I run the step:
   `val table = catalog.createTable(name, schema)`
   
   this error occurs:
   
   ```
   scala> val iceberg_table = catalog.createTable(name, schema)
   java.lang.RuntimeException: Metastore operation failed for iceberg_test.test_table
     at org.apache.iceberg.hive.HiveTableOperations.doCommit(HiveTableOperations.java:206)
     at org.apache.iceberg.BaseMetastoreTableOperations.commit(BaseMetastoreTableOperations.java:103)
     at org.apache.iceberg.BaseMetastoreCatalog.createTable(BaseMetastoreCatalog.java:70)
     at org.apache.iceberg.catalog.Catalog.createTable(Catalog.java:109)
     ... 49 elided
   Caused by: org.apache.hadoop.hive.metastore.api.MetaException: file:/home/username/iceberg-test/spark-warehouse/iceberg_test.db/test_table is not a directory or unable to create one
     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29983)
     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result$create_table_with_environment_context_resultStandardScheme.read(ThriftHiveMetastore.java:29951)
     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_table_with_environment_context_result.read(ThriftHiveMetastore.java:29877)
     at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:86)
     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_table_with_environment_context(ThriftHiveMetastore.java:1075)
     at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_table_with_environment_context(ThriftHiveMetastore.java:1061)
     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.create_table_with_environment_context(HiveMetaStoreClient.java:2050)
     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:669)
     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:657)
     at org.apache.iceberg.hive.HiveTableOperations.lambda$doCommit$3(HiveTableOperations.java:191)
     at org.apache.iceberg.hive.ClientPool.run(ClientPool.java:54)
     at org.apache.iceberg.hive.HiveTableOperations.doCommit(HiveTableOperations.java:190)
     ... 52 more
   ```
   
   The thing is, I launched spark-shell from `/home/username/iceberg-test`, and Spark seems to have automatically created `spark-warehouse/iceberg_test.db/test_table` while the code ran, yet the error says it is not a directory or can't be created. Is this because it's looking for the path in HDFS rather than on the local UNIX filesystem?
   
   ```
   tree -d spark-warehouse
   spark-warehouse
   └── iceberg_test.db
       └── test_table
           └── metadata
   ```
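   One thing I considered trying is pointing Spark and the Hive metastore at the same explicit warehouse location when launching spark-shell; I haven't confirmed this fixes it, and the jar and warehouse paths below are just from my local setup:

   ```shell
   # Untested idea: set both warehouse properties explicitly so Spark's SQL
   # warehouse and the Hive metastore agree on the same local directory.
   # Paths below are examples from my local machine.
   spark-shell \
     --jars iceberg-spark-runtime-0.8.0.jar \
     --conf spark.sql.warehouse.dir=file:///home/username/iceberg-test/spark-warehouse \
     --conf spark.hadoop.hive.metastore.warehouse.dir=file:///home/username/iceberg-test/spark-warehouse
   ```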


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


