kbendick commented on issue #3585:
URL: https://github.com/apache/iceberg/issues/3585#issuecomment-1021737161


   Thank you for testing this @nreich 🙏 .
   
   I have tested with Spark 3.1.2 and Spark 3.2.0, both built for Hadoop 3.2,
and I don't seem to get this problem, at least on the `CREATE TABLE` statement.
   
   ```bash
   cd spark-3.2.0-bin-hadoop3.2 && rm -rf /tmp/iceberg && mkdir -p /tmp/iceberg/warehouse && \
   ./bin/spark-shell \
     --packages 'org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.0' \
     --repositories https://repository.apache.org/content/repositories/orgapacheiceberg-1079/ \
     --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
     --conf spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog \
     --conf spark.sql.catalog.local.type=hadoop \
     --conf spark.sql.catalog.local.warehouse=/tmp/iceberg/warehouse
   
   scala> spark.sql("use local")
   
   scala> spark.sql("CREATE TABLE local.db.table (id bigint, data string) USING iceberg")
   res2: org.apache.spark.sql.DataFrame = []
   
   scala> spark.sql("show tables in local.db").show
   +---------+---------+-----------+
   |namespace|tableName|isTemporary|
   +---------+---------+-----------+
   |       db|    table|      false|
   +---------+---------+-----------+
   ```
   
   I also tried with Spark 3.1.2, as well as with a partitioned table, and did
not encounter any exceptions.
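   For reference, the partitioned-table case I tried looked roughly like the
following sketch (run in the same spark-shell session; the table name
`local.db.events`, the `ts` column, and the sample row are illustrative, not
from the original report):

   ```scala
   // Create a partitioned Iceberg table using a day transform on a timestamp column.
   spark.sql("""
     CREATE TABLE local.db.events (id bigint, data string, ts timestamp)
     USING iceberg
     PARTITIONED BY (days(ts))
   """)

   // Write a row and read it back to confirm the table is usable end to end.
   spark.sql("INSERT INTO local.db.events VALUES (1, 'a', timestamp '2022-01-25 00:00:00')")
   spark.sql("SELECT * FROM local.db.events").show()
   ```

   The `days(ts)` partition transform requires the Iceberg Spark session
extensions, which the `spark.sql.extensions` conf above enables.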


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
