Fokko commented on PR #5730:
URL: https://github.com/apache/iceberg/pull/5730#issuecomment-1244647107

   Hi @InvisibleProgrammer, thanks for opening this PR. Unfortunately, I cannot reproduce this, so I'm closing it for now. Feel free to re-open the PR with more information. Thanks!
   
   ```
   ➜  docker-spark-iceberg-2 git:(main) ✗ spark-sql --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.14.1\
       --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
       --conf spark.sql.catalog.spark_catalog=org.apache.iceberg.spark.SparkSessionCatalog \
       --conf spark.sql.catalog.spark_catalog.type=hive \
       --conf spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog \
       --conf spark.sql.catalog.local.type=hadoop \
       --conf spark.sql.catalog.local.warehouse=$PWD/warehouse
   :: loading settings :: url = jar:file:/opt/homebrew/Cellar/apache-spark/3.3.0/libexec/jars/ivy-2.5.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
   Ivy Default Cache set to: /Users/fokkodriesprong/.ivy2/cache
   The jars for the packages stored in: /Users/fokkodriesprong/.ivy2/jars
   org.apache.iceberg#iceberg-spark-runtime-3.2_2.12 added as a dependency
   :: resolving dependencies :: org.apache.spark#spark-submit-parent-20feccbe-351a-4fe7-89ec-e906355198d8;1.0
        confs: [default]
        found org.apache.iceberg#iceberg-spark-runtime-3.2_2.12;0.14.1 in central
   :: resolution report :: resolve 69ms :: artifacts dl 2ms
        :: modules in use:
        org.apache.iceberg#iceberg-spark-runtime-3.2_2.12;0.14.1 from central in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
   :: retrieving :: org.apache.spark#spark-submit-parent-20feccbe-351a-4fe7-89ec-e906355198d8
        confs: [default]
        0 artifacts copied, 1 already retrieved (0kB/2ms)
   22/09/12 15:42:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
   22/09/12 15:42:43 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
   22/09/12 15:42:44 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
   22/09/12 15:42:44 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
   22/09/12 15:42:45 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
   22/09/12 15:42:45 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore [email protected]
   22/09/12 15:42:45 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
   Spark master: local[*], Application Id: local-1663022564021
   spark-sql> use demo;
   22/09/12 15:42:49 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
   22/09/12 15:42:49 WARN ObjectStore: Failed to get database demo, returning NoSuchObjectException
   Error in query: Database 'demo' not found
   ```
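   
   For anyone hitting the same `Database 'demo' not found` error: the `demo` database only exists inside the docker-spark-iceberg environment, so in a standalone spark-sql session like the one above the namespace has to be created before it can be used. A minimal sketch against the `local` Hadoop catalog configured above (the `demo` name is just an example):
   
   ```
   spark-sql> CREATE DATABASE IF NOT EXISTS local.demo;
   spark-sql> USE local.demo;
   spark-sql> CREATE TABLE local.demo.test (id bigint) USING iceberg;
   ```
   
   With `spark_catalog` pointed at a fresh Hive metastore, running `CREATE DATABASE demo;` before `use demo;` should work as well.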



