Xander-run commented on issue #6697:
URL: https://github.com/apache/gravitino/issues/6697#issuecomment-2743343893

   Hi @FANNG1 
   
   I'm interested in solving this issue and have attempted to reproduce it 
using the `gravitino-playground`, following the documented steps in 
[#using-spark-client](https://gravitino.apache.org/docs/0.8.0-incubating/how-to-use-the-playground#using-spark-client).
   
   However, I'm unable to see any catalog other than `spark_catalog` from Spark. I can 
access the Postgres catalog (`catalog_postgres`) from the Trino client as well as through 
the Gravitino API.
   
   <img width="1721" alt="Image" src="https://github.com/user-attachments/assets/a593a8b3-8f57-447a-a3a7-3dc8ca22816b" />
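   For reference, this is roughly how I checked the catalog list over the Gravitino REST API. The endpoint path, the playground defaults (host `localhost:8090`, metalake `metalake_demo`), and the response shape below are my assumptions, so treat it as a sketch; I hard-code a sample response here so the parsing can be shown without a running playground:

```python
import json

# Hypothetical request (playground defaults assumed):
#   GET http://localhost:8090/api/metalakes/metalake_demo/catalogs
# Sample response of the shape I believe the list-catalogs endpoint
# returns, hard-coded in place of an actual HTTP call:
sample_response = json.loads("""
{
  "code": 0,
  "identifiers": [
    {"namespace": ["metalake_demo"], "name": "catalog_hive"},
    {"namespace": ["metalake_demo"], "name": "catalog_postgres"}
  ]
}
""")

# Extract the catalog names from the identifier list.
names = [ident["name"] for ident in sample_response.get("identifiers", [])]
print(names)
```

   So `catalog_postgres` is visible through the API, but it never shows up in `SHOW CATALOGS` from `spark-sql`.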
   
   ```
   docker exec -it playground-spark bash
   
   spark@3f18a8074681:/opt/spark/work-dir$ cd /opt/spark && /bin/bash bin/spark-sql
   Setting default log level to "WARN".
   To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
   25/03/21 13:07:05 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   25/03/21 13:07:06 WARN GravitinoDriverPlugin: Skip registering catalog_mysql because jdbc-mysql is not supported yet.
   25/03/21 13:07:06 WARN GravitinoDriverPlugin: Skip registering catalog_postgres because jdbc-postgresql is not supported yet.
   25/03/21 13:07:06 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
   25/03/21 13:07:06 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
   25/03/21 13:07:07 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
   25/03/21 13:07:07 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore [email protected]
   Spark master: local[*], Application Id: local-1742562426448
   25/03/21 13:07:08 WARN SparkSQLCLIDriver: WARNING: Directory for Hive history file: /home/spark does not exist. History will not be available during this session.
   spark-sql (default)> SHOW CATALOGS;
   spark_catalog
   Time taken: 1.17 seconds, Fetched 1 row(s)
   spark-sql (default)> 
   ```
   
   This warning looks closely related:
   
   ```
   25/03/21 13:07:06 WARN GravitinoDriverPlugin: Skip registering catalog_postgres because jdbc-postgresql is not supported yet.
   ```
   
   So my question is: could you clarify how to access `catalog_postgres` from Spark, 
or how to enable `jdbc-postgresql` support in the Spark connector?
   
   Thanks!

