foryou7242 commented on issue #4932:
URL: https://github.com/apache/gravitino/issues/4932#issuecomment-2368434057
@FANNG1 Thank you so much for your help.
First of all, the `spark.bypass.spark.sql.hive.metastore.jars` setting is
already set to `builtin`.
The root cause is that `hive.metastore.uris` was set to the test2 Hive metastore
in hive-site.xml. After deleting that setting, I confirmed that queries run
normally in spark-shell.
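For reference, the property I removed from hive-site.xml looked roughly like this (the thrift URI shown is illustrative, not my actual test2 address):

```xml
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://test2:9083</value>
</property>
```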
However, I still have the same problem with spark-sql. Am I right in
understanding that this is an issue that will be fixed in a future release?
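For completeness, the failing spark-sql case is essentially the SQL equivalent of the spark-shell query below (table name and partition values as in the example; this is a sketch of what I run, not a new reproduction):

```sql
-- same query, issued through the spark-sql CLI instead of spark-shell
SELECT * FROM test2.test
WHERE month = '202009' AND day = '20200928'
LIMIT 100;
```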
```
scala> val df = spark.table("test2.test").filter($"month" === "202009" && $"day" === "20200928").limit(100)
24/09/23 23:20:02 INFO HiveConf: Found configuration file
file:/opt/spark/conf/hive-site.xml
24/09/23 23:20:02 INFO metastore: Trying to connect to metastore with URI
thrift://test1:9083
24/09/23 23:20:02 INFO metastore: Opened a connection to metastore, current
connections: 1
24/09/23 23:20:02 INFO metastore: Connected to metastore.
df: org.apache.spark.sql.Dataset[org.apache.spark.sql.Row] = [test1: string,
month: string ... 1 more field]
scala> df.show(false)
+------------------------------------+------+--------+
|test1                               |month |day     |
+------------------------------------+------+--------+
|1231231235555555557                 |202009|20200928|
|1231231235555555557                 |202009|20200928|
|1231231235555555557                 |202009|20200928|
```