[ https://issues.apache.org/jira/browse/HIVE-29145?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Denys Kuzmenko reassigned HIVE-29145:
-------------------------------------
Assignee: Shohei Okumiya
> HMS Iceberg REST server cannot work properly.
> ---------------------------------------------
>
> Key: HIVE-29145
> URL: https://issues.apache.org/jira/browse/HIVE-29145
> Project: Hive
> Issue Type: Bug
> Components: Iceberg integration
> Affects Versions: 4.1.0
> Reporter: Butao Zhang
> Assignee: Shohei Okumiya
> Priority: Major
> Labels: pull-request-available
> Fix For: 4.2.0
>
>
>
> {*}NOTE{*}: *The Iceberg REST Catalog server works properly if it is launched by the
> [standalone HMS|https://dlcdn.apache.org/hive/hive-standalone-metastore-4.1.0/hive-standalone-metastore-4.1.0-bin.tar.gz],
> but it does not work if it is launched by the
> [common HMS|https://dlcdn.apache.org/hive/hive-4.1.0/apache-hive-4.1.0-bin.tar.gz].*
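>
> For reference, "launched by" above means starting the metastore service from each distribution's own launcher. A minimal sketch of the two launch paths, assuming the standard launcher scripts shipped in each tarball (the directory names below are just the unpacked tarballs and may differ in your environment):
> {code:bash}
> # Standalone HMS distribution: Iceberg REST catalog works.
> cd hive-standalone-metastore-4.1.0-bin && ./bin/start-metastore
>
> # Common HMS from the full Hive distribution: Iceberg REST catalog fails as shown below.
> cd apache-hive-4.1.0-bin && ./bin/hive --service metastore
> {code}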
>
> I tested the HMS Iceberg REST catalog against Spark:
> 1. HMS configuration
> {code:xml}
> <property>
>   <name>hive.metastore.catalog.servlet.port</name>
>   <value>9088</value>
>   <description>Iceberg REST catalog port</description>
> </property>
>
> <property>
>   <name>hive.metastore.catalog.servlet.auth</name>
>   <value>none</value>
>   <description></description>
> </property>
>
> <property>
>   <name>hive.metastore.properties.servlet.auth</name>
>   <value>simple</value>
>   <description></description>
> </property>
> {code}
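> With the properties above in place, a quick sanity check is to hit the REST catalog's config endpoint directly once HMS is up; a minimal sketch, assuming HMS runs on the local host with auth set to none as configured (this is the same /iceberg/v1/config path that fails in the logs below):
> {code:bash}
> # Expect a small JSON config payload on success; the broken setup returns HTTP 500 instead.
> curl -i http://127.0.0.1:9088/iceberg/v1/config
> {code}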
> 2. Prepare the Spark env:
> Download the Spark package spark-3.5.6-bin-hadoop3.tgz and put the
> [iceberg-spark-runtime|https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.9.2/iceberg-spark-runtime-3.5_2.12-1.9.2.jar]
> jar into Spark's jars directory, as sketched below.
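> The env prep boils down to a couple of commands; a sketch, assuming wget, the usual Apache archive layout for the Spark tarball, and the Maven Central link above (which redirects to the actual jar):
> {code:bash}
> # Fetch and unpack Spark 3.5.6 built for Hadoop 3.
> wget https://archive.apache.org/dist/spark/spark-3.5.6/spark-3.5.6-bin-hadoop3.tgz
> tar -xzf spark-3.5.6-bin-hadoop3.tgz
>
> # Drop the Iceberg Spark runtime into Spark's jars directory.
> wget -O spark-3.5.6-bin-hadoop3/jars/iceberg-spark-runtime-3.5_2.12-1.9.2.jar \
>   "https://search.maven.org/remotecontent?filepath=org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.9.2/iceberg-spark-runtime-3.5_2.12-1.9.2.jar"
> {code}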
> 3. Test Spark + the HMS Iceberg REST catalog
> {code:bash}
> ./spark-3.5.6-bin-hadoop3/bin/spark-sql \
>   --master local \
>   --deploy-mode client \
>   --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
>   --conf spark.sql.catalog.rest=org.apache.iceberg.spark.SparkCatalog \
>   --conf spark.sql.catalog.rest.type=rest \
>   --conf spark.sql.catalog.rest.uri=http://127.0.0.1:9088/iceberg/
>
> spark-sql (default)> use rest;
> {code}
> 4. You will see error logs on both the Spark side and the HMS server side; the failure is triggered by the first request the REST catalog makes during initialization, GET /iceberg/v1/config:
> * Spark error log
> {code:java}
> Server error: null: {
>   "servlet":"org.apache.iceberg.rest.HMSCatalogServlet-7e848aea",
>   "cause0":"java.lang.NoSuchMethodError: 'com.codahale.metrics.Counter org.apache.hadoop.hive.metastore.metrics.Metrics.getOrCreateCounter(java.lang.String)'",
>   "message":"java.lang.NoSuchMethodError: 'com.codahale.metrics.Counter org.apache.hadoop.hive.metastore.metrics.Metrics.getOrCreateCounter(java.lang.String)'",
>   "url":"/iceberg/v1/config",
>   "status":"500"
> }
> org.apache.iceberg.exceptions.ServiceFailureException: Server error: null: {
>   "servlet":"org.apache.iceberg.rest.HMSCatalogServlet-7e848aea",
>   "cause0":"java.lang.NoSuchMethodError: 'com.codahale.metrics.Counter org.apache.hadoop.hive.metastore.metrics.Metrics.getOrCreateCounter(java.lang.String)'",
>   "message":"java.lang.NoSuchMethodError: 'com.codahale.metrics.Counter org.apache.hadoop.hive.metastore.metrics.Metrics.getOrCreateCounter(java.lang.String)'",
>   "url":"/iceberg/v1/config",
>   "status":"500"
> }
>     at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:241)
>     at org.apache.iceberg.rest.ErrorHandlers$DefaultErrorHandler.accept(ErrorHandlers.java:212)
>     at org.apache.iceberg.rest.HTTPClient.throwFailure(HTTPClient.java:215)
>     at org.apache.iceberg.rest.HTTPClient.execute(HTTPClient.java:299)
>     at org.apache.iceberg.rest.BaseHTTPClient.get(BaseHTTPClient.java:77)
>     at org.apache.iceberg.rest.RESTSessionCatalog.fetchConfig(RESTSessionCatalog.java:1021)
>     at org.apache.iceberg.rest.RESTSessionCatalog.initialize(RESTSessionCatalog.java:202)
>     at org.apache.iceberg.rest.RESTCatalog.initialize(RESTCatalog.java:82)
>     at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:277)
>     at org.apache.iceberg.CatalogUtil.buildIcebergCatalog(CatalogUtil.java:331)
>     at org.apache.iceberg.spark.SparkCatalog.buildIcebergCatalog(SparkCatalog.java:153)
>     at org.apache.iceberg.spark.SparkCatalog.initialize(SparkCatalog.java:752)
>     at org.apache.spark.sql.connector.catalog.Catalogs$.load(Catalogs.scala:65)
>     at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$catalog$1(CatalogManager.scala:54)
>     at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
> {code}
> * HMS server error log
> {code:java}
> 2025-08-14T15:55:47,203 WARN [qtp1558127130-51] server.HttpChannel: /iceberg/v1/config
> java.lang.NoSuchMethodError: 'com.codahale.metrics.Counter org.apache.hadoop.hive.metastore.metrics.Metrics.getOrCreateCounter(java.lang.String)'
>     at org.apache.iceberg.rest.HMSCatalogAdapter.handleRequest(HMSCatalogAdapter.java:538) ~[hive-standalone-metastore-rest-catalog-4.1.0.jar:4.1.0]
>     at org.apache.iceberg.rest.HMSCatalogAdapter.execute(HMSCatalogAdapter.java:632) ~[hive-standalone-metastore-rest-catalog-4.1.0.jar:4.1.0]
>     at org.apache.iceberg.rest.HMSCatalogServlet.service(HMSCatalogServlet.java:80) ~[hive-standalone-metastore-rest-catalog-4.1.0.jar:4.1.0]
>     at javax.servlet.http.HttpServlet.service(HttpServlet.java:790) ~[javax.servlet-api-3.1.0.jar:3.1.0]
>     at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:799) ~[jetty-servlet-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:554) ~[jetty-servlet-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505) ~[jetty-servlet-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:191) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.Server.handle(Server.java:516) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277) ~[jetty-server-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) ~[jetty-io-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) ~[jetty-io-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) ~[jetty-io-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338) ~[jetty-util-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315) ~[jetty-util-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173) ~[jetty-util-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137) ~[jetty-util-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883) [jetty-util-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034) [jetty-util-9.4.57.v20241219.jar:9.4.57.v20241219]
>     at java.base/java.lang.Thread.run(Thread.java:833) [?:?]
> {code}