[ https://issues.apache.org/jira/browse/SPARK-30686?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17027291#comment-17027291 ]

Behroz Sikander commented on SPARK-30686:
-----------------------------------------

Could it be linked to [https://github.com/apache/spark/pull/19748]?
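
In case it helps narrow things down, here is a minimal reproduction sketch using nothing but the JDK's HttpURLConnection (no Spark client involved). The class name is made up for illustration, and the master host and application id are placeholders copied from the URL in the description below, so substitute your own values.

{code:java}
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Hypothetical helper class; host and app id below are placeholders from the reported URL.
public class MetricsEndpointCheck {
    public static void main(String[] args) throws Exception {
        String endpoint =
            "http://master_address/proxy/app-20200130041234-0123/api/v1/applications";

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("GET");

        // A healthy endpoint answers 200 with a JSON array of applications;
        // the failure described in this ticket shows up here as HTTP 500.
        int status = conn.getResponseCode();
        System.out.println("HTTP status: " + status);

        // Print whichever body the server returned (JSON on success, error page on 500).
        InputStream body = status < 400 ? conn.getInputStream() : conn.getErrorStream();
        if (body != null) {
            try (BufferedReader in = new BufferedReader(new InputStreamReader(body))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}
{code}

It may also be worth hitting the application UI's own REST endpoint directly (by default on the driver at port 4040, under /api/v1) to see whether the failure only appears when going through the master proxy.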

> Spark 2.4.4 metrics endpoint throwing error
> -------------------------------------------
>
>                 Key: SPARK-30686
>                 URL: https://issues.apache.org/jira/browse/SPARK-30686
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.4
>            Reporter: Behroz Sikander
>            Priority: Major
>
> I am using Spark standalone in HA mode with ZooKeeper.
> Once the driver is up and running, whenever I try to access the metrics API
> through the master proxy using the following URL
> http://master_address/proxy/app-20200130041234-0123/api/v1/applications
> I get the exception below.
> It seems that the request never even reaches the Spark code. It would be
> helpful if somebody could help me.
> {code:java}
> HTTP ERROR 500
> Problem accessing /api/v1/applications. Reason:
>     Server Error
> Caused by:
> java.lang.NullPointerException: while trying to invoke the method org.glassfish.jersey.servlet.WebComponent.service(java.net.URI, java.net.URI, javax.servlet.http.HttpServletRequest, javax.servlet.http.HttpServletResponse) of a null object loaded from field org.glassfish.jersey.servlet.ServletContainer.webComponent of an object loaded from local variable 'this'
>       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
>       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341)
>       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228)
>       at org.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
>       at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:584)
>       at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
>       at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
>       at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
>       at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>       at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:493)
>       at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
>       at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
>       at org.spark_project.jetty.server.Server.handle(Server.java:539)
>       at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:333)
>       at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
>       at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:283)
>       at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:108)
>       at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
>       at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
>       at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
>       at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
>       at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
>       at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
>       at java.lang.Thread.run(Thread.java:808)
> {code}


