I was using Spark 1.6.0; sorry, I forgot to mention that.
The full stack trace is shown below.
HTTP ERROR 500
Problem accessing /api/v1/applications/application_1457544696648_0002/jobs.
Reason:
Server Error
Caused by:
org.spark-project.guava.util.concurrent.UncheckedExecutionException: java.util.NoSuchElementException: no app with key application_1457544696648_0002
    at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2263)
    at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.deploy.history.HistoryServer.getSparkUI(HistoryServer.scala:118)
    at org.apache.spark.status.api.v1.UIRoot$class.withSparkUI(ApiRootResource.scala:226)
    at org.apache.spark.deploy.history.HistoryServer.withSparkUI(HistoryServer.scala:46)
    at org.apache.spark.status.api.v1.ApiRootResource.getJobs(ApiRootResource.scala:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.invokeSubLocator(SubLocatorRule.java:180)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:107)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
    at org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
    at org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
    at org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
    at org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
    at org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
    at org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.spark-project.jetty.server.handler.GzipHandler.handle(GzipHandler.java:264)
    at org.spark-project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
    at org.spark-project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.spark-project.jetty.server.Server.handle(Server.java:370)
    at org.spark-project.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
    at org.spark-project.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
    at org.spark-project.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
    at org.spark-project.jetty.http.HttpParser.parseNext(HttpParser.java:644)
    at org.spark-project.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.spark-project.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
    at org.spark-project.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
    at org.spark-project.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
    at org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
    at org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.NoSuchElementException: no app with key application_1457544696648_0002
    at org.apache.spark.deploy.history.HistoryServer$$anon$2$$anonfun$1.apply(HistoryServer.scala:62)
    at org.apache.spark.deploy.history.HistoryServer$$anon$2$$anonfun$1.apply(HistoryServer.scala:62)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.deploy.history.HistoryServer$$anon$2.load(HistoryServer.scala:62)
    at org.apache.spark.deploy.history.HistoryServer$$anon$2.load(HistoryServer.scala:56)
    at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    ... 46 more
Caused by:
java.util.NoSuchElementException: no app with key application_1457544696648_0002
    at org.apache.spark.deploy.history.HistoryServer$$anon$2$$anonfun$1.apply(HistoryServer.scala:62)
    at org.apache.spark.deploy.history.HistoryServer$$anon$2$$anonfun$1.apply(HistoryServer.scala:62)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.deploy.history.HistoryServer$$anon$2.load(HistoryServer.scala:62)
    at org.apache.spark.deploy.history.HistoryServer$$anon$2.load(HistoryServer.scala:56)
    at org.spark-project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark-project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark-project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.deploy.history.HistoryServer.getSparkUI(HistoryServer.scala:118)
    at org.apache.spark.status.api.v1.UIRoot$class.withSparkUI(ApiRootResource.scala:226)
    at org.apache.spark.deploy.history.HistoryServer.withSparkUI(HistoryServer.scala:46)
    at org.apache.spark.status.api.v1.ApiRootResource.getJobs(ApiRootResource.scala:66)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.invokeSubLocator(SubLocatorRule.java:180)
    at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:107)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
    at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
    at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1469)
    at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1400)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1349)
    at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1339)
    at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:416)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
    at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:699)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
    at org.spark-project.jetty.servlet.ServletHolder.handle(ServletHolder.java:684)
    at org.spark-project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:501)
    at org.spark-project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1086)
    at org.spark-project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:428)
    at org.spark-project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1020)
    at org.spark-project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
    at org.spark-project.jetty.server.handler.GzipHandler.handle(GzipHandler.java:264)
    at org.spark-project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)
    at org.spark-project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
    at org.spark-project.jetty.server.Server.handle(Server.java:370)
    at org.spark-project.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)
    at org.spark-project.jetty.server.AbstractHttpConnection.headerComplete(AbstractHttpConnection.java:971)
    at org.spark-project.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete(AbstractHttpConnection.java:1033)
    at org.spark-project.jetty.http.HttpParser.parseNext(HttpParser.java:644)
    at org.spark-project.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
    at org.spark-project.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
    at org.spark-project.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:667)
    at org.spark-project.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
    at org.spark-project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
    at org.spark-project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
    at java.lang.Thread.run(Thread.java:745)
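For reference, this is roughly how I am hitting the endpoints (a minimal Scala sketch, not the exact script I ran; the object and method names are mine). The ".../1/jobs" call on the last line is only a guess based on the "attemptId" : "1" field in the application JSON quoted below; I have not verified that the attempt id belongs in the path.

import scala.io.Source

object HistoryServerRestCheck {
  // Fetch a URL and print the response body. Source.fromURL throws an
  // IOException when the server answers with HTTP 500, which is what I
  // see for the /jobs path.
  def fetch(url: String): Unit = {
    println(s"GET $url")
    println(Source.fromURL(url).mkString)
  }

  def main(args: Array[String]): Unit = {
    val base =
      "http://bdg-master:18080/api/v1/applications/application_1457544696648_0002"
    fetch(base)           // succeeds, returns the application JSON
    fetch(s"$base/jobs")  // fails with the HTTP 500 / NoSuchElementException above
    // Untested guess: insert the attempt id from the application JSON.
    fetch(s"$base/1/jobs")
  }
}

If the attempt-id form is what the v1 API expects for applications that report attempt ids, that might explain why the plain /jobs path cannot find the app, but I have not confirmed this.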
------------------------------
Thanks,
B
On Mon, Mar 14, 2016 at 12:23 PM, Ted Yu <[email protected]> wrote:
> Which Spark release do you use ?
>
> For NoSuchElementException, was there anything else in the stack trace ?
>
> Thanks
>
> On Mon, Mar 14, 2016 at 12:12 PM, Boric Tan <[email protected]>
> wrote:
>
>> Hi there,
>>
>> I was trying to access application information with the REST API. It looks
>> like the top-level application information can be retrieved successfully,
>> as shown below, but the jobs/stages information cannot be retrieved; an
>> exception is returned instead. Does anyone have ideas on how to fix this?
>> Thanks!
>>
>> Top Application information retrieval: Passed
>>
>> URL:
>> http://bdg-master:18080/api/v1/applications/application_1457544696648_0002
>>
>> RESPONSE:
>>
>> {
>>   "id" : "application_1457544696648_0002",
>>   "name" : "Spark Pi",
>>   "attempts" : [ {
>>     "attemptId" : "1",
>>     "startTime" : "2016-03-14T16:17:50.650GMT",
>>     "endTime" : "2016-03-14T16:18:37.202GMT",
>>     "sparkUser" : "bdguser",
>>     "completed" : true
>>   } ]
>> }
>>
>> Application job/stage information retrieval: Failed
>>
>> URL:
>>
>> http://bdg-master:18080/api/v1/applications/application_1457544696648_0002/jobs
>>
>> RESPONSE:
>> HTTP ERROR 500
>>
>> Problem accessing
>> /api/v1/applications/application_1457544696648_0002/jobs.
>> Reason:
>>
>> Server Error
>>
>> Caused by:
>>
>> org.spark-project.guava.util.concurrent.UncheckedExecutionException: java.util.NoSuchElementException: no app with key application_1457544696648_0002
>>     at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2263)
>>     at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
>>     at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
>>     at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
>>
>>
>> Thanks,
>> B
>>
>