Which Spark release are you using?

For the NoSuchElementException, was there anything else in the stack trace?
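
In case it helps while you check, here is a minimal sketch of how I would poke at both forms of the jobs endpoint on the history server. It assumes the host/port and application id from your mail, uses only the Scala standard library, and the attempt-scoped URL is just a guess based on the attemptId "1" in your response (YARN applications that report attempt IDs are usually addressed as /applications/[app-id]/[attempt-id]/...):

import scala.io.Source

object RestCheck extends App {
  // Base URL copied from the mail below; adjust host/port if yours differ.
  val base =
    "http://bdg-master:18080/api/v1/applications/application_1457544696648_0002"

  // Fetch a REST endpoint and print either the response body or the error.
  def fetch(url: String): Unit = {
    try {
      println(s"== $url ==")
      println(Source.fromURL(url).mkString)
    } catch {
      case e: Exception => println(s"== $url failed: ${e.getMessage}")
    }
  }

  fetch(base)            // application summary (this worked for you)
  fetch(s"$base/jobs")   // the call that returned HTTP 500
  fetch(s"$base/1/jobs") // attempt-scoped form, worth trying given attemptId "1"
}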

Thanks

On Mon, Mar 14, 2016 at 12:12 PM, Boric Tan <it.news.tre...@gmail.com>
wrote:

> Hi there,
>
> I was trying to access application information with the REST API. The
> top-level application information can be retrieved successfully, as shown
> below, but the jobs/stages information cannot; an exception is returned
> instead. Does anyone have any idea how to fix this? Thanks!
>
> Top Application information retrieval: Passed
>
> URL:
> http://bdg-master:18080/api/v1/applications/application_1457544696648_0002
>
> RESPONSE:
>
> {
>   "id" : "application_1457544696648_0002",
>   "name" : "Spark Pi",
>   "attempts" : [ {
>     "attemptId" : "1",
>     "startTime" : "2016-03-14T16:17:50.650GMT",
>     "endTime" : "2016-03-14T16:18:37.202GMT",
>     "sparkUser" : "bdguser",
>     "completed" : true
>   } ]
> }
>
> Application job/stage information retrieval: Failed
>
> URL:
> http://bdg-master:18080/api/v1/applications/application_1457544696648_0002/jobs
>
> RESPONSE:
> HTTP ERROR 500
>
> Problem accessing /api/v1/applications/application_1457544696648_0002/jobs.
> Reason:
>
>     Server Error
>
> Caused by:
>
> org.spark-project.guava.util.concurrent.UncheckedExecutionException:
> java.util.NoSuchElementException: no app with key application_1457544696648_0002
>         at org.spark-project.guava.cache.LocalCache$Segment.get(LocalCache.java:2263)
>         at org.spark-project.guava.cache.LocalCache.get(LocalCache.java:4000)
>         at org.spark-project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
>         at org.spark-project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
>
>
> Thanks,
> B
>
>
>
>
