Re: Spark metrics when running with YARN?

2016-09-18 Thread Vladimir Tretyakov
spark.examples.SparkPi --master spark://wawanawna:7077 --executor-memory 2G --total-executor-cores 30 examples/jars/spark-examples_2.11-2.0.0.jar 1

Re: Spark metrics when running with YARN?

2016-09-17 Thread Saisai Shao
Request to API endpoint http://localhost:4040/api/v1/applications returned me the following JSON:
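For reference, that endpoint is plain HTTP and returns JSON, so it can be polled programmatically. A minimal Scala sketch, assuming the driver UI is reachable on localhost:4040 as in the snippet above:

    import scala.io.Source

    // List the applications known to this driver's UI/REST endpoint.
    // localhost:4040 is the default driver UI address; adjust host/port as needed.
    object ListApplications {
      def main(args: Array[String]): Unit = {
        val json = Source.fromURL("http://localhost:4040/api/v1/applications").mkString
        println(json)
      }
    }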

Re: Spark metrics when running with YARN?

2016-09-16 Thread Vladimir Tretyakov
"Spark Pi", "attempts" : [ { "startTime" : "2016-09-09T15:45:25.047GMT", "endTime" : "1969-12-31T23:59:59.999GMT", "la

Re: Spark metrics when running with YARN?

2016-09-12 Thread Vladimir Tretyakov
sparkUser" : "", "completed" : false, "startTimeEpoch" : 1473435925047, "endTimeEpoch" : -1, "lastUpdatedEpoch" : 1473435925047

Re: Spark metrics when running with YARN?

2016-09-12 Thread Saisai Shao
: 1473435925047, "endTimeEpoch" : -1, "lastUpdatedEpoch" : 1473435925047 } ] } ] so the response contains information about only 1 application.

Re: Spark metrics when running with YARN?

2016-09-11 Thread Jacek Laskowski
Epoch" : -1, >>> "lastUpdatedEpoch" : 1473435925047 >>> } ] >>> } ] >>> >>> so response contains information only about 1 application. >>> >>> But in reality I've started 2 applications and Spark UI shows me 2 >&

Re: Spark metrics when running with YARN?

2016-09-11 Thread Vladimir Tretyakov
2 applications and the Spark UI shows me 2 RUNNING applications (please see the screenshot). Does anybody know why the API and the UI show different things? Best regards, Vladimir. On Tue, Aug 30, 2016 at 3:52 PM, Vi
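One general Spark behavior that is relevant here (a background fact, not something quoted from the message above): each driver serves its own web UI and /api/v1 endpoint, and when port 4040 is already taken the next SparkContext on the same host binds to 4041, 4042, and so on, so a single port only ever reports the one application whose driver owns it. A small Scala sketch that polls a few candidate driver ports (the port range is just an illustration):

    import scala.io.Source
    import scala.util.Try

    // Probe the default driver UI ports; each one that answers belongs to a
    // different running application.
    object PollDriverPorts {
      def main(args: Array[String]): Unit = {
        (4040 to 4042).foreach { port =>
          val url = s"http://localhost:$port/api/v1/applications"
          Try(Source.fromURL(url).mkString) match {
            case scala.util.Success(json) => println(s"$port -> $json")
            case scala.util.Failure(_)    => println(s"$port -> nothing listening")
          }
        }
      }
    }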

Re: Spark metrics when running with YARN?

2016-09-09 Thread Jacek Laskowski
On Tue, Aug 30, 2016 at 3:52 PM, Vijay Kiran <m...@vijaykiran.com> wrote: Hi Otis, Did you check the REST API as documented in http://spark.apache.org/docs/latest/monitoring.html Regards, Vijay

Re: Spark metrics when running with YARN?

2016-08-30 Thread Vijay Kiran
ant bit - I'm looking for a programmatic way to get Spark metrics when running Spark under YARN - so JMX or API of some kind. Thanks, Otis -- Monitoring - Log Management - Alerting - Anomaly Detection Solr & Elasticsearch Consulting Support Training -

Re: Spark metrics when running with YARN?

2016-08-30 Thread Otis Gospodnetić
Hi Mich and Vijay, Thanks! I forgot to include an important bit - I'm looking for a *programmatic* way to get Spark metrics when running Spark under YARN - so JMX or API of some kind. Thanks, Otis -- Monitoring - Log Management - Alerting - Anomaly Detection Solr & Elasticsearch Consul
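For the JMX side of that question, Spark ships a JmxSink that can be enabled through the metrics configuration file (by default $SPARK_HOME/conf/metrics.properties, or whatever file spark.metrics.conf points at). A minimal sketch of such a file:

    # conf/metrics.properties -- enable the built-in JMX sink for all instances
    # (driver, executors, master, worker); their metrics then become visible to
    # any JMX client attached to those JVMs.
    *.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink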

Re: Spark metrics when running with YARN?

2016-08-30 Thread Mich Talebzadeh
The Spark UI, regardless of deployment mode (Standalone, YARN, etc.), runs on port 4040 by default and can be accessed directly. Otherwise one can specify a specific port with --conf "spark.ui.port=5" for example 5. HTH Dr Mich Talebzadeh LinkedIn *
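The same setting can also be fixed in the application's own startup code when the SparkConf is built; a minimal Scala sketch (the port 4050 below is only an illustrative value):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession

    // Pin the driver UI (and its /api/v1 REST endpoint) to a known port
    // instead of the default 4040.
    val conf  = new SparkConf().set("spark.ui.port", "4050") // arbitrary example port
    val spark = SparkSession.builder().config(conf).appName("ui-port-demo").getOrCreate()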

Re: Spark metrics when running with YARN?

2016-08-30 Thread Vijay Kiran
From the YARN RM UI, find the Spark application Id, and in the application details you can click on the “Tracking URL”, which should give you the Spark UI. ./Vijay On 30 Aug 2016, at 07:53, Otis Gospodnetić wrote: Hi, When Spark is run on top of YARN,
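The same lookup can be done programmatically with the Hadoop YarnClient API instead of clicking through the RM UI; for a running Spark application the tracking URL points at its web UI (and therefore at its /api/v1 REST endpoint). A minimal Scala sketch:

    import org.apache.hadoop.yarn.client.api.YarnClient
    import org.apache.hadoop.yarn.conf.YarnConfiguration
    import scala.collection.JavaConverters._

    object TrackingUrls {
      def main(args: Array[String]): Unit = {
        val yarn = YarnClient.createYarnClient()
        yarn.init(new YarnConfiguration())
        yarn.start()
        try {
          // Spark submits its YARN applications with application type "SPARK".
          yarn.getApplications().asScala
            .filter(_.getApplicationType == "SPARK")
            .foreach(r => println(s"${r.getApplicationId} -> ${r.getTrackingUrl}"))
        } finally {
          yarn.stop()
        }
      }
    }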

Re: Spark metrics when running with YARN?

2016-08-30 Thread Mich Talebzadeh
Have you checked the Spark UI, on HOST:4040 by default? HTH Dr Mich Talebzadeh LinkedIn * https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw *

Spark metrics when running with YARN?

2016-08-29 Thread Otis Gospodnetić
Hi, When Spark is run on top of YARN, where/how can one get Spark metrics? Thanks, Otis -- Monitoring - Log Management - Alerting - Anomaly Detection Solr & Elasticsearch Consulting Support Training - http://sematext.com/