Thanks, that worked
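
For reference, a minimal sketch of the kind of script Akhil describes below
(assumptions: Python 3, and the master web UI reachable at
http://localhost:8080; adjust MASTER_UI for your cluster). The field names
match the sample JSON in his reply:

    # Fetch the Spark master's /json endpoint and print cluster CPU/memory usage.
    import json
    import urllib.request

    MASTER_UI = "http://localhost:8080"  # assumed master web UI address; change as needed

    with urllib.request.urlopen(MASTER_UI + "/json") as resp:
        status = json.load(resp)

    # Cluster-wide totals, as in the sample response below.
    print("status           :", status["status"])
    print("cores used       : %d / %d" % (status["coresused"], status["cores"]))
    print("memory used (MB) : %d / %d" % (status["memoryused"], status["memory"]))

    # Per-worker breakdown.
    for w in status["workers"]:
        print("worker %s  cores %d/%d  memory(MB) %d/%d" % (
            w["id"], w["coresused"], w["cores"], w["memoryused"], w["memory"]))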

On Mon, Sep 14, 2015 at 4:54 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> You can write a script to hit the MasterURL:8080/json endpoint to
> retrieve the information. It gives you a response like this:
>
>
> {
>   "url" : "spark://akhldz:7077",
>   "workers" : [ {
>     "id" : "worker-20150914165233-0.0.0.0-40324",
>     "host" : "0.0.0.0",
>     "port" : 40324,
>     "webuiaddress" : "http://0.0.0.0:8081";,
>     "cores" : 4,
>     "coresused" : 0,
>     "coresfree" : 4,
>     "memory" : 2893,
>     "memoryused" : 0,
>     "memoryfree" : 2893,
>     "state" : "ALIVE",
>     "lastheartbeat" : 1442229774880
>   } ],
>   "cores" : 4,
>   "coresused" : 0,
>   "memory" : 2893,
>   "memoryused" : 0,
>   "activeapps" : [ ],
>   "completedapps" : [ ],
>   "activedrivers" : [ ],
>   "status" : "ALIVE"
> }
>
>
> Thanks
> Best Regards
>
> On Fri, Sep 11, 2015 at 11:46 PM, prk77 <pratham....@gmail.com> wrote:
>
>> Is there a way to fetch the current Spark cluster memory & CPU usage
>> programmatically?
>> I know that the default Spark master web UI has these details, but I want
>> to retrieve them through a program and store them for analysis.
>>
>
