I was wrong here.

I am using a Spark standalone cluster, not YARN or Mesos. Is it possible to
track Spark execution memory there?
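
For reference, here is a minimal sketch I put together (not an official
method) of a custom SparkListener that logs per-task peak execution memory
and, assuming Spark 3.0+, per-executor peak execution memory after each
stage. The class name MemoryMonitorListener is made up; the callbacks and
the metric name OnHeapExecutionMemory are from the public SparkListener API:

import org.apache.spark.scheduler.{SparkListener, SparkListenerStageExecutorMetrics, SparkListenerTaskEnd}

// Illustrative listener: logs task- and executor-level memory usage.
class MemoryMonitorListener extends SparkListener {

  // Peak unified (execution) memory used by each finished task, in bytes.
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val metrics = taskEnd.taskMetrics
    if (metrics != null) {
      println(s"task ${taskEnd.taskInfo.taskId}: " +
        s"peakExecutionMemory=${metrics.peakExecutionMemory} bytes")
    }
  }

  // Per-executor peak metrics for a completed stage (Spark 3.0+ only).
  override def onStageExecutorMetrics(
      event: SparkListenerStageExecutorMetrics): Unit = {
    val peak = event.executorMetrics.getMetricValue("OnHeapExecutionMemory")
    println(s"executor ${event.execId}, stage ${event.stageId}: " +
      s"peak OnHeapExecutionMemory=$peak bytes")
  }
}

Register it with sparkContext.addSparkListener(new MemoryMonitorListener) or
via --conf spark.extraListeners=. On Spark 2.x only the onTaskEnd part
applies, since onStageExecutorMetrics was added in 3.0.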

On Mon, Oct 21, 2019 at 5:42 PM Sriram Ganesh <srigns...@gmail.com> wrote:

> I looked into this and found that it seems possible here:
>
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/status/AppStatusListener.scala#L229
>
> Around line 230, AppStatusListener tracks memory metrics for executors.
> (A REST API cross-check sketch follows below the quoted thread.)
>
> I just want to cross-verify: is that right?
>
>
>
> On Mon, 21 Oct 2019, 17:24 Alonso Isidoro Roman, <alons...@gmail.com>
> wrote:
>
>> Take a look at this thread:
>> <https://stackoverflow.com/questions/48768188/spark-execution-memory-monitoring>
>>
>> On Mon, 21 Oct 2019 at 13:45, Sriram Ganesh (<srigns...@gmail.com>)
>> wrote:
>>
>>> Hi,
>>>
>>> I want to monitor how much memory each executor and task uses for a given
>>> job. Is there a direct method available to track this metric?
>>>
>>> --
>>> *Sriram G*
>>> *Tech*
>>>
>>>
>>
>> --
>> Alonso Isidoro Roman
>> <https://about.me/alonso.isidoro.roman>
>>
>
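
As mentioned in the quoted message above, here is the cross-check from
outside the application: the executor memory numbers that AppStatusListener
aggregates are also exposed over the application UI's REST API, which works
in standalone mode too. The app id and port below are placeholders for your
own values; this is a sketch, not production code:

import scala.io.Source

object ExecutorMemoryPoller {
  def main(args: Array[String]): Unit = {
    // Placeholder application id and driver UI address; substitute your own.
    val appId = "app-20191021174200-0000"
    val url = s"http://localhost:4040/api/v1/applications/$appId/executors"

    // The endpoint returns one ExecutorSummary JSON object per executor,
    // including memoryUsed and maxMemory (storage memory, in bytes).
    val json = Source.fromURL(url).mkString
    println(json)
  }
}

Polling this endpoint periodically gives a time series of executor memory
without attaching a listener to the job itself.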

-- 
*Sriram G*
*Tech*
