Re: Monitor Spark Applications

2019-09-15 Thread Alex Landa
Hi Raman,

The banzaicloud jar can also cover the JMX exports.
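
If you prefer scraping JMX directly instead, the usual bridge is Prometheus's JMX exporter attached as a Java agent. A minimal sketch, assuming the agent jar and a mapping config have already been distributed to the hosts (the paths, port 9091, and app.jar are placeholders, not anything from this thread):

```shell
# Sketch only: jar path, mapping-config path, and port are placeholders.
# The agent serves /metrics on the given port for Prometheus to scrape.
spark-submit \
  --master yarn --deploy-mode cluster \
  --conf "spark.driver.extraJavaOptions=-javaagent:/opt/jmx/jmx_prometheus_javaagent.jar=9091:/opt/jmx/spark.yaml" \
  app.jar
```

Prometheus then scrapes the driver host on 9091. Doing the same for executors needs a distinct port per JVM on each host, which is exactly the sticking point raised in this thread.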

Thanks,
Alex

On Fri, Sep 13, 2019 at 8:46 AM raman gugnani 
wrote:

> Hi Alex,
>
> Thanks will check this out.
>
> Can it be done directly, since Spark also exposes metrics over JMX? My
> main doubt here is how to assign fixed JMX ports to the driver and executors.
>
> @Alex,
> Is there any difference between fetching data via JMX and using the banzaicloud jar?
>
> --
> Raman Gugnani
>


Re: Monitor Spark Applications

2019-09-12 Thread raman gugnani
Hi Alex,

Thanks will check this out.

Can it be done directly, since Spark also exposes metrics over JMX? My main
doubt here is how to assign fixed JMX ports to the driver and executors.

@Alex,
Is there any difference between fetching data via JMX and using the banzaicloud jar?
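
(For reference: the driver's JMX port can be pinned with `spark.driver.extraJavaOptions` and the standard `-Dcom.sun.management.jmxremote.port=<port>` flag, but a single fixed port in `spark.executor.extraJavaOptions` breaks as soon as two executors land on the same host. A plain-Python illustration of that underlying constraint, nothing Spark-specific:)

```python
import socket

# Two listeners bound to port 0: the OS hands each a distinct free port.
# This is why per-executor JMX ports are usually left dynamic -- two
# executor JVMs on the same host cannot both bind one fixed port.
a = socket.socket()
b = socket.socket()
a.bind(("127.0.0.1", 0))
b.bind(("127.0.0.1", 0))
port_a = a.getsockname()[1]
port_b = b.getsockname()[1]
print(port_a != port_b and port_a > 0 and port_b > 0)  # True
a.close()
b.close()
```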


On Fri, 13 Sep 2019 at 10:47, Alex Landa  wrote:

> Hi,
> We are starting to use https://github.com/banzaicloud/spark-metrics .
> Keep in mind that their solution targets Spark on Kubernetes; to make it
> work for Spark on YARN you have to copy the spark-metrics dependencies into
> the Spark jars folder on all the Spark machines (it took me a while to
> figure that out).
>
> Thanks,
> Alex

-- 
Raman Gugnani


Re: Monitor Spark Applications

2019-09-12 Thread Alex Landa
Hi,
We are starting to use https://github.com/banzaicloud/spark-metrics .
Keep in mind that their solution targets Spark on Kubernetes; to make it work
for Spark on YARN you have to copy the spark-metrics dependencies into the
Spark jars folder on all the Spark machines (it took me a while to figure
that out).
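
For anyone following along, the sink is wired up through Spark's metrics.properties. The property names and sink class below are from my reading of the project's README and should be verified against the version you deploy; the Pushgateway address is a placeholder:

```properties
# Route all metrics to the banzaicloud Prometheus sink (pushes to a
# Prometheus Pushgateway; address below is a placeholder).
*.sink.prometheus.class=com.banzaicloud.spark.metrics.sink.PrometheusSink
*.sink.prometheus.pushgateway-address-protocol=http
*.sink.prometheus.pushgateway-address=prometheus-pushgateway:9091

# Also emit JVM metrics for each component (standard Spark source class).
driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource
executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
```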

Thanks,
Alex

On Fri, Sep 13, 2019 at 7:58 AM raman gugnani 
wrote:

> Hi Team,
>
> I am new to Spark. I am using Spark on the Hortonworks Data Platform on
> Amazon EC2 machines, running in cluster mode with YARN.
>
> I need to monitor individual JVMs and other Spark metrics with
> *Prometheus*.
>
> Can anyone suggest a solution?
>
> --
> Raman Gugnani
>