RE: JMX with Spark

2015-11-05 Thread Liu shen
Hi,
This article may help: it shows how to expose your counter's state through an Akka actor.
https://tersesystems.com/2014/08/19/exposing-akka-actor-state-with-jmx/




From: Yogesh Vyas
Sent: 5 November 2015, 21:21
To: Romi Kuntsman
Cc: user@spark.apache.org
Subject: Re: JMX with Spark



-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org




Re: JMX with Spark

2015-11-05 Thread Yogesh Vyas
Hi,
Please let me elaborate on my question so that it is clear what
exactly I want.

I am running a Spark Streaming job that counts occurrences of events.
Right now I use a key/value pair RDD where the key is the event and
the value is its count. What I want is to build a web-based monitoring
system that connects to the MBean server and displays the count as it
changes.
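[Editor's note: as a sketch of the MBean approach described above (not from the thread; the class names and ObjectName are illustrative, not a Spark API), a counter can be registered with the JVM's platform MBean server so that JConsole, or a web dashboard using a JMX connector, can poll its live value:]

```java
import java.lang.management.ManagementFactory;
import java.util.concurrent.atomic.AtomicLong;
import javax.management.MBeanServer;
import javax.management.ObjectName;
import javax.management.StandardMBean;

public class CounterJmxDemo {

    // Management interface: JMX clients see one read-only attribute, "Count".
    public interface EventCounterMBean {
        long getCount();
    }

    // Thread-safe counter; in a streaming job you would call add() from the
    // code that processes each batch of events.
    public static class EventCounter implements EventCounterMBean {
        private final AtomicLong count = new AtomicLong();
        public void add(long n) { count.addAndGet(n); }
        @Override public long getCount() { return count.get(); }
    }

    public static void main(String[] args) throws Exception {
        EventCounter counter = new EventCounter();
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        // The domain and key in the ObjectName are illustrative; pick your own.
        ObjectName name = new ObjectName("com.example.streaming:type=EventCounter");
        // StandardMBean names the management interface explicitly.
        server.registerMBean(new StandardMBean(counter, EventCounterMBean.class), name);

        counter.add(3); // in the real job, update this as events are counted

        // Any JMX client (JConsole, a web monitor) reads the live value
        // as the "Count" attribute:
        long seen = (Long) server.getAttribute(name, "Count");
        System.out.println("Count = " + seen); // prints "Count = 3"
    }
}
```

Whatever updates the counter (e.g. the body of a `foreachRDD`) just calls `add()`; the MBean server hands out the current value on every read, so the dashboard sees it change without any extra plumbing.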





Re: JMX with Spark

2015-11-05 Thread Romi Kuntsman
Have you read this?
https://spark.apache.org/docs/latest/monitoring.html

Romi Kuntsman, Big Data Engineer
http://www.totango.com



JMX with Spark

2015-11-05 Thread Yogesh Vyas
Hi,
How can we use JMX and JConsole to monitor our Spark applications?




Re: JMX with Spark

2014-04-25 Thread Paul Schooss
# Enable CsvSink for all instances
#*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink

# Polling period for CsvSink
#*.sink.csv.period=1

#*.sink.csv.unit=minutes

# Polling directory for CsvSink
#*.sink.csv.directory=/tmp/

# Worker instance overlap polling period
#worker.sink.csv.period=10

#worker.sink.csv.unit=minutes

# Enable jvm source for instance master, worker, driver and executor
master.source.jvm.class=org.apache.spark.metrics.source.JvmSource

worker.source.jvm.class=org.apache.spark.metrics.source.JvmSource

driver.source.jvm.class=org.apache.spark.metrics.source.JvmSource

executor.source.jvm.class=org.apache.spark.metrics.source.JvmSource
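[Editor's note: for JMX specifically (rather than the CSV sink above), the same file can enable Spark's JmxSink; a minimal sketch:]

```properties
# Enable JmxSink for all instances; metrics are registered with the
# JVM's in-process platform MBean server (no host/port in this file).
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```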





RE: JMX with Spark

2014-04-25 Thread Ravi Hemnani
Can you share your working metrics.properties?

I want remote JMX enabled, so I need to use the JmxSink to monitor my
Spark master and workers.

But what parameters need to be defined, such as host and port?

Your config would help.
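[Editor's note: JmxSink itself takes no host or port; it registers metrics with the in-process platform MBean server. Remote access is enabled with the standard JVM remote-JMX flags instead. A sketch for the driver follows; the port number is an example, and authentication/SSL are disabled here only for testing on a trusted network:]

```properties
# spark-defaults.conf (sketch)
spark.driver.extraJavaOptions -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.port=8090 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false
```

For the standalone master and worker daemons, the same flags can go in `SPARK_DAEMON_JAVA_OPTS` in spark-env.sh; each JVM needs its own port. Then point JConsole at host:8090.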



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/JMX-with-Spark-tp4309p4823.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.


RE: JMX with Spark

2014-04-15 Thread Shao, Saisai
Hi Paul, if you are still having problems, could you please paste your
metrics.conf so that we can look for the cause.

Thanks
Jerry




Re: JMX with Spark

2014-04-15 Thread Parviz Deyhim
The home directory, or the conf directory under it? It works for me
with metrics.properties hosted under the conf dir.




JMX with Spark

2014-04-15 Thread Paul Schooss
Has anyone gotten this working? I have enabled the properties for it in
the metrics.conf file and ensured that it is placed under Spark's home
directory. Any ideas why I don't see Spark beans?