Re: IDE suitable for Spark : Monitoring & Debugging Spark Jobs

2020-04-07 Thread Som Lima
Spark: The Definitive Guide, Chapter 18: Monitoring and Debugging — "This chapter covers the key details you need to monitor and debug your Spark Applications. To do this, we will walk through the Spark UI with an example query designed to help you understand how to trace your own jobs through the execut

Retrieve batch metadata via the spark monitoring api

2018-02-13 Thread Hendrik Dev
I use Spark 2.2.1 with streaming, and when I open the Spark Streaming UI I can see input metadata for each of my batches. In my case I stream from Kafka, and in the metadata section I find useful information about my topic, partitions and offsets. Assume the url for this batch looks like http://loc
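The data shown in the Streaming tab is also exposed over Spark's monitoring REST API in 2.x, under `/api/v1/applications/[app-id]/streaming/batches` on the driver UI port. A minimal sketch, assuming the driver UI is reachable at `localhost:4040` (the default); note that the per-source Kafka offset details shown in the UI's metadata section may not all appear in the batch summaries:

```python
import json
from urllib.request import urlopen

# Assumption: driver UI on the default port; adjust for your deployment.
BASE = "http://localhost:4040/api/v1"

def batch_summaries(batches):
    """Reduce the /streaming/batches payload to (batchId, status, inputSize) tuples."""
    return [(b["batchId"], b["status"], b.get("inputSize", 0)) for b in batches]

def fetch_batches(app_id):
    """Fetch and summarize the batch list for one running streaming application."""
    url = f"{BASE}/applications/{app_id}/streaming/batches"
    with urlopen(url) as resp:
        return batch_summaries(json.load(resp))
```

The application id for the URL can itself be discovered from `GET /api/v1/applications`.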

Re: Spark Monitoring using Jolokia

2018-01-09 Thread Gourav Sengupta
Hi, I am totally confused here, maybe because I do not exactly understand this, but why is this required? I have always used the Spark UI and found it more than sufficient. And if you know a bit about how a Spark session works, then your performance does have a certain degree of predictability as well

Re: Spark Monitoring using Jolokia

2018-01-08 Thread Thakrar, Jayesh
And here's some more info on Spark Metrics: https://www.slideshare.net/JayeshThakrar/apache-bigdata2017sparkprofiling

Re: Spark Monitoring using Jolokia

2018-01-08 Thread Maximiliano Felice
Hi! I don't know very much about them, but I'm currently working on posting custom metrics into Graphite. I found the internals described in this library useful: https://github.com/groupon/spark-metrics Hope this at least gives you a hint. Best of lu
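For the built-in metrics (as opposed to custom ones), Spark ships a GraphiteSink that can be enabled through `metrics.properties`. A minimal sketch — the host, port and prefix are placeholders for your own Graphite setup:

```properties
# conf/metrics.properties (sketch; host/port/prefix are placeholders)
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
*.sink.graphite.prefix=spark
```

Point Spark at the file with `--conf spark.metrics.conf=/path/to/metrics.properties`, or place it in `$SPARK_HOME/conf/`.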

Spark Monitoring using Jolokia

2018-01-08 Thread Irtiza Ali
Hello everyone, I am building a monitoring tool for Spark, and for that I need Spark's metrics. I am using Jolokia to get the metrics. My questions are: Can I get all the metrics provided by the Spark REST API using Jolokia? How does the Spark REST API get the metrics internally? Thanks
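Since Jolokia is a JMX-over-HTTP bridge, one way to feed it is to have Spark register its metrics as JMX MBeans via the built-in JMXSink, then attach the Jolokia JVM agent to the driver and executors. A sketch, assuming the agent jar path and port below (both hypothetical placeholders):

```properties
# conf/metrics.properties (sketch) — expose all metrics as JMX MBeans
*.sink.jmx.class=org.apache.spark.metrics.sink.JMXSink
```

Then pass something like `--conf "spark.driver.extraJavaOptions=-javaagent:/path/to/jolokia-jvm-agent.jar=port=7777"` at submit time and query `http://<driver-host>:7777/jolokia/list`. Note the REST API and the UI read from the same internal MetricsSystem that the sinks do, so the sets of metrics largely overlap but are not guaranteed identical.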

Spark Monitoring to get Spark GCs and records processed

2015-11-18 Thread rakesh rakshit
Hi all, I want to monitor Spark to get the following: 1. All the GC stats for Spark JVMs 2. Records successfully processed in a batch 3. Records failed in a batch 4. Historical data for batches, jobs, stages, tasks, etc. Please let me know how I can get this information in Spark. Regards,
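A partial sketch covering points 1 and 4: GC stats can be surfaced by adding GC flags to the driver and executor JVMs (they then appear in the stdout logs, and GC time per task shows in the UI), and historical job/stage/task data requires event logging plus the history server. The flags and the HDFS path below are placeholders and are JVM-version dependent:

```properties
# spark-defaults.conf (sketch; GC flags shown are for pre-unified-logging JVMs)
spark.executor.extraJavaOptions  -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
spark.driver.extraJavaOptions    -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs:///spark-events
```

Per-batch record counts (points 2 and 3) are visible in the Streaming tab, or programmatically via a `StreamingListener` registered on the `StreamingContext`.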

Re: Spark monitoring

2015-09-17 Thread Pratham Khanna
> Thanks, Best Regards > On Fri, Sep 11, 2015 at 11:46 PM, prk77 wrote: >> Is there a way to fetch the current spark cluster memory & cpu usage programmatically? I know that the default spark master web ui has these details but I want to re
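For a standalone cluster, one programmatic route is the master web UI's JSON view at `/json`, which lists the workers with their core and memory usage. A minimal sketch, assuming a standalone master on the default port 8080 and the worker field names used by that endpoint (`cores`, `coresused`, `memory`, `memoryused`):

```python
import json
from urllib.request import urlopen

# Placeholder host; the standalone master web UI serves JSON at /json.
MASTER_URL = "http://spark-master:8080/json"

def summarize(status):
    """Aggregate total vs. used cores and memory (MB) across all workers."""
    workers = status.get("workers", [])
    return {
        "cores_total": sum(w["cores"] for w in workers),
        "cores_used": sum(w["coresused"] for w in workers),
        "memory_total_mb": sum(w["memory"] for w in workers),
        "memory_used_mb": sum(w["memoryused"] for w in workers),
    }

def fetch_summary():
    """Fetch the master's JSON status and reduce it to cluster-wide usage."""
    with urlopen(MASTER_URL) as resp:
        return summarize(json.load(resp))
```

On YARN this does not apply; there the ResourceManager's own REST API is the equivalent source.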

Spark monitoring

2015-09-11 Thread prk77
list.1001560.n3.nabble.com/Spark-monitoring-tp24660.html Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-04 Thread Srini Karri
Hi Marcelo, I found the problem from this link: http://mail-archives.apache.org/mod_mbox/spark-user/201409.mbox/%3cCAL+LEBfzzjugOoB2iFFdz_=9TQsH=DaiKY=cvydfydg3ac5...@mail.gmail.com%3e The problem is that the application I am running is not generating an "APPLICATION_COMPLETE" file. If I add this file ma
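A sketch of the workaround, assuming the Spark 1.x event-log layout in which each application gets its own directory under `spark.eventLog.dir` and the history server only lists applications whose directory contains an `APPLICATION_COMPLETE` marker. Both paths below are hypothetical placeholders:

```python
from pathlib import Path

# Placeholder for spark.eventLog.dir and a hypothetical application directory.
event_log_dir = Path("/tmp/spark-events")
app_dir = event_log_dir / "app-example"

# Create the marker the history server checks for before listing the app.
app_dir.mkdir(parents=True, exist_ok=True)
(app_dir / "APPLICATION_COMPLETE").touch()
```

The marker is normally written when `SparkContext.stop()` is called, so the cleaner fix is to make sure the application shuts down its context instead of exiting abruptly.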

Re: Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-04 Thread Srini Karri
Yes, I do see files. Actually, I missed copying the other settings: spark.master spark://skarri-lt05.redmond.corp.microsoft.com:7077 spark.eventLog.enabled true spark.rdd.compress true spark.storage.memoryFraction 1 spark.core.connection.ack.wait.timeout 6000 spark.ak

Re: Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-04 Thread Marcelo Vanzin
On Wed, Mar 4, 2015 at 10:08 AM, Srini Karri wrote: > spark.executor.extraClassPath > D:\\Apache\\spark-1.2.1-bin-hadoop2\\spark-1.2.1-bin-hadoop2.4\\bin\\classes > spark.eventLog.dir > D:/Apache/spark-1.2.1-bin-hadoop2/spark-1.2.1-bin-hadoop2.4/bin/tmp/spark-events > spark.history.fs.logDirectory

Re: Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-04 Thread Srini Karri
Hi Todd and Marcelo, Thanks for helping me. I was able to launch the history server on Windows without any issues. One problem I am running into right now: I always get the message "no completed applications found" in the history server UI. But I was able to browse through these applications from Spa

Re: Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-03 Thread Marcelo Vanzin
Spark applications shown in the RM's UI should have an "Application Master" link when they're running. That takes you to the Spark UI for that application where you can see all the information you're looking for. If you're running a history server and add "spark.yarn.historyServer.address" to your

Re: Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-03 Thread Todd Nist
Hi Srini, If you start the $SPARK_HOME/sbin/start-history-server, you should be able to see the basic spark ui. You will not see the master, but you will be able to see the rest as I recall. You also need to add an entry into the spark-defaults.conf, something like this: *## Make sure the host
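Pulling the pieces from this thread together, a minimal sketch of the relevant `spark-defaults.conf` entries — the hostname and HDFS path are placeholders for your environment:

```properties
# spark-defaults.conf (sketch; host and paths are placeholders)
spark.eventLog.enabled            true
spark.eventLog.dir                hdfs:///spark-events
spark.history.fs.logDirectory     hdfs:///spark-events
spark.yarn.historyServer.address  historyhost:18080
```

Then run `$SPARK_HOME/sbin/start-history-server.sh` and browse to port 18080; with `spark.yarn.historyServer.address` set, the YARN RM's "History" link for finished applications also lands there.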

Spark Monitoring UI for Hadoop Yarn Cluster

2015-03-03 Thread Srini Karri
Hi All, I am having trouble finding data related to my requirement. Here is the context, I have tried Standalone Spark Installation on Windows, I am able to submit the logs, able to see the history of events. My question is, is it possible to achieve the same monitoring UI experience with Yarn Clu

Re: Spark Monitoring with Ganglia

2014-10-08 Thread Otis Gospodnetic
Hi, If using Ganglia is not an absolute requirement, check out SPM <http://sematext.com/spm/> for Spark -- http://blog.sematext.com/2014/10/07/apache-spark-monitoring/ It monitors all Spark metrics (i.e. you don't need to figure out what you need to monitor, how to get it, how to gr

Re: Spark Monitoring with Ganglia

2014-10-05 Thread manasdebashiskar
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Monitoring-with-Ganglia-tp15538p15705.html

Re: Spark Monitoring with Ganglia

2014-10-03 Thread TANG Gen
Maybe you can follow the instructions in this link: https://github.com/mesos/spark-ec2/tree/v3/ganglia . For me it works well.

Spark Monitoring with Ganglia

2014-10-01 Thread danilopds
Hi, I need to monitor some aspects of my cluster, like network and resources. Ganglia looks like a good option for what I need. Then I found out that Spark has support for Ganglia. On the Spark monitoring webpage there is this information: "To install the GangliaSink you’ll need to perf
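Because of its LGPL license, the GangliaSink is not included in default Spark builds; Spark has to be compiled with the `-Pspark-ganglia-lgpl` profile. Once that is in place, a sketch of the sink configuration — the multicast group and port below are common Ganglia defaults but must match your own `gmond` setup:

```properties
# conf/metrics.properties (sketch; host/port must match your gmond configuration)
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=239.2.11.71
*.sink.ganglia.port=8649
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
*.sink.ganglia.mode=multicast
```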