Hi everyone,
I have finished the coding and created the PR.
The implementation is straightforward and similar to the API in spark-core,
but we still need someone with a streaming background to verify the patch,
just to make sure everything is OK.
Can anyone help?
https://github.com/apache/spark/p
Thank you
This should take me at least a few days; I will let you know as soon
as the PR is ready.
On 11/8/16 11:44 AM, Tathagata Das wrote:
This may be a good addition. I suggest you read our guidelines on
contributing code to Spark.
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-PreparingtoContributeCodeChanges
It's a long document, but it should have everything you need to figure out how
to contribute.
Hi everyone,
It seems that not many people are interested in creating an API for
Streaming.
Nevertheless, I still really want the API for monitoring,
so I tried to see if I could implement it on my own.
After some testing,
I believe I can achieve the goal by:
1. implementing a package (org.apache.spark.st
Hi everyone,
I tried to monitor our streaming application using the Spark REST interface,
only to find that there is no such thing for Streaming.
I wonder if anyone is already working on this, or whether I should just
start implementing my own.
--
BR
Peter Chan
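[Editor's note: as context for the question above, the core REST interface lives under /api/v1 on the driver UI, conventionally at port 4040. A minimal sketch of the kind of check it enables for batch applications, and which has no Streaming counterpart; the payload below is illustrative and made up, shaped like a response from http://<driver>:4040/api/v1/applications:]

```python
import json

# Illustrative payload shaped like the core REST API's /api/v1/applications
# response (field names follow the Spark REST API; the values are made up).
sample = json.loads("""
[
  {
    "id": "app-20161108114400-0001",
    "name": "StreamingJob",
    "attempts": [
      {"startTime": "2016-11-08T11:44:00.000GMT", "completed": false}
    ]
  }
]
""")

# List applications that still have a running attempt -- the kind of check
# one would script when monitoring; Streaming exposes no equivalent endpoint.
running = [app["id"] for app in sample
           if any(not a["completed"] for a in app["attempts"])]
print(running)  # -> ['app-20161108114400-0001']
```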
---
Cool, great job☺.
Thanks
Jerry
From: Ryan Williams [mailto:ryan.blake.willi...@gmail.com]
Sent: Thursday, February 26, 2015 6:11 PM
To: user; dev@spark.apache.org
Subject: Monitoring Spark with Graphite and Grafana
If anyone is curious to try exporting Spark metrics to Graphite, I just
published a post about my experience doing that, building dashboards in
Grafana <http://grafana.org/>, and using them to monitor Spark jobs:
http://www.hammerlab.org/2015/02/27/monitoring-spark-with-graphite-and-grafana/
I also wanted to ask a few questions about the present/future plans
for monitoring Spark jobs.
In rough order of increasing scope:
- Do most people monitor their Spark jobs in realtime by repeatedly
refreshing the web UI (cf. SPARK-5106
<https://issues.apache.org/jira/browse/SPARK-5106>), or is ther
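[Editor's note: for anyone wanting to try the Graphite export described in the post above, Spark's Graphite sink is enabled through conf/metrics.properties. A minimal sketch; the host is a placeholder and the period/unit values are illustrative:]

```properties
# conf/metrics.properties -- place in $SPARK_HOME/conf or ship with --files.
# Routes metrics from all instances to a Graphite carbon endpoint.
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite.example.com
*.sink.graphite.port=2003
*.sink.graphite.period=10
*.sink.graphite.unit=seconds
```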
Hello,
I'm running Spark on a cluster, and I want to monitor how many nodes/cores
are active at different (specific) points of the program.
Is there any way to do this?
Thanks,
Isca
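[Editor's note: one way to answer the question above without instrumenting the job itself is to poll the driver's REST API at the points of interest. A minimal sketch, assuming the /api/v1/applications/<app-id>/executors endpoint; the payload below is illustrative and made up, though the field names follow the Spark REST API:]

```python
import json

# Illustrative payload shaped like /api/v1/applications/<app-id>/executors
# (field names follow the Spark REST API; hosts and values are made up).
executors = json.loads("""
[
  {"id": "driver", "hostPort": "10.0.0.1:40000", "isActive": true,  "totalCores": 0},
  {"id": "1",      "hostPort": "10.0.0.2:40001", "isActive": true,  "totalCores": 4},
  {"id": "2",      "hostPort": "10.0.0.3:40002", "isActive": false, "totalCores": 4}
]
""")

# Count active (non-driver) executors and their cores; calling this helper
# at specific points in the program gives a snapshot of cluster activity.
active = [e for e in executors if e["isActive"] and e["id"] != "driver"]
print(len(active), sum(e["totalCores"] for e in active))  # -> 1 4
```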