I installed Ganglia, and I think it works well for Hadoop and HBase, since I can
see the Hadoop/HBase metrics on ganglia-web. Now I want to use Ganglia to
monitor Spark, and I followed these steps:

1) First I did a custom build with -Pspark-ganglia-lgpl, and it succeeded
without warnings:

./make-distribution.sh --tgz --skip-java-test -Phadoop-2.3 -Pyarn -Phive -Pspark-ganglia-lgpl
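(For reference, a quick way to confirm the GangliaSink class actually made it
into the assembly jar; the path below is just where my build put it, adjust as
needed:)

jar tf dist/lib/spark-assembly-*.jar | grep GangliaSink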
2) I configured conf/metrics.properties (8653 is the port I set for gmond) and
restarted the Spark Master and Workers:

vi conf/metrics.properties

# Enable GangliaSink for all instances
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.name=hadoop_cluster1
*.sink.ganglia.host=localhost
*.sink.ganglia.port=8653
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
*.sink.ganglia.ttl=1
*.sink.ganglia.mode=multicast

sbin/stop-all.sh
sbin/start-all.sh
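(One check I can do on this host is whether gmond is actually listening on that
UDP port; this assumes its udp_recv_channel is configured for 8653:)

netstat -anu | grep 8653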
3) I refreshed my ganglia-web, but I cannot see any Spark metrics.

4) I made a test to verify whether the ConsoleSink and CSVSink work OK, and the
result is OK: I found metrics in the logs and in *.sink.csv.directory.
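(The test config was roughly the following; the CSV directory is just an
example path:)

*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
*.sink.console.period=10
*.sink.console.unit=seconds
*.sink.csv.class=org.apache.spark.metrics.sink.CSVSink
*.sink.csv.period=10
*.sink.csv.unit=seconds
*.sink.csv.directory=/tmp/spark-metrics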
I searched for topics about "ganglia" and "metrics" on
http://apache-spark-user-list.1001560.n3.nabble.com, the Spark JIRA and Google,
but found nothing useful. Could anyone give me some help or a proposal?


