Re: Spark metrics for ganglia
Hi,

How do I verify whether the GangliaSink directory got created?

Thanks,
Swetha

On Mon, Dec 15, 2014 at 11:29 AM, danilopds wrote:

> Thanks tsingfu,
>
> I used this configuration based on your post (with Ganglia unicast mode):
>
> # Enable GangliaSink for all instances
> *.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
> *.sink.ganglia.host=10.0.0.7
> *.sink.ganglia.port=8649
> *.sink.ganglia.period=15
> *.sink.ganglia.unit=seconds
> *.sink.ganglia.ttl=1
> *.sink.ganglia.mode=unicast
>
> Then I get the following error:
>
> ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.GangliaSink cannot be instantialized
> java.lang.ClassNotFoundException: org.apache.spark.metrics.sink.GangliaSink
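[Editor's note] One way to check whether the Ganglia sink can actually be loaded is to look inside the application or assembly jar for the class file. A minimal Python sketch of that check; the real jar path is an assumption (point it at your own jar), so a throwaway jar is built here to keep the example self-contained:

```python
import zipfile

def jar_contains(jar_path, class_entry):
    """Return True if the jar (a zip archive) contains the given .class entry."""
    with zipfile.ZipFile(jar_path) as jar:
        return class_entry in jar.namelist()

# Build a throwaway jar with the expected entry so the sketch is runnable;
# in practice, pass the path of your own assembly jar instead.
with zipfile.ZipFile("demo.jar", "w") as jar:
    jar.writestr("org/apache/spark/metrics/sink/GangliaSink.class", b"")

print(jar_contains("demo.jar", "org/apache/spark/metrics/sink/GangliaSink.class"))
# prints: True
```

If the class is absent from every jar on the driver and executor classpaths, the ClassNotFoundException above is expected.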
Re: Spark metrics for ganglia
Did you get past this issue? I'm trying to get this to work as well.

You have to compile the spark-ganglia-lgpl artifact into your application:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-ganglia-lgpl_2.10</artifactId>
      <version>${project.version}</version>
    </dependency>

So I added the above snippet to the examples project, and it finds the class now when I try to run the Pi example, but I get this problem instead:

14/12/24 11:47:23 ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.GangliaSink cannot be instantialized
<...SNIP...>
Caused by: java.lang.NumberFormatException: For input string: "1 "

On 12/15/14, 11:29 AM, "danilopds" wrote:

> Thanks tsingfu,
>
> I used this configuration based on your post (with Ganglia unicast mode):
>
> # Enable GangliaSink for all instances
> *.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
> *.sink.ganglia.host=10.0.0.7
> *.sink.ganglia.port=8649
> *.sink.ganglia.period=15
> *.sink.ganglia.unit=seconds
> *.sink.ganglia.ttl=1
> *.sink.ganglia.mode=unicast
>
> Then I get the following error:
>
> ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.GangliaSink cannot be instantialized
> java.lang.ClassNotFoundException: org.apache.spark.metrics.sink.GangliaSink

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
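[Editor's note] The NumberFormatException for the input string "1 " suggests a trailing space after `ttl=1` in metrics.properties: the raw property value is handed to a strict integer parser, which rejects surrounding whitespace. A minimal Python sketch of the cleanup; the sample lines mirror the config quoted in this thread, and in practice you would rewrite your conf/metrics.properties file the same way:

```python
# Trailing blanks on property lines break strict integer parsing
# (e.g. "1 " for *.sink.ganglia.ttl). Strip them from every line.
sample = "*.sink.ganglia.ttl=1 \n*.sink.ganglia.mode=unicast\n"
cleaned = "\n".join(line.rstrip() for line in sample.splitlines()) + "\n"
print(repr(cleaned))
# prints: '*.sink.ganglia.ttl=1\n*.sink.ganglia.mode=unicast\n'
```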
Re: Spark metrics for ganglia
Thanks tsingfu,

I used this configuration based on your post (with Ganglia unicast mode):

# Enable GangliaSink for all instances
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=10.0.0.7
*.sink.ganglia.port=8649
*.sink.ganglia.period=15
*.sink.ganglia.unit=seconds
*.sink.ganglia.ttl=1
*.sink.ganglia.mode=unicast

Then I get the following error:

ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.GangliaSink cannot be instantialized
java.lang.ClassNotFoundException: org.apache.spark.metrics.sink.GangliaSink

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-metrics-for-ganglia-tp14335p20690.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
Re: Spark metrics for ganglia
Hello Samudrala,

Did you solve this issue about viewing metrics in Ganglia? I have the same problem.

Thanks.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-metrics-for-ganglia-tp14335p20385.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org