RE: Spark Metrics : Why is the Sink class declared private[spark] ?
In the meantime you can simply define your custom metric sink in the org.apache.spark package.

From: Walid Lezzar <walez...@gmail.com>
Sent: Saturday, April 2, 2016 4:23 AM
To: Saisai Shao <sai.sai.s...@gmail.com>
Cc: spark users <user@spark.apache.org>
Subject: Re: Spark Metrics : Why is the Sink class declared private[spark] ?
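The workaround above can be sketched as follows. This is a minimal, untested sketch: the class name MyConsoleSink and its console-reporting behavior are hypothetical, and it assumes the Spark 1.x/2.x reflection contract where MetricsSystem instantiates each sink through a constructor taking (Properties, MetricRegistry, SecurityManager).

```scala
// Hypothetical custom sink, placed under org.apache.spark so it can see
// the private[spark] Sink trait. Sketch only; assumes Spark 1.x/2.x,
// where MetricsSystem reflectively invokes a constructor with the
// signature (Properties, MetricRegistry, SecurityManager).
package org.apache.spark.metrics.sink

import java.util.Properties
import java.util.concurrent.TimeUnit

import com.codahale.metrics.{ConsoleReporter, MetricRegistry}
import org.apache.spark.SecurityManager

class MyConsoleSink(
    val properties: Properties,
    val registry: MetricRegistry,
    securityMgr: SecurityManager) extends Sink {

  // Poll period comes from metrics.properties ("period" key), default 10s.
  private val pollPeriod =
    Option(properties.getProperty("period")).map(_.toInt).getOrElse(10)

  private val reporter = ConsoleReporter.forRegistry(registry)
    .convertRatesTo(TimeUnit.SECONDS)
    .convertDurationsTo(TimeUnit.MILLISECONDS)
    .build()

  override def start(): Unit = reporter.start(pollPeriod, TimeUnit.SECONDS)
  override def stop(): Unit = reporter.stop()
  override def report(): Unit = reporter.report()
}
```

Because the class sits inside the org.apache.spark package hierarchy, the private[spark] access modifier on Sink no longer blocks it; the trade-off is that you are relying on an internal API that may change between Spark versions.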
Re: Spark Metrics : Why is the Sink class declared private[spark] ?
This is great! Hope this JIRA will be resolved in the next version of Spark. Thanks.

> On 2 Apr 2016, at 01:07, Saisai Shao wrote:
>
> There's a JIRA (https://issues.apache.org/jira/browse/SPARK-14151) about it,
> please take a look.
>
> Thanks
> Saisai
Re: Spark Metrics : Why is the Sink class declared private[spark] ?
There's a JIRA (https://issues.apache.org/jira/browse/SPARK-14151) about it, please take a look.

Thanks
Saisai

> On Sat, Apr 2, 2016 at 6:48 AM, Walid Lezzar wrote:
Spark Metrics : Why is the Sink class declared private[spark] ?
Hi,

I looked into the Spark code to see how Spark reports metrics using the MetricsSystem class. I've seen that the MetricsSystem class, when instantiated, parses the metrics.properties file, finds the sink class names, and loads them dynamically. It would be great to implement my own sink by inheriting from the org.apache.spark.metrics.sink.Sink class, but unfortunately this class has been declared private[spark]! So it is not possible to inherit from it! Why is that? Is this going to change in future Spark versions?

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
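For context, the dynamic loading described above is driven by entries in conf/metrics.properties: the sink class name is read from a `<instance>.sink.<name>.class` key and instantiated reflectively. A hedged example (the "mysink" name and MyConsoleSink class are hypothetical placeholders; the ConsoleSink line shows a sink that actually ships with Spark):

```properties
# conf/metrics.properties -- illustrative example
# <instance>.sink.<sink name>.class selects the class to load dynamically.
# "mysink" and MyConsoleSink below are placeholders for a custom sink.
driver.sink.mysink.class=org.apache.spark.metrics.sink.MyConsoleSink
driver.sink.mysink.period=10

# Built-in sink that ships with Spark, enabled for all instances:
*.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
```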