Hi Everyone,

I was not able to get metrics into a separate file using Slf4jSink with the configuration below in metrics.properties:

    # Enable Slf4jSink for all instances by class name
    *.sink.slf4j.class=org.apache.spark.metrics.sink.Slf4jSink
    # Polling period for Slf4jSink
    *.sink.slf4j.period=1
    *.sink.slf4j.unit=minutes

The metrics only show up in the root logger's file. After making the changes below to Slf4jSink.scala, I was able to route them to a separate file via log4j.properties.

Slf4jSink.scala (only the changes):

    import org.slf4j.Logger
    import org.slf4j.LoggerFactory

    val reporter: Slf4jReporter = Slf4jReporter.forRegistry(registry)
      .outputTo(LoggerFactory.getLogger("org.apache.spark.metrics"))
      .convertDurationsTo(TimeUnit.MILLISECONDS)
      .convertRatesTo(TimeUnit.SECONDS)
      .build()

log4j.properties (only the changes):

    log4j.logger.org.apache.spark.metrics=INFO, metricFileAppender
    log4j.additivity.org.apache.spark.metrics=true
    log4j.appender.metricFileAppender=org.apache.log4j.RollingFileAppender
    log4j.appender.metricFileAppender.File=logs/metric.log
    log4j.appender.metricFileAppender.MaxFileSize=10MB
    log4j.appender.metricFileAppender.MaxBackupIndex=10
    log4j.appender.metricFileAppender.layout=org.apache.log4j.PatternLayout
    log4j.appender.metricFileAppender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

Am I using the wrong configuration, or is something missing in Slf4jSink.scala?
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/metrics/sink/Slf4jSink.scala#L51

There is one more bug: Spark prints each metric twice, in the file or on the console (if the root logger is enabled), with or without the changes above.

Thanks,
Mihir Monani
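A note on the duplicate-metrics point: with log4j.additivity.org.apache.spark.metrics=true, every event logged to org.apache.spark.metrics is handled by metricFileAppender and then also forwarded up to the root logger's appenders, so each metric line appears twice wherever the root logger writes. If that is the cause (a guess, not verified against this exact setup), disabling additivity for that logger should stop the duplication:

```
# Keep metric events in metricFileAppender only; additivity=false stops them
# from also reaching the root logger's appenders (the likely double print).
log4j.logger.org.apache.spark.metrics=INFO, metricFileAppender
log4j.additivity.org.apache.spark.metrics=false
```

With this in place the metrics should land only in logs/metric.log, while all other Spark logging continues to flow to the root logger as before.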
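The additivity behaviour itself is easy to see in isolation. Here is a small stdlib-only analogue using java.util.logging rather than log4j (class and logger names are made up for the demo): setUseParentHandlers(false) plays the same role as log4j's additivity=false, stopping a record from propagating to the parent logger's handlers.

```java
import java.util.logging.Handler;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class AdditivityDemo {
    // Minimal handler that just counts the records it receives.
    static class CountingHandler extends Handler {
        int count = 0;
        @Override public void publish(LogRecord record) { count++; }
        @Override public void flush() {}
        @Override public void close() {}
    }

    /** Returns {metricHandlerCount, rootHandlerCount} after two log calls. */
    public static int[] run() {
        Logger root = Logger.getLogger("");
        Logger metrics = Logger.getLogger("demo.metrics");

        CountingHandler rootHandler = new CountingHandler();
        CountingHandler metricHandler = new CountingHandler();
        root.addHandler(rootHandler);
        metrics.addHandler(metricHandler);
        metrics.setUseParentHandlers(true); // j.u.l default; like additivity=true

        // Additive: the record hits metricHandler AND propagates to root.
        metrics.info("metric line 1");

        // Analogue of log4j additivity=false: stop propagation to parents.
        metrics.setUseParentHandlers(false);
        metrics.info("metric line 2"); // only metricHandler sees this one

        // Clean up so repeated calls behave the same.
        root.removeHandler(rootHandler);
        metrics.removeHandler(metricHandler);
        return new int[] { metricHandler.count, rootHandler.count };
    }

    public static void main(String[] args) {
        int[] counts = run();
        System.out.println("metric handler saw " + counts[0]
                + ", root handler saw " + counts[1]);
    }
}
```

The root handler receives only the first record; the metrics handler receives both, which mirrors the once-vs-twice behaviour described above.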