Github user rxin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23207#discussion_r238837000
  
    --- Diff: core/src/main/scala/org/apache/spark/shuffle/metrics.scala ---
    @@ -50,3 +50,57 @@ private[spark] trait ShuffleWriteMetricsReporter {
       private[spark] def decBytesWritten(v: Long): Unit
       private[spark] def decRecordsWritten(v: Long): Unit
     }
    +
    +
    +/**
    + * A proxy class of ShuffleWriteMetricsReporter that forwards all metric
    + * updates to the given reporters.
    + */
    +private[spark] class GroupedShuffleWriteMetricsReporter(
    +    reporters: Seq[ShuffleWriteMetricsReporter]) extends ShuffleWriteMetricsReporter {
    +  override private[spark] def incBytesWritten(v: Long): Unit = {
    +    reporters.foreach(_.incBytesWritten(v))
    +  }
    +  override private[spark] def decRecordsWritten(v: Long): Unit = {
    +    reporters.foreach(_.decRecordsWritten(v))
    +  }
    +  override private[spark] def incRecordsWritten(v: Long): Unit = {
    +    reporters.foreach(_.incRecordsWritten(v))
    +  }
    +  override private[spark] def incWriteTime(v: Long): Unit = {
    +    reporters.foreach(_.incWriteTime(v))
    +  }
    +  override private[spark] def decBytesWritten(v: Long): Unit = {
    +    reporters.foreach(_.decBytesWritten(v))
    +  }
    +}
    +
    +
    +/**
    + * A proxy class of ShuffleReadMetricsReporter that forwards all metric
    + * updates to the given reporters.
    + */
    +private[spark] class GroupedShuffleReadMetricsReporter(
    --- End diff --
    
    Again - I think your old approach is much better. No point creating a general util when there is only one implementation without any known future needs.
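    For context, the diff under discussion implements a simple fan-out proxy: every metric update on the grouped reporter is forwarded to each underlying reporter. A minimal standalone sketch of that pattern (simplified trait and names, not Spark's actual `ShuffleWriteMetricsReporter` API):

    ```scala
    // Simplified stand-in for the reporter trait (hypothetical names).
    trait WriteMetricsReporter {
      def incBytesWritten(v: Long): Unit
      def incRecordsWritten(v: Long): Unit
    }

    // A concrete reporter that just accumulates values, for demonstration.
    class RecordingReporter extends WriteMetricsReporter {
      var bytes = 0L
      var records = 0L
      def incBytesWritten(v: Long): Unit = bytes += v
      def incRecordsWritten(v: Long): Unit = records += v
    }

    // The fan-out proxy: each call is forwarded to every wrapped reporter.
    class GroupedReporter(reporters: Seq[WriteMetricsReporter])
        extends WriteMetricsReporter {
      def incBytesWritten(v: Long): Unit = reporters.foreach(_.incBytesWritten(v))
      def incRecordsWritten(v: Long): Unit = reporters.foreach(_.incRecordsWritten(v))
    }

    object Demo extends App {
      val a = new RecordingReporter
      val b = new RecordingReporter
      val grouped = new GroupedReporter(Seq(a, b))
      grouped.incBytesWritten(128L)
      grouped.incRecordsWritten(1L)
      // Both reporters observe the same updates.
      println(s"a: bytes=${a.bytes} records=${a.records}")
      println(s"b: bytes=${b.bytes} records=${b.records}")
    }
    ```

    The design question in the review is whether this generality is worth it when only one call site groups reporters.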


---
