GitHub user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10835#discussion_r50778462
  
    --- Diff: project/MimaExcludes.scala ---
    @@ -145,6 +145,15 @@ object MimaExcludes {
             // SPARK-12510 Refactor ActorReceiver to support Java
             ProblemFilters.exclude[AbstractClassProblem]("org.apache.spark.streaming.receiver.ActorReceiver")
           ) ++ Seq(
    +        // SPARK-12895 Implement TaskMetrics using accumulators
    +        ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.TaskContext.internalMetricsToAccumulators"),
    +        ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.TaskContext.collectInternalAccumulators"),
    +        ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.TaskContext.collectAccumulators")
    +      ) ++ Seq(
    +        // SPARK-12896 Send only accumulator updates to driver, not TaskMetrics
    +        ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.Accumulable.this"),
    --- End diff --
    
    One annoying limitation of MiMa is that I think this will exclude _any_ constructor change we make during the 2.x cycle, so we'll have to be careful not to accidentally break binary compatibility for the other constructors.
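    
    To spell out the mechanics as I understand them (worth double-checking against the MiMa docs): the string passed to `ProblemFilters.exclude` is just a fully-qualified member name, every constructor of a class shares the name `this`, and parameter types are not part of the match, so this one filter silences problems for _all_ `Accumulable` constructor overloads:
    
    ```scala
    import com.typesafe.tools.mima.core._
    
    // Matches every constructor of Accumulable, not only the one SPARK-12896
    // touched: MiMa names all constructors "this", and this string filter
    // carries no parameter-type information.
    ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.Accumulable.this")
    ```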
    
    Note to self for this review: make sure that we preserved compatibility for the old public constructors here.
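    
    One way to keep the old public constructors working after the primary constructor changes is to retain them as secondary constructors that delegate to the new one. A minimal sketch of the pattern (the class and parameters are made up for illustration, not `Accumulable`'s actual signature):
    
    ```scala
    // Hypothetical class standing in for Accumulable; illustrative only.
    class Widget[T](
        val initialValue: T,
        val name: Option[String],
        val countFailedValues: Boolean) { // suppose this parameter is new in 2.0
    
      // The old public constructors survive as secondary constructors that
      // delegate to the new primary one, so code compiled against the 1.x
      // binary still links. With the blanket "Accumulable.this" filter in
      // place, MiMa would not catch it if one of these were dropped.
      def this(initialValue: T, name: Option[String]) =
        this(initialValue, name, countFailedValues = false)
    
      def this(initialValue: T) = this(initialValue, None)
    }
    ```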
    
    @andrewor14, was this exclude added because the `private` constructor wasn't appropriately ignored by MiMa?
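    
    (My guess at the answer, worth confirming: qualified access modifiers like `private[spark]` compile to public members in JVM bytecode, and MiMa analyzes bytecode, so it cannot tell such a constructor apart from a genuinely public one. A tiny illustration, names hypothetical:)
    
    ```scala
    package org.apache.spark
    
    // Scala enforces private[spark] only at compile time; in the emitted
    // bytecode this constructor is public, so a bytecode-based checker like
    // MiMa treats it as part of the binary API and flags changes to it.
    class Example private[spark] (val countFailedValues: Boolean)
    ```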

