fsamuel-bs commented on a change in pull request #29977:
URL: https://github.com/apache/spark/pull/29977#discussion_r502469786
##########
File path: core/src/main/scala/org/apache/spark/scheduler/Task.scala
##########
@@ -123,8 +125,12 @@ private[spark] abstract class Task[T](
Option(taskAttemptId),
Option(attemptNumber)).setCurrentContext()
+ plugins.foreach(_.onTaskStart())
Review comment:
That's what I documented at
https://github.com/apache/spark/pull/29977/files#diff-6a99ec9983962323b4e0c1899134b5d6R76-R78
-- the argument that came to mind is that it makes it easy for a plugin developer to
track some state in a thread-local and cleanly decide whether or not to perform the
succeeded/failed action (roughly the pattern sketched below).
Happy to change it if we prefer not to put this burden on the plugin owner, though.
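
For illustration, a minimal sketch of what I mean (my own example, not part of the PR; it assumes the `onTaskStart`/`onTaskSucceeded`/`onTaskFailed` callbacks take the shapes added here, and `TaskTimingPlugin`/`recordDuration` are hypothetical names):

```scala
import org.apache.spark.TaskFailedReason
import org.apache.spark.api.plugin.ExecutorPlugin

// Hypothetical executor plugin showing the thread-local pattern: state set in
// onTaskStart is consumed in the success/failure callbacks on the same task thread.
class TaskTimingPlugin extends ExecutorPlugin {
  // Per-task start time; a thread-local works because the callbacks fire on the
  // thread running the task.
  private val startTime = new ThreadLocal[java.lang.Long]

  override def onTaskStart(): Unit = {
    startTime.set(System.nanoTime())
  }

  override def onTaskSucceeded(): Unit = {
    val start = startTime.get()
    if (start != null) {
      // Only act if onTaskStart actually ran for this task on this thread.
      recordDuration(System.nanoTime() - start)
    }
    startTime.remove()
  }

  override def onTaskFailed(failureReason: TaskFailedReason): Unit = {
    // Clean up the per-task state and skip the success-path action.
    startTime.remove()
  }

  // Placeholder for plugin-specific metrics reporting.
  private def recordDuration(nanos: Long): Unit = {}
}
```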
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]