GitHub user kayousterhout commented on the pull request:

    https://github.com/apache/spark/pull/5635#issuecomment-95299848
  
    It makes me a little nervous that there's now a time gap between 
deserializeEndTime and when taskStartTime gets calculated. This *should* be 
very small (there's just the intermediate call to updateEpoch), but gaps like 
this sometimes grow over time (as the code changes, etc.), and that would 
make the metrics very confusing. Could the Task class expose 
executorDeserializeTime, so that Executor.scala can read it at the end and 
set all of the metrics appropriately? I also slightly prefer that approach 
because it consolidates the metric setting mostly into Executor.scala.
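
    To make the suggestion concrete, here is a minimal sketch of that shape. 
It is not Spark's actual code: TaskMetrics, deserializeDependencies, and 
runTask are illustrative stand-ins. The point is that the task times its own 
deserialization and exposes the result, so the executor sets every metric in 
one place at the end:

```scala
class TaskMetrics {
  var executorDeserializeTime: Long = 0L
  var executorRunTime: Long = 0L
}

abstract class Task[T] {
  private var _executorDeserializeTime: Long = 0L
  // Exposed so Executor.scala can read it after run() completes.
  def executorDeserializeTime: Long = _executorDeserializeTime

  final def run(): T = {
    val start = System.currentTimeMillis()
    deserializeDependencies()
    _executorDeserializeTime = System.currentTimeMillis() - start
    runTask()
  }

  protected def deserializeDependencies(): Unit
  protected def runTask(): T
}

// Toy task so the sketch runs end to end.
class ExampleTask extends Task[Int] {
  protected def deserializeDependencies(): Unit = Thread.sleep(5)
  protected def runTask(): Int = { Thread.sleep(20); 42 }
}

object MetricsSketch {
  def main(args: Array[String]): Unit = {
    val metrics = new TaskMetrics
    val task = new ExampleTask
    val taskStart = System.currentTimeMillis()
    task.run()
    val taskFinish = System.currentTimeMillis()
    // All metric assignment is consolidated here: extra work that later
    // slips in around run() is charged to run time rather than silently
    // widening a gap between two separately taken timestamps.
    metrics.executorDeserializeTime = task.executorDeserializeTime
    metrics.executorRunTime =
      (taskFinish - taskStart) - metrics.executorDeserializeTime
    println(s"deserialize=${metrics.executorDeserializeTime}ms " +
      s"run=${metrics.executorRunTime}ms")
  }
}
```

    With that shape, the wall-clock window the comment worries about cannot 
reappear: the deserialize figure comes from the task itself, and everything 
else is derived from it in one place.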

