Github user vanzin commented on the pull request:
https://github.com/apache/spark/pull/654#issuecomment-52271303
Yes, that's my main concern with putting that logic in the logger. Also,
doing that means that any other listener that wants a timestamp would have
to repeat the same hack; so if we went that route, it would be better to change
the SparkListener interface to take a timestamp parameter in addition to the event.
I like having the timestamp with the event better because of that; it's
ultimately a property of the event, and placing it somewhere else means you
need to handle two pieces of information everywhere you need them, instead of
everything being encapsulated in the event itself.
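To illustrate the trade-off, here is a minimal sketch of the two API shapes; the class and method names are hypothetical, not Spark's actual API:

```java
// Alternative A: timestamp passed alongside the event. Every listener
// (and everything that forwards events) must carry two values around.
interface TimestampedListener {
    void onEvent(Object event, long timestamp);
}

// Alternative B: the timestamp is a property of the event itself, so a
// single object encapsulates everything a listener needs.
abstract class SparkEvent {
    final long timestamp = System.currentTimeMillis();
}

class JobStartedEvent extends SparkEvent {
    final int jobId;
    JobStartedEvent(int jobId) { this.jobId = jobId; }
}

public class ListenerSketch {
    public static void main(String[] args) {
        JobStartedEvent e = new JobStartedEvent(42);
        // One value carries both pieces of information.
        System.out.println(e.jobId + " @ " + e.timestamp);
    }
}
```

With alternative B, a listener signature stays a single-argument method, and anything that stores or replays events (such as an event log) gets the timestamp for free.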
I agree all events should have a timestamp, but I don't have a solution for
easily doing that (well, aside from investigating a saner approach to
generating JSON than manually writing the conversion code, as is currently
done). Perhaps it would be better to just switch soon to something like
Jackson, and keep the current code only for backwards compatibility with
older event logs.
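For context, here is a rough sketch of what hand-written JSON conversion looks like and why it makes adding a field to every event tedious; the event type and field labels below are made up for illustration, not taken from Spark's code:

```java
// Hand-written conversion: every event type needs its own method like this,
// and a new field (such as a timestamp) must be added to each one by hand.
// A library like Jackson can instead derive the serialization from the
// fields themselves.
class TaskEndEvent {
    final int taskId;
    final long timestamp;

    TaskEndEvent(int taskId, long timestamp) {
        this.taskId = taskId;
        this.timestamp = timestamp;
    }

    // Manually assembled JSON string, one field at a time.
    String toJson() {
        return "{\"Event\":\"TaskEnd\",\"Task ID\":" + taskId
             + ",\"Timestamp\":" + timestamp + "}";
    }

    public static void main(String[] args) {
        System.out.println(new TaskEndEvent(3, 100L).toJson());
        // prints {"Event":"TaskEnd","Task ID":3,"Timestamp":100}
    }
}
```

The backwards-compatibility concern is that old event logs were written by code like this, so a reader for the legacy format would need to stay around even after the writer switches to a library.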