srowen commented on a change in pull request #23263: [SPARK-23674][ML] Adds Spark ML Events to Instrumentation
URL: https://github.com/apache/spark/pull/23263#discussion_r248654526
##########
File path: mllib/src/main/scala/org/apache/spark/ml/Pipeline.scala
##########
@@ -197,10 +200,12 @@ object Pipeline extends MLReadable[Pipeline] {
@Since("1.6.0")
override def load(path: String): Pipeline = super.load(path)
-  private[Pipeline] class PipelineWriter(instance: Pipeline) extends MLWriter {
+  private[Pipeline] class PipelineWriter(val instance: Pipeline) extends MLWriter {

     SharedReadWrite.validateStages(instance.getStages)

+    override def save(path: String): Unit =
+      instrumented(_.withSaveInstanceEvent(this, path, logging = true)(super.save(path)))
Review comment:
I see. Yes, I think it's good to emit log statements by default to match `Instrumentation`, and it's always fine to emit a debug log. I had imagined that a default implementation of `MLEvents` might call `logInfo` or similar for these events, and anyone could then provide a different implementation. Alternatively, always logging at debug level regardless of the implementation also seems OK.
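To illustrate the shape of what's being suggested, here is a minimal, self-contained Scala sketch of a default event trait that logs every event, with an overridable hook so alternative implementations can change the level or sink. All names here (`MLEvent`, `MLEvents`, `SaveInstanceEvent`, `logEvent`) are hypothetical for illustration and are not the actual Spark API; `println` stands in for Spark's `logInfo`/`logDebug`.

```scala
// Hypothetical sketch of the reviewer's suggestion: a default MLEvents
// trait that emits a log line for every event, which implementations
// can override. Not the real Spark API.
sealed trait MLEvent
final case class SaveInstanceEvent(writerClass: String, path: String) extends MLEvent

trait MLEvents {
  // Render the event once so both default and custom loggers agree
  // on the message format.
  def format(event: MLEvent): String = s"ML event: $event"

  // Default behavior: log at "info" level (println stands in for
  // Spark's logInfo here). Subclasses may override to log at debug
  // level or publish to a listener bus instead.
  def logEvent(event: MLEvent): Unit = println(s"[info] ${format(event)}")
}

// An alternative implementation that always logs at debug level,
// matching the second option discussed in the comment.
class DebugMLEvents extends MLEvents {
  override def logEvent(event: MLEvent): Unit = println(s"[debug] ${format(event)}")
}
```

The point of the hook is that callers such as `PipelineWriter.save` only invoke `logEvent` and never decide the log level themselves; the choice between info-by-default and debug-always lives in one place.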