Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/1067#discussion_r13700626
--- Diff: core/src/main/scala/org/apache/spark/metrics/sink/CsvSink.scala
---
@@ -53,11 +53,14 @@ private[spark] class CsvSink(val property: Properties,
val registry: MetricRegis
case None => CSV_DEFAULT_DIR
}
+ val file= new File(pollDir + conf.get("spark.app.uniqueName"))
--- End diff ---
Hi @rahulsinghaliitd, do we really need to use `SparkConf` to get this
unique app name? The metrics system is driven by its own configuration
system; pulling in `SparkConf` here, just for `CsvSink`, introduces a second
configuration source, which gives people a way to bypass the metrics
configuration and weakens its control over sink behavior. I think it would be
better to change how `appUniqueName` is obtained (e.g. get it from SparkEnv).
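
A minimal sketch of the direction suggested above. `SparkEnvStub` and its
`appUniqueName` field are hypothetical stand-ins (the real `SparkEnv` has no
such field in this PR); the point is only that `CsvSink` would build its
per-application directory without reading `SparkConf` directly:

```scala
import java.io.File

// Hypothetical stand-in for SparkEnv exposing the unique app name,
// instead of CsvSink reading "spark.app.uniqueName" from SparkConf.
object SparkEnvStub {
  val appUniqueName: String = "app-20140613-0001" // example value
}

object CsvSinkSketch {
  val pollDir = "/tmp/csv-sink/"
  // Per-application output directory, derived without touching SparkConf
  val file = new File(pollDir + SparkEnvStub.appUniqueName)
}
```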
---