Github user ryan-williams commented on the pull request:

    https://github.com/apache/spark/pull/4632#issuecomment-85180665
  
    Thanks @pwendell. I had stumbled across that 
[SPARK-3377](https://issues.apache.org/jira/browse/SPARK-3377) work as well.
    
    I think there are solid arguments for each of these use-cases being 
supported:
    
    * `app.id`-prefixing can be pathologically hard on Graphite's disk I/O for short-running jobs, since every new app ID spawns a whole new metric tree.
    * `app.name`-prefixing is no good if you have same-named jobs running simultaneously, since their metrics collide.
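
    For concreteness, here is roughly what the two schemes look like on the Graphite side (metric names and the app ID format are illustrative):

```
# app.id prefix: every run mints a fresh subtree (Graphite stores one
# whisper file per metric), so many short-lived jobs means unbounded
# metric-tree growth on disk:
app-20150318121212-0001.driver.jvm.heap.used
app-20150318121213-0002.driver.jvm.heap.used

# app.name prefix: stable across runs, but two simultaneous "my-etl"
# jobs write into the same metrics:
my-etl.driver.jvm.heap.used
```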
    
    Here are three options for supporting both (all defaulting to `app.id` but 
providing an escape hatch):
    
    1. Only admit `id` and `name` values here, and use the value from the appropriate config key. The main downside is that we would essentially introduce two new, made-up "magic strings" to do this: would they be "id" and "name"? "app.id" and "app.name"? At that point, we're basically at option 2 anyway.
    2. Allow usage of any existing conf value as the metrics prefix, which is 
what this PR currently does.
    3. Default to `app.id` but allow the user to specify a literal string to be used as the metrics prefix (as opposed to a string that keys into `SparkConf`), e.g. `--conf spark.metrics.prefix=my-app-name`;
        * this could be a `--conf` param or live in the `MetricsConfig`'s file.
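
    To sketch option 3 concretely (the `spark.metrics.prefix` key here is just the name used in this thread; it's hypothetical, not something Spark supports today):

```
# at submit time:
spark-submit --conf spark.metrics.prefix=my-app-name ...

# or in the metrics properties file pointed to by spark.metrics.conf:
spark.metrics.prefix=my-app-name
```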
    
    I feel like doing this via the `MetricsConfig`'s `spark.metrics.conf` file 
makes more sense than adding another `--conf` param, but I could be persuaded 
otherwise.
    
    > It seems a bit weird to hard code handling of this particular 
configuration in the MetricsConfig class.
    
    This bit I disagree with; plenty of config params are read by, and relevant to, just one class.

