Ryan Williams created SPARK-5778:
------------------------------------
Summary: Throw if nonexistent "spark.metrics.conf" file is provided
Key: SPARK-5778
URL: https://issues.apache.org/jira/browse/SPARK-5778
Project: Spark
Issue Type: Improvement
Components: Spark Core
Affects Versions: 1.2.1
Reporter: Ryan Williams
Priority: Minor
Spark looks for a {{MetricsSystem}} configuration file when the
{{spark.metrics.conf}} parameter is set, [defaulting to the path
"{{metrics.properties}}" when it's not
set|https://github.com/apache/spark/blob/466b1f671b21f575d28f9c103f51765790914fe3/core/src/main/scala/org/apache/spark/metrics/MetricsConfig.scala#L52-L55].
In the event of a failure to find or parse the file, [the exception is caught
and an error is
logged|https://github.com/apache/spark/blob/466b1f671b21f575d28f9c103f51765790914fe3/core/src/main/scala/org/apache/spark/metrics/MetricsConfig.scala#L61].
This seems like reasonable behavior in the general case, where the user has not
specified a {{spark.metrics.conf}} file. However, I've been bitten several
times by specifying a file that some or all executors did not have present
(because I typo'd the path, or forgot to add a {{--files}} flag to ship my
local metrics config file to all executors); the error was swallowed, and I was
left confused about why a job that appeared to run successfully had captured no
metrics.
I'd like to change the behavior to actually throw if the user has specified a
configuration file that doesn't exist.
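A rough sketch of the proposed behavior, in Scala. The method name {{loadMetricsConfig}} is hypothetical and does not match Spark's actual {{MetricsConfig}} internals; the point is only the distinction between an explicitly configured path (fail fast) and the implicit default (stay lenient):

```scala
import java.io.{File, FileInputStream}
import java.util.Properties

// Hypothetical helper illustrating the proposal: if the user explicitly set
// spark.metrics.conf, a missing file should throw; if no file was configured,
// fall back silently, as Spark does today for the default path.
def loadMetricsConfig(configuredPath: Option[String]): Properties = {
  val properties = new Properties()
  configuredPath match {
    case Some(path) =>
      val file = new File(path)
      if (!file.isFile) {
        // The user asked for this file by name; a logged error is easy to miss.
        throw new IllegalArgumentException(
          s"spark.metrics.conf was set to '$path' but no such file exists")
      }
      val in = new FileInputStream(file)
      try properties.load(in) finally in.close()
    case None =>
      // No file configured: nothing to load, no error.
  }
  properties
}
```

With this split, a typo'd path or a forgotten {{--files}} flag fails the job immediately instead of silently producing no metrics.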
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)