Lantao Jin shared an issue with you ------------------------------------
Hi all,

Do you think this is a bug? Should we keep the current behavior?

> Ignoring the default properties file is not a good choice from the
> perspective of the system
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-21023
>                 URL: https://issues.apache.org/jira/browse/SPARK-21023
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 2.1.1
>            Reporter: Lantao Jin
>            Priority: Minor
>
> The default properties file {{spark-defaults.conf}} should still be loaded
> even when the submit argument {{--properties-file}} is set. The reasons are
> easy to see:
> * The infrastructure team needs to continually update {{spark-defaults.conf}}
> whenever they want to set cluster-wide defaults for tuning purposes.
> * Application developers only want to override the parameters they care
> about, not others they may not even know exist (set by the infrastructure
> team).
> * For most application developers, the purpose of using
> {{\-\-properties-file}} is to avoid passing dozens of {{--conf k=v}}
> arguments. But if {{spark-defaults.conf}} is ignored, the resulting behavior
> is unexpected.
>
> For example:
>
> Current implementation
> ||Property name||Value in default||Value in user-specified||Final value||
> |spark.A|"foo"|"bar"|"bar"|
> |spark.B|"foo"|N/A|N/A|
> |spark.C|N/A|"bar"|"bar"|
> |spark.D|"foo"|"foo"|"foo"|
> |spark.E|"foo"|N/A|N/A|
> |spark.F|"foo"|N/A|N/A|
>
> Expected implementation
> ||Property name||Value in default||Value in user-specified||Final value||
> |spark.A|"foo"|"bar"|"bar"|
> |spark.B|"foo"|N/A|"foo"|
> |spark.C|N/A|"bar"|"bar"|
> |spark.D|"foo"|"foo"|"foo"|
> |spark.E|"foo"|N/A|"foo"|
> |spark.F|"foo"|N/A|"foo"|
>
> I can offer a patch to fix this if you think it makes sense.
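The expected behavior in the second table amounts to a simple overlay merge: load the defaults first, then let the user-specified file override key by key. A minimal sketch in Python (this is not Spark's actual implementation; the dicts are hypothetical stand-ins for the two properties files, using the keys from the tables above):

```python
# Hypothetical stand-ins for spark-defaults.conf and the file passed
# via --properties-file, mirroring the tables in the issue.
defaults = {"spark.A": "foo", "spark.B": "foo", "spark.D": "foo",
            "spark.E": "foo", "spark.F": "foo"}
user_specified = {"spark.A": "bar", "spark.C": "bar", "spark.D": "foo"}

def merge(defaults, user):
    """Proposed behavior: start from the defaults, then let any
    user-specified value override the matching key."""
    merged = dict(defaults)
    merged.update(user)
    return merged

final = merge(defaults, user_specified)
print(final)
# spark.A is overridden by the user, spark.B keeps its default,
# spark.C comes only from the user file -- matching the
# "Expected implementation" table.
```

Under the current behavior, by contrast, the result would be `user_specified` alone, which is why `spark.B`, `spark.E`, and `spark.F` end up unset.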