[ https://issues.apache.org/jira/browse/SPARK-6048?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Patrick Wendell resolved SPARK-6048.
------------------------------------
    Resolution: Fixed
    Fix Version/s: 1.3.0

> SparkConf.translateConfKey should not translate on set
> ------------------------------------------------------
>
>                 Key: SPARK-6048
>                 URL: https://issues.apache.org/jira/browse/SPARK-6048
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.0
>            Reporter: Andrew Or
>            Assignee: Andrew Or
>            Priority: Blocker
>             Fix For: 1.3.0
>
> There are several issues with translating on set.
>
> (1) The most serious one: if the user has both the deprecated and the latest version of the same config set, the value picked up by SparkConf is arbitrary. Why? Because during initialization of the conf we call `conf.set` on each property in `sys.props` in an order arbitrarily defined by Java, so the value of the more recent config may be overridden by that of the deprecated one. Instead, we should always use the value of the most recent config.
>
> (2) If we translate on set, then we must keep translating everywhere else. In fact, the current code does not translate on remove, which means the following won't work if X is deprecated:
> {code}
> conf.set(X, Y)
> conf.remove(X) // X is not in the conf
> {code}
> This requires us to also translate in remove and other places, as we already do for contains, leading to more duplicate code.
>
> (3) Since we call `conf.set` on all configs when initializing the conf, we print all deprecation warnings up front. Elsewhere in Spark, however, we warn the user only when the deprecated config / option / env var is actually being used. We should keep this consistent so users won't have to look for all deprecation messages at the beginning of their logs.
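The fix direction implied by points (1)-(3) — store keys verbatim and translate deprecated keys only at read time — can be sketched in isolation. Everything below (the class name `ConfSketch`, the key names, the deprecation map) is hypothetical illustration, not Spark's actual SparkConf code:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: translating deprecated keys on *get* rather than on
// *set* keeps set/remove symmetric and lets the non-deprecated key win
// regardless of the order keys were inserted.
public class ConfSketch {
    // Hypothetical mapping: deprecated key -> current key.
    private static final Map<String, String> DEPRECATED =
        Map.of("spark.old.key", "spark.new.key");

    private final Map<String, String> settings = new HashMap<>();

    public void set(String key, String value) {
        settings.put(key, value);   // store verbatim, no translation
    }

    public void remove(String key) {
        settings.remove(key);       // symmetric with set, no translation needed
    }

    public String get(String key) {
        // Prefer the current key; only then fall back to a deprecated alias.
        if (settings.containsKey(key)) {
            return settings.get(key);
        }
        for (Map.Entry<String, String> e : DEPRECATED.entrySet()) {
            if (e.getValue().equals(key) && settings.containsKey(e.getKey())) {
                // A deprecation warning would be logged here, at the point of
                // actual use, matching how Spark warns elsewhere.
                return settings.get(e.getKey());
            }
        }
        return null;
    }

    public static void main(String[] args) {
        ConfSketch conf = new ConfSketch();
        // Only the deprecated key is set: get falls back through the alias.
        conf.set("spark.old.key", "x");
        System.out.println(conf.get("spark.new.key")); // x
        // remove now works on the same key set used: no translation mismatch.
        conf.remove("spark.old.key");
        System.out.println(conf.get("spark.new.key")); // null
        // With both keys set, the current key wins, in any insertion order.
        conf.set("spark.old.key", "a");
        conf.set("spark.new.key", "b");
        System.out.println(conf.get("spark.new.key")); // b
    }
}
```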
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)