GitHub user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9084#discussion_r41825192
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -509,6 +512,11 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
     
     private[spark] object SparkConf extends Logging {
     
    +  // Deprecation message for memory fraction configs used in the old memory management model
    +  private val deprecatedMemoryFractionMessage =
    +    "As of Spark 1.6, execution and storage memory management are unified. " +
    +      "All memory fractions used in the old model are now deprecated and no longer read."
    --- End diff ---
    
    The old configurations will still be respected in legacy mode, so this
    message is slightly ambiguous / confusing. Is there an easy way to avoid
    emitting the warning when the legacy mode configuration is turned on? If
    not, I suppose we could expand the deprecation message to mention this
    corner case, perhaps by appending "(unless spark.XX.YY is enabled)" at
    the end.
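
    For illustration, here is a minimal, self-contained sketch of both
    options discussed above: skipping the warning when legacy mode is
    enabled, and appending the corner-case note to the message. The flag
    name "spark.memory.useLegacyMode" stands in for the spark.XX.YY
    placeholder, and the object and helper names are hypothetical, not the
    actual patch.

        // Hypothetical sketch, not the actual Spark patch. The flag name
        // below stands in for the "spark.XX.YY" placeholder above.
        object DeprecationWarningSketch {
          private val legacyModeKey = "spark.memory.useLegacyMode"  // assumed name

          private val deprecatedMemoryFractionMessage =
            "As of Spark 1.6, execution and storage memory management are unified. " +
              "All memory fractions used in the old model are now deprecated and no " +
              s"longer read (unless $legacyModeKey is enabled)."

          // Emit the deprecation warning only when the legacy memory manager is
          // off, since the old fractions are still respected when it is on.
          def maybeWarn(settings: Map[String, String], deprecatedKey: String): Unit = {
            val legacyMode = settings.get(legacyModeKey).exists(_.toBoolean)
            if (!legacyMode) {
              println(s"Warning: $deprecatedKey is deprecated. " +
                deprecatedMemoryFractionMessage)
            }
          }

          def main(args: Array[String]): Unit = {
            // No warning: legacy mode keeps the old fraction in effect.
            maybeWarn(Map(legacyModeKey -> "true"), "spark.storage.memoryFraction")
            // Warning: the unified model ignores the old fraction.
            maybeWarn(Map.empty, "spark.shuffle.memoryFraction")
          }
        }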

