GitHub user vanzin opened a pull request:

    https://github.com/apache/spark/pull/11510

    [SPARK-13626] [core] Avoid duplicate config deprecation warnings.

    Three different things were needed to get rid of spurious warnings:
    - silence deprecation warnings when cloning configuration
    - change the way SparkHadoopUtil instantiates SparkConf to silence
      warnings
    - avoid creating new SparkConf instances where they're not needed.
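    The clone-silencing idea from the first two bullets can be sketched
    roughly as follows. This is a minimal illustration, not Spark's actual
    SparkConf code; the class name, the internal `set(key, value, silent)`
    overload, and the deprecated key are all hypothetical:

```scala
import scala.collection.mutable

// Minimal sketch: a config object whose clone() copies entries through a
// silent setter, so deprecations already reported on the original are not
// warned about a second time.
class Conf extends Cloneable {
  private val settings = mutable.HashMap[String, String]()
  private val deprecatedKeys = Set("spark.old.option")  // hypothetical key

  // Public setter: warns on deprecated keys.
  def set(key: String, value: String): Conf = set(key, value, silent = false)

  // Internal setter: callers that merely copy existing entries (clone)
  // pass silent = true to suppress the duplicate warning.
  private def set(key: String, value: String, silent: Boolean): Conf = {
    if (deprecatedKeys.contains(key) && !silent) {
      Console.err.println(s"Warning: configuration key '$key' is deprecated.")
    }
    settings(key) = value
    this
  }

  override def clone(): Conf = {
    val cloned = new Conf
    settings.foreach { case (k, v) => cloned.set(k, v, silent = true) }
    cloned
  }

  def get(key: String): Option[String] = settings.get(key)
}
```

    The same silent path can be reused by any internal code that
    re-instantiates a config from existing values, which is the shape of
    the SparkHadoopUtil change described above.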
    
    On top of that, I changed the way that Logging.scala detects the repl;
    now it uses a method that is overridden in the repl's Main class, and
    the hack in Utils.scala is not needed anymore. This makes the 2.11 repl
    behave like the 2.10 one and set the default log level to WARN, which
    is a lot better. Previously, this wasn't working because the 2.11 repl
    triggers log initialization earlier than the 2.10 one.
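    The override-based detection described above amounts to a template-method
    hook: the logging trait exposes a method with a non-repl default, and the
    repl's Main overrides it. A minimal sketch, with illustrative names
    rather than Spark's actual ones:

```scala
// Sketch: repl detection via an overridable hook instead of inspecting
// class names or system properties at log-init time.
trait Logging {
  // The repl's Main overrides this to opt into repl-specific defaults.
  protected def isInterpreter: Boolean = false

  // Repl sessions default to WARN; regular applications to INFO.
  def defaultLogLevel: String = if (isInterpreter) "WARN" else "INFO"
}

// A normal application entry point keeps the default.
object AppMain extends Logging

// The repl's Main flips the hook, so log initialization picks WARN
// no matter how early it runs.
object ReplMain extends Logging {
  override protected def isInterpreter: Boolean = true
}
```

    Because the decision lives in an overridden method rather than a
    runtime probe, it gives the same answer regardless of when log
    initialization is triggered, which is why it works for the 2.11 repl's
    earlier initialization.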
    
    I also removed and simplified some other code in the 2.11 repl's Main
    to avoid replicating logic that already exists elsewhere in Spark.
    
    Last but not least, fixed a compilation bug in a test for Scala 2.10.
    
    Tested the 2.11 repl in local and yarn modes.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/vanzin/spark SPARK-13626

Alternatively, you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/11510.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #11510
    
----
commit c5338f6561d62ac4a869012f369df9339b1437cb
Author: Marcelo Vanzin <[email protected]>
Date:   2016-03-04T01:09:17Z

    [SPARK-13626] [core] Avoid duplicate config deprecation warnings.
    
    Three different things were needed to get rid of spurious warnings:
    - silence deprecation warnings when cloning configuration
    - change the way SparkHadoopUtil instantiates SparkConf to silence
      warnings
    - avoid creating new SparkConf instances where they're not needed.
    
    On top of that, I changed the way that Logging.scala detects the repl;
    now it uses a method that is overridden in the repl's Main class, and
    the hack in Utils.scala is not needed anymore. This makes the 2.11 repl
    behave like the 2.10 one and set the default log level to WARN, which
    is a lot better. Previously, this wasn't working because the 2.11 repl
    triggers log initialization earlier than the 2.10 one.
    
    I also removed and simplified some other code in the 2.11 repl's Main
    to avoid replicating logic that already exists elsewhere in Spark.
    
    Last but not least, fixed a compilation bug in a test for Scala 2.10.
    
    Tested the 2.11 repl in local and yarn modes.

----


