Hi,

I think that's the code change (by Marcelo Vanzin) that altered how logging is initialized, and it no longer seems to load conf/log4j.properties by default.
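For context, by conf/log4j.properties I mean a minimal log4j 1.x configuration along the lines of the shipped log4j.properties.template (just an illustration, not my exact file):

    log4j.rootCategory=INFO, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

Until a couple of days ago, simply having a file like that under conf/ was enough for it to be picked up.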
Can anyone explain how logging configuration is supposed to work in 2.3? I could not figure it out from the code, and conf/log4j.properties is not picked up (but it was at least 2 days ago) :( I'm using master at https://github.com/apache/spark/commit/fba9cc8466dccdcd1f6f372ea7962e7ae9e09be1.

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
Spark Structured Streaming (Apache Spark 2.2+) https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

---------- Forwarded message ----------
From: Jacek Laskowski (JIRA) <j...@apache.org>
Date: Wed, Aug 30, 2017 at 9:03 AM
Subject: [jira] [Commented] (SPARK-21728) Allow SparkSubmit to use logging
To: iss...@spark.apache.org

[ https://issues.apache.org/jira/browse/SPARK-21728?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16146780#comment-16146780 ]

Jacek Laskowski commented on SPARK-21728:
-----------------------------------------

I think the change is user-visible and therefore deserves to be included in the release notes for 2.3 (I recall there is a component or label to mark changes like that in a special way).

/cc [~sowen] [~hyukjin.kwon]

> Allow SparkSubmit to use logging
> --------------------------------
>
> Key: SPARK-21728
> URL: https://issues.apache.org/jira/browse/SPARK-21728
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 2.3.0
> Reporter: Marcelo Vanzin
> Assignee: Marcelo Vanzin
> Priority: Minor
> Fix For: 2.3.0
>
>
> Currently, code in {{SparkSubmit}} cannot call classes or methods that
> initialize the Spark {{Logging}} framework. That is because at that time
> {{SparkSubmit}} doesn't yet know which application will run, and logging is
> initialized differently for certain special applications (notably, the
> shells).
> It would be better if either {{SparkSubmit}} did logging initialization
> earlier based on the application to be run, or did it in a way that could be
> overridden later when the app initializes.
> Without this, there are currently a few parts of {{SparkSubmit}} that
> duplicate code from other parts of Spark just to avoid logging. For example:
> * [downloadFiles|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L860] replicates code from Utils.scala
> * [createTempDir|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/DependencyUtils.scala#L54] replicates code from Utils.scala and installs its own shutdown hook
> * a few parts of the code could use {{SparkConf}} but can't right now because
> of the logging issue.
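As an aside, here is a rough, self-contained Scala sketch of the kind of duplication the issue describes (the names and structure are mine, not the actual SparkSubmit/Utils code):

    import java.io.File
    import java.nio.file.Files

    // Stand-in for Spark's internal Logging trait: the first log call
    // initializes the logging framework as a side effect.
    trait Logging {
      private lazy val init: Boolean = { println("[logging initialized]"); true }
      def logInfo(msg: => String): Unit = if (init) println(s"INFO $msg")
    }

    // Utility code in the spirit of Utils.scala: it logs, so calling it from
    // the launcher would initialize logging before the app type is known.
    object Utils extends Logging {
      def createTempDir(prefix: String): File = {
        val dir = Files.createTempDirectory(prefix).toFile
        logInfo(s"Created temp dir $dir")
        dir
      }
    }

    // What the launcher ends up doing instead: a duplicated, logging-free copy.
    object LauncherHelpers {
      def createTempDir(prefix: String): File =
        Files.createTempDirectory(prefix).toFile
    }

    object Sketch extends App {
      LauncherHelpers.createTempDir("launcher")  // no logging side effect
      Utils.createTempDir("app")                 // initializes logging here
    }

The point is only to show why the launcher currently avoids anything that mixes in Logging, which is the duplication the JIRA proposes to remove.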