https://issues.apache.org/jira/browse/SPARK-15344?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15286038#comment-15286038
Felix Cheung commented on SPARK-15344:
--------------------------------------

SPARK-14881 changed the pyspark and sparkR shells to match the new default behavior of spark-shell (Scala). As you can see here, it always sets the default to WARN: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/internal/Logging.scala#L135

I agree it makes sense: if log4j-defaults.properties is present, we should keep the log level set there, for all shell/REPL cases.

> Unable to set default log level for PySpark
> -------------------------------------------
>
>                 Key: SPARK-15344
>                 URL: https://issues.apache.org/jira/browse/SPARK-15344
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.0
>            Reporter: Maciej Bryński
>            Priority: Minor
>
> After this patch:
> https://github.com/apache/spark/pull/12648
> I'm unable to set the default log level for PySpark.
> It's always WARN.
> The setting below doesn't work:
> {code}
> mbrynski@jupyter:~/spark$ cat conf/log4j.properties
> # Set everything to be logged to the console
> log4j.rootCategory=INFO, console
> log4j.appender.console=org.apache.log4j.ConsoleAppender
> log4j.appender.console.target=System.err
> log4j.appender.console.layout=org.apache.log4j.PatternLayout
> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
> # Set the default spark-shell log level to WARN. When running the spark-shell, the
> # log level for this class is used to overwrite the root logger's log level, so that
> # the user can have different defaults for the shell and regular Spark apps.
> log4j.logger.org.apache.spark.repl.Main=INFO
> # Settings to quiet third party logs that are too verbose
> log4j.logger.org.spark_project.jetty=WARN
> log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
> log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
> log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
> log4j.logger.org.apache.parquet=ERROR
> log4j.logger.parquet=ERROR
> # SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
> log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
> log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
> {code}