[ https://issues.apache.org/jira/browse/SPARK-14881?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15284454#comment-15284454 ]
Maciej Bryński commented on SPARK-14881:
----------------------------------------

[~felixcheung] Could you check this?
https://issues.apache.org/jira/browse/SPARK-15344

> pyspark and sparkR shell default log level should match spark-shell/Scala
> -------------------------------------------------------------------------
>
>                 Key: SPARK-14881
>                 URL: https://issues.apache.org/jira/browse/SPARK-14881
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Spark Shell, SparkR
>    Affects Versions: 2.0.0
>            Reporter: Felix Cheung
>            Assignee: Felix Cheung
>            Priority: Minor
>             Fix For: 2.0.0
>
> Scala spark-shell defaults to log level WARN. pyspark and sparkR should
> match that by default (the user can change it later):
>
> # ./bin/spark-shell
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> To adjust logging level use sc.setLogLevel(newLevel).

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
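As the quoted spark-shell banner notes, the level can be changed at runtime with sc.setLogLevel("WARN") in any of the shells. For a persistent default, a common workaround is to override the packaged log4j-defaults.properties with a site log4j.properties. A minimal sketch, assuming a Spark 2.x layout where conf/log4j.properties.template ships with the distribution and is copied to conf/log4j.properties:

```properties
# conf/log4j.properties (copied from conf/log4j.properties.template)
# Raise the root logger from INFO to WARN so pyspark and sparkR shells
# match the Scala spark-shell default.
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Once conf/log4j.properties exists, Spark loads it in place of the bundled defaults, so all shells start at WARN regardless of language.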