[ https://issues.apache.org/jira/browse/SPARK-13666?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15199293#comment-15199293 ]

Rishabh Bhardwaj commented on SPARK-13666:
------------------------------------------

[~vanzin] I am using the master branch and am not seeing the above warnings.
{code}
 ~/Documents/spark/spark   master  ./bin/spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/03/17 16:13:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/03/17 16:13:37 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
Spark context available as sc (master = local[*], app id = local-1458211417452).
SQL context available as sqlContext.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0-SNAPSHOT
      /_/

Using Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_73)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
{code}
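For context, the reported warnings are emitted when a key outside the Spark SQL namespace is pushed into SQLConf. A minimal sketch of that kind of namespace check, using a hypothetical {{ConfCheck}} object (illustrative only, not Spark's actual SQLConf code):
{code}
// Hedged sketch: ConfCheck is hypothetical; the assumption is that the
// warning fires for keys outside the "spark.sql." namespace.
object ConfCheck {
  // True when a key belongs to the Spark SQL config namespace.
  def isSqlKey(key: String): Boolean = key.startsWith("spark.sql.")

  def main(args: Array[String]): Unit = {
    // Keys like these could arrive from spark-defaults.conf.
    val keys = Seq("spark.sql.shuffle.partitions", "spark.executor.memory")
    keys.filterNot(isSqlKey).foreach { k =>
      println(s"WARN: Attempt to set non-Spark SQL config in SQLConf: key = $k")
    }
  }
}
{code}
Under that assumption, keys set internally or via spark-defaults.conf (e.g. spark.executor.memory) would trip the warning even though the user never set them explicitly, which matches the report below.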

> Annoying warnings from SQLConf in log output
> --------------------------------------------
>
>                 Key: SPARK-13666
>                 URL: https://issues.apache.org/jira/browse/SPARK-13666
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Marcelo Vanzin
>            Priority: Minor
>
> Whenever I run spark-shell I get a bunch of warnings about SQL configuration:
> {noformat}
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set non-Spark SQL config in SQLConf: key = spark.yarn.driver.memoryOverhead, value = 26
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set non-Spark SQL config in SQLConf: key = spark.yarn.executor.memoryOverhead, value = 26
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set non-Spark SQL config in SQLConf: key = spark.executor.cores, value = 1
> 16/03/03 19:00:25 WARN hive.HiveSessionState$$anon$1: Attempt to set non-Spark SQL config in SQLConf: key = spark.executor.memory, value = 268435456
> {noformat}
> That shouldn't happen, since I'm not setting those values explicitly. They're either set internally by Spark or come from spark-defaults.conf.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
