Hi Jean,

How about using sc.setLogLevel("WARN")? You can add this call right after
initializing the SparkContext.

From the Spark API docs: "Valid log levels include: ALL, DEBUG, ERROR, FATAL,
INFO, OFF, TRACE, WARN". Here's the link to the SparkContext page in the Spark API:
http://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.SparkContext
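
For example, here is a minimal sketch in Java (the app name, master and class
name below are just placeholders for illustration; adapt them to however you
create your context):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class LogLevelExample {
    public static void main(String[] args) {
        // Placeholder app name/master, just for illustration
        SparkConf conf = new SparkConf()
                .setAppName("log-level-example")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Raise the logging threshold so INFO/DEBUG chatter is suppressed.
        // This overrides the root logger level set in log4j.properties.
        sc.setLogLevel("WARN");

        // ... your job goes here ...

        sc.stop();
    }
}

Note that this only takes effect once the context is up, so any startup
messages printed before the call still appear at the default level.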

Hope this helps,
Anupam




On Mon, Jul 4, 2016 at 2:18 PM, Jean Georges Perrin <j...@jgp.net> wrote:

> Thanks Mich, but what is SPARK_HOME when you run everything through Maven?
>
> On Jul 4, 2016, at 5:12 PM, Mich Talebzadeh <mich.talebza...@gmail.com>
> wrote:
>
> check $SPARK_HOME/conf
>
> copy file log4j.properties.template to log4j.properties
>
> edit log4j.properties and set the log levels to your needs
>
> cat log4j.properties
>
> # Set everything to be logged to the console
> log4j.rootCategory=ERROR, console
> log4j.appender.console=org.apache.log4j.ConsoleAppender
> log4j.appender.console.target=System.err
> log4j.appender.console.layout=org.apache.log4j.PatternLayout
> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
> # Settings to quiet third party logs that are too verbose
> log4j.logger.org.spark-project.jetty=WARN
> log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
> log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
> log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
> log4j.logger.org.apache.parquet=ERROR
> log4j.logger.parquet=ERROR
> # SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
> log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
> log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
>
> HTH
>
>
>
> Dr Mich Talebzadeh
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
> http://talebzadehmich.wordpress.com
>
> Disclaimer: Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
> On 4 July 2016 at 21:56, Jean Georges Perrin <j...@jgp.net> wrote:
>
>> Hi,
>>
>> I have installed Apache Spark via Maven.
>>
>> How can I control the volume of log it displays on my system? I tried
>> different location for a log4j.properties, but none seems to work for me.
>>
>> Thanks for help...
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
>>
>
>
