Hi Luis,
Right...
I manage all my Spark "things" through Maven; by that I mean I have a pom.xml
with all the dependencies in it. Here it is:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
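The pom.xml above is cut off in the digest. For anyone following along, a typical Spark dependency entry looks like the sketch below; the version and Scala suffix are illustrative, not taken from the original message:

```xml
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <!-- illustrative version, adjust to your build -->
    <version>1.6.1</version>
  </dependency>
</dependencies>
```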
Hey Anupam,
Thanks... but no:
I tried:
SparkConf conf = new SparkConf().setAppName("my app").setMaster("local");
JavaSparkContext javaSparkContext = new JavaSparkContext(conf);
javaSparkContext.setLogLevel("WARN");
SQLContext
Hi Jean,
How about using sc.setLogLevel("WARN") ? You may add this statement after
initializing the Spark Context.
From the Spark API: "Valid log levels include: ALL, DEBUG, ERROR, FATAL,
INFO, OFF, TRACE, WARN". Here's the link in the Spark API.
Thanks Mich, but what is SPARK_HOME when you run everything through Maven?
> On Jul 4, 2016, at 5:12 PM, Mich Talebzadeh wrote:
>
> check $SPARK_HOME/conf
>
> copy file log4j.properties.template to log4j.properties
>
> edit log4j.properties and set the log levels to your needs
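For reference, when everything runs from a Maven project there is no SPARK_HOME conf directory in play; log4j 1.x loads log4j.properties from the root of the classpath instead. A minimal sketch, assuming the standard Maven directory layout, is to keep the file in src/main/resources so it lands on the classpath:

```properties
# src/main/resources/log4j.properties
# Picked up from the classpath when the application starts
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```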
check $SPARK_HOME/conf
copy file log4j.properties.template to log4j.properties
edit log4j.properties and set the log levels to your needs
cat log4j.properties
# Set everything to be logged to the console
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
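The template output is truncated here. The remaining console-appender lines in the stock log4j.properties.template are, to the best of my recollection (verify against your own copy):

```properties
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```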
Hi,
I have installed Apache Spark via Maven.
How can I control the volume of log it displays on my system? I tried different
locations for a log4j.properties, but none seems to work for me.
Thanks for help...