Hi Hitoshi
It looks like you have already read
http://spark.apache.org/docs/latest/configuration.html#configuring-logging
On my EC2 cluster I also need to do the following. My notes may not be
complete; I think you may also need to restart your cluster.
Hope this helps
Andy
#
# setting up logger so logging goes to file, makes demo easier to understand
#
ssh -i $KEY_FILE root@$MASTER
cp /home/ec2-user/log4j.properties /root/spark/conf/
for i in `cat /root/spark/conf/slaves`; do
  scp /home/ec2-user/log4j.properties root@$i:/home/ec2-user/log4j.properties
done
#
# restart spark
#
/root/spark/sbin/stop-all.sh
/root/spark/sbin/start-all.sh
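For reference, here is a minimal log4j.properties along the lines Hitoshi describes below, i.e. raising the root level to ERROR so only the streaming output (the Time rows) shows. This is a sketch based on Spark's stock conf/log4j.properties.template; the appender settings are the usual log4j 1.x console defaults, so adjust the pattern to taste.

```
# Quiet everything except errors; job output (println etc.) is unaffected
log4j.rootCategory=ERROR, console

# Standard console appender writing to stderr
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```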
From: Hitoshi
Date: Tuesday, November 10, 2015 at 1:22 PM
To: "user @spark"
Subject: Re: How to configure logging...
> I don't have akka but with just Spark, I just edited log4j.properties to
> "log4j.rootCategory=ERROR, console" and ran the following command and was
> able to get only the Time row as output.
>
> run-example org.apache.spark.examples.streaming.JavaNetworkWordCount
> localhost
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/How-to-configure-logging-tp25346p25348.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.