If you're using the spark-ec2 scripts, you may have to change
/root/ephemeral-hdfs/conf/log4j.properties or something like that, as that
is added to the classpath before Spark's own conf.
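
For what it's worth, a minimal log4j.properties along these lines should
switch the root logger to debug (the appender name and pattern below are
just an illustration, not necessarily what spark-ec2 ships):

    # root logger at DEBUG, writing to the console appender defined below
    log4j.rootLogger=DEBUG, console
    log4j.appender.console=org.apache.log4j.ConsoleAppender
    log4j.appender.console.target=System.err
    log4j.appender.console.layout=org.apache.log4j.PatternLayout
    log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n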


On Wed, Jun 25, 2014 at 6:10 PM, Tobias Pfeiffer <t...@preferred.jp> wrote:

> I have a log4j.xml in src/main/resources with
>
> <?xml version="1.0" encoding="UTF-8" ?>
> <!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
> <log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
>     [...]
>     <root>
>         <priority value ="warn" />
>         <appender-ref ref="Console" />
>     </root>
> </log4j:configuration>
>
> and that is included in the jar I package with `sbt assembly`. That
> works fine for me, at least on the driver.
>
> Tobias
>
> On Wed, Jun 25, 2014 at 2:25 PM, Philip Limbeck <philiplimb...@gmail.com>
> wrote:
> > Hi!
> >
> > According to
> > https://spark.apache.org/docs/0.9.0/configuration.html#configuring-logging,
> > changing the log level is just a matter of creating a log4j.properties
> > (which is in the classpath of Spark) and changing the log level there
> > for the root logger. I did these steps on every node in the cluster
> > (master and worker nodes). However, after the restart there is still no
> > debug output as desired, but only the default info log level.
>
