When running Hadoop in deploy mode, the actual tasks are run by the
MapReduce framework, so you have to check the MapReduce "user" logs. Either
use the JobTracker web interface or check them directly on the nodes, in
HADOOP_HOME/logs/userlogs or something like that.
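For the project-specific log, one option (just a sketch, assuming the job uses Hadoop's default log4j setup; the logger name "org.myproject", the appender name "PROJECT", and the file name are made-up examples, not existing Nutch settings) is to declare your own appender in conf/log4j.properties so messages from your classes go to a separate file under the task's log directory:

```properties
# Hypothetical logger/appender names -- adjust to your own package.
log4j.logger.org.myproject=INFO,PROJECT
log4j.appender.PROJECT=org.apache.log4j.FileAppender
log4j.appender.PROJECT.File=${hadoop.log.dir}/project.log
log4j.appender.PROJECT.layout=org.apache.log4j.PatternLayout
log4j.appender.PROJECT.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} - %m%n
```

Keep in mind that in deploy mode each task writes its own copy of such a file on whichever node it ran on, so if you need the filtered URLs collected in one place, writing them to HDFS (or emitting them as job output) is usually easier than a local flat file.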

On Fri, May 11, 2012 at 1:11 PM, Vijith <[email protected]> wrote:

> I have tried this with a separate logger and a PrintWriter object.
> It works in local mode but not in deploy mode.
> I am running the Nutch job file. It runs and generates the Hadoop log
> without any errors, but the files are not created on any of the nodes.
>
> On Fri, May 11, 2012 at 3:07 PM, Vijith <[email protected]> wrote:
>
> > Hi,
> >
> > How can I create a separate, project-specific log in addition to the
> > existing log?
> > I am running Nutch in deploy mode.
> > Also, I want some URLs filtered by my URL filter to be stored in an
> external
> > flat file. How can I achieve this?
> >
> > --
> > *Thanks & Regards*
> > *Vijith V*
> >
>
>
> --
> *Thanks & Regards*
> *Vijith V*
>
