Yes, the job continues without any problems. I just saw these stack traces scattered throughout the logs, which I now realise are debug statements rather than errors.
Thanks for your input.

Regards,
Chris

On 10/12/07, Dennis Kubes <[EMAIL PROTECTED]> wrote:
>
> If the job is continuing despite this error, then it is probably a
> non-fatal config message generated when the configuration is
> initialized. The configuration code deliberately prints out a stack
> trace, but it doesn't actually throw an error.
>
> Dennis Kubes
>
> chris sleeman wrote:
> > Hi,
> >
> > I keep getting the following errors when I inject urls into the crawldb -
> >
> > 2007-10-11 20:36:12,406 DEBUG conf.Configuration - java.io.IOException: config(config)
> >     at org.apache.hadoop.conf.Configuration.<init>(Configuration.java:102)
> >     at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:77)
> >     at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:88)
> >     at org.apache.nutch.util.NutchJob.<init>(NutchJob.java:27)
> >     at org.apache.nutch.crawl.Injector.inject(Injector.java:152)
> >     at org.apache.nutch.crawl.Injector.run(Injector.java:192)
> >     at org.apache.hadoop.util.ToolBase.doMain(ToolBase.java:189)
> >     at org.apache.nutch.crawl.Injector.main(Injector.java:182)
> >
> > Can anyone please tell me what exactly I am missing here?
> >
> > Regards,
> > Chris
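For anyone who hits this later: the pattern Dennis describes (creating an exception purely to capture and log the current call stack, without ever throwing it) can be sketched roughly as below. This is an illustrative sketch, not the actual Hadoop Configuration source; the class and method names here are made up.

```java
// Hypothetical sketch of the "debug stack trace" pattern: an exception
// is constructed only to record where the constructor was called from.
// It is never thrown, so the job keeps running normally.
public class ConfigTraceDemo {

    static String captureConstructionTrace() {
        // Creating the exception captures the current call stack.
        java.io.IOException marker = new java.io.IOException("config(config)");
        java.io.StringWriter sw = new java.io.StringWriter();
        marker.printStackTrace(new java.io.PrintWriter(sw, true));
        return sw.toString();
    }

    public static void main(String[] args) {
        // Prints something that looks like an error in the logs but is
        // only a debug aid -- exactly what this thread is about.
        System.out.println(captureConstructionTrace());
    }
}
```

In the real code the captured trace would be emitted via a DEBUG-level logger call, which is why it only shows up when debug logging is enabled.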
