The invocation exception means that the real problem lies further down the stack.
It looks to be the presence of your URLNormalizer. Make sure the
configuration is correct, and make sure that the required resources are
available on the classpath. This is not a problem with the Nutch code itself,
but with how you are using Nutch in your own code.
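For reference, Nutch 1.x's command-line entry point wraps `Crawl` in Hadoop's `ToolRunner`, which creates and injects the `Configuration` before `run()` is ever called; a servlet that calls `run()` directly skips that setup. Below is a minimal sketch of invoking the crawl the same way `Crawl.main()` does. The class and method names (`CrawlServletHelper`, `runCrawl`) are hypothetical, and it assumes the Nutch 1.6 and Hadoop jars, plus `nutch-site.xml`, are on the servlet's classpath:

```java
// Sketch only: assumes Nutch 1.6 (and its Hadoop dependencies) on the
// servlet classpath, with the same crawl arguments that worked from the CLI.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.util.ToolRunner;
import org.apache.nutch.crawl.Crawl;
import org.apache.nutch.util.NutchConfiguration;

public class CrawlServletHelper { // hypothetical helper, not part of Nutch

    public static int runCrawl(String[] args) throws Exception {
        // NutchConfiguration.create() loads nutch-default.xml/nutch-site.xml,
        // which is where plugins such as the URL normalizers are configured.
        Configuration conf = NutchConfiguration.create();

        // ToolRunner injects the Configuration and then calls Crawl.run(),
        // mirroring what Crawl.main() does when launched from the command line.
        return ToolRunner.run(conf, new Crawl(), args);
    }
}
```

Calling a helper like this from `doGet()`/`doPost()`, rather than invoking `Crawl.run()` directly, keeps the servlet path consistent with the command-line path.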

Lewis

On Wed, Mar 6, 2013 at 3:18 PM, imehesz <[email protected]> wrote:

> hello,
>
> I originally created a Java class that was meant to be executed
> from the command line.
> So I created a `main` function, and from Eclipse I was able to just
> /right-click/ and use the `run as java application` menu. Nutch was working
> fine, crawling the URLs as it was meant to.
>
> Later I ran into some issues while trying to generate my executable /JAR/,
> so I decided to change my class to be an HttpServlet instead and
> trigger the crawl via an HTTP request. Unfortunately, it dies with the
> famous `Job failed!` message.
>
> I stepped through the code and noticed that when I execute the whole
> thing from Eclipse, *Crawl.main(args)* properly calls the *Crawl.main()*
> function, but when I call the same function via HTTP, it ends up jumping
> to
> *Crawl.run()* - not sure if that is normal or not.
>
> You can see the whole console log here:
> http://pastebin.com/raw.php?i=Z7NHYzUU
>
> thanks,
> --imre
>
>
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/Nutch-1-6-from-Java-via-HttpServlet-tp4045382.html
> Sent from the Nutch - User mailing list archive at Nabble.com.
>



-- 
*Lewis*
