This was fixed a few days ago. Update your 1.3 export.
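For reference, pulling the current 1.3 branch and rebuilding would look roughly like this. The repository URL and build layout are assumptions based on a typical Nutch 1.3 checkout, not something stated in this thread, so adjust them to your setup:

```shell
# Pull the latest Nutch 1.3 branch (URL assumed; adjust to your checkout)
svn checkout https://svn.apache.org/repos/asf/nutch/branches/branch-1.3 nutch-1.3
cd nutch-1.3

# Rebuild the runtime; refreshed jars should land under runtime/local
ant clean runtime

# Re-run the failing index step from the rebuilt runtime
cd runtime/local
bin/nutch solrindex http://localhost:8080/solr crawl/crawldb/0 \
    crawl/linkdb crawl/segments/0/20110418100309
```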

> On Mon, Apr 18, 2011 at 10:39 AM, Klaus Tachtler <[email protected]> wrote:
> > Hi Gabriele,
> > 
> > I had the same problem a few days ago; the answer was to delete the
> > 'data' directory inside your Solr installation. In my installation it
> > was /var/www/solr/data.
> 
> that didn't do the trick for me. =(
> 
> For the logs, I found this in hadoop.log:
> 
> 2011-04-18 10:44:26,390 WARN  mapred.LocalJobRunner - job_local_0001
> java.lang.IllegalAccessError: tried to access field
> org.slf4j.impl.StaticLoggerBinder.SINGLETON from class
> org.slf4j.LoggerFactory
>     at org.slf4j.LoggerFactory.staticInitialize(LoggerFactory.java:83)
>     at org.slf4j.LoggerFactory.<clinit>(LoggerFactory.java:73)
>     at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.<clinit>(CommonsHttpSolrServer.java:78)
>     at org.apache.nutch.indexer.solr.SolrWriter.open(SolrWriter.java:44)
>     at org.apache.nutch.indexer.IndexerOutputFormat.getRecordWriter(IndexerOutputFormat.java:42)
>     at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:433)
>     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
>     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:216)
> 2011-04-18 10:44:26,928 ERROR solr.SolrIndexer - java.io.IOException: Job failed!
> 
> > Then do it again --> $ bin/nutch solrindex
> > http://localhost:8080/solr crawl/crawldb/0
> > 
> >  I'm now having the same problem, but I haven't found the cause yet.
> >  
> >> $ bin/nutch solrindex http://localhost:8080/solr crawl/crawldb/0
> >> crawl/linkdb crawl/segments/0/20110418100309
> >> SolrIndexer: starting at 2011-04-18 10:03:40
> >> java.io.IOException: Job failed!
> > 
> > Regards,
> > Klaus.
> > 
> > --
> > 
> > ------------------------------------------------
> > e-Mail  : [email protected]
> > Homepage: http://www.tachtler.net
> > DokuWiki: http://www.dokuwiki.tachtler.net
> > ------------------------------------------------
