Ah, since you're using an old Nutch with an old SolrJ client, and the 
Javabin format has changed over time, I think your Solr is too new for the 
client. I'd advise upgrading to Nutch 1.8 if you can.
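If you want to confirm the mismatch before upgrading, a quick check is to look at which SolrJ jar your Nutch distribution bundles. A minimal sketch (the directory layout and jar name below are hypothetical and simulated so the commands are self-contained; on a real install, list $NUTCH_HOME/lib instead):

```shell
# Simulated Nutch install -- the jar name here is hypothetical.
# On a real box: ls "$NUTCH_HOME/lib" | grep -i solrj
mkdir -p nutch-demo/lib
touch nutch-demo/lib/apache-solr-solrj-1.4.0.jar
ls nutch-demo/lib | grep -i solrj
```

A SolrJ jar several versions behind the server is exactly the kind of client/server gap described above.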

 
 
-----Original message-----
> From: devang pandey <[email protected]>
> Sent: Monday 8th July 2013 12:13
> To: [email protected]
> Subject: Re: nutch 1.2 solr 3.6 integration issue
> 
> Hey, this is my Hadoop log file:
> java.lang.RuntimeException: Invalid version (expected 2, but 60) or the data
> in not in 'javabin' format
> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:99)
> at
> org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:41)
> at
> org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:469)
> at
> org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:249)
> at
> org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
> at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:69)
> at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:54)
> at org.apache.nutch.indexer.solr.SolrWriter.close(SolrWriter.java:75)
> at
> org.apache.nutch.indexer.IndexerOutputFormat$1.close(IndexerOutputFormat.java:48)
> at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:474)
> at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
> at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:216)
> 2013-07-08 15:17:39,539 ERROR solr.SolrIndexer - java.io.IOException: Job
> failed!
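A side note on that log: the unexpected version byte is itself a clue. 60 is the ASCII code for '<', so the client most likely received an XML or HTML response (often an error page) where it expected javabin's leading version byte 2:

```shell
# The "but 60" in the traceback is the ASCII code for '<'
# (POSIX printf: a leading quote yields the character's numeric value):
printf '%d\n' "'<"
# prints 60 -- the first byte of an XML/HTML response, not javabin
```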
> 
> 
> 
> On Mon, Jul 8, 2013 at 3:41 PM, Markus Jelsma
> <[email protected]> wrote:
> 
> > Hi,
> >
> > you still haven't provided the logs.
> >
> >
> > -----Original message-----
> > > From: devang pandey <[email protected]>
> > > Sent: Monday 8th July 2013 12:08
> > > To: [email protected]
> > > Subject: Re: nutch 1.2 solr 3.6 integration issue
> > >
> > > Hey Markus, I tried your solution but it's still not working. Can you
> > > please suggest some other way of resolving this issue?
> > >
> > >
> > > On Mon, Jul 8, 2013 at 3:14 PM, Markus Jelsma
> > > <[email protected]> wrote:
> > >
> > > > You need to provide the log output. But I think crawl/segments/* is
> > > > the problem. You must either do seg1 seg2 seg3 or -dir segments/. No
> > > > wildcards supported!
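To make the wildcard point concrete: the shell, not Nutch, expands the glob, so the solrindex job receives several separate segment paths. A small self-contained demonstration (the segment names are made up):

```shell
# The shell expands the glob before bin/nutch ever runs:
mkdir -p demo/segments/20130708101112 demo/segments/20130708111213
set -- demo/segments/*
echo "$#"
# prints 2 -- two separate arguments reach the command, not one pattern
```

So, per the advice above, either list each segment explicitly or hand the parent directory to -dir:

    bin/nutch solrindex http://localhost:8080/solr/ crawl/crawldb crawl/linkdb -dir crawl/segments/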
> > > >
> > > > Cheers
> > > >
> > > >
> > > >
> > > > -----Original message-----
> > > > > From: devang pandey <[email protected]>
> > > > > Sent: Monday 8th July 2013 11:41
> > > > > To: [email protected]
> > > > > Subject: nutch 1.2 solr 3.6 integration issue
> > > > >
> > > > > I have crawled a site successfully using Nutch 1.2. Now I want to
> > > > > integrate this with Solr 3.6. The problem is that when I issue the
> > > > > command
> > > > >
> > > > >   $ bin/nutch solrindex http://localhost:8080/solr/ crawl/crawldb crawl/linkdb crawl/segments/*
> > > > >
> > > > > an error occurs:
> > > > >
> > > > > SolrIndexer: starting at 2013-07-08 14:52:27 java.io.IOException: Job
> > > > > failed!
> > > > >
> > > > > Please help me to solve this issue.
> > > > >
> > > > >
> > > > > Thank you
> > > > >
> > > >
> > >
> >
> 
