To be honest, I am not a Nutch guru. If I were you, I would run solrindex
on each segment one by one for now (or write a script to automate it).
Solr will combine the results from each segment; I have tested that before.
Since your system is in production, please verify this yourself first. :-)
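A minimal shell sketch of that per-segment loop (the directory layout, segment names, and Solr URL below are placeholders for illustration; the loop only echoes each command so you can inspect it before running anything against your production index):

```shell
# Sketch only: build a throwaway crawl layout so the loop has something
# to iterate over, then dry-run one solrindex invocation per segment.
crawldir=$(mktemp -d)                      # stand-in for your real crawl dir
mkdir -p "$crawldir/segments/20110726101010" \
         "$crawldir/segments/20110727101010"

solrurl=http://127.0.0.1:8983/solr/main/   # your Solr core URL

for segment in "$crawldir"/segments/*; do
  # Remove the leading "echo" to actually run each command.
  echo bin/nutch solrindex "$solrurl" \
       "$crawldir/crawldb" "$crawldir/linkdb" "$segment"
done
```

Each iteration hands solrindex exactly one segment directory, which matches the behavior you saw working when you passed "$crawldir/segments/segmentName" by hand.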

Down the road, you can probably skip solrindex altogether and run the crawl
command directly, for example:
bin/nutch crawl firstSite/urls -solr http://localhost:8983/solr -depth 3
-topN 50

Have fun.

On Wed, Jul 27, 2011 at 2:50 AM, Marseld Dedgjonaj <
[email protected]> wrote:

> Hello,
> Thanks for your responses.
>
> Way Cool: For now I need the quickest solution, because the system is in
> production. I will upgrade to Nutch 1.3 and Solr 3.3, but at another time.
>
> Alxsss: No error in Solr log.
>
> I ran some more tests, passing solrindex not "$crawldir/segments/*" but
> "$crawldir/segments/segmentName" as the segments parameter, and it works.
> Maybe I should pass only the segments created during this crawl cycle.
> If that is the problem, is there a way to pass more than one segment as a
> parameter?
> For example: during the crawl, three segments are created (first, second,
> third). How do I pass all three as parameters to solrindex?
>
> Thanks in advance!
>
> Marseldi
>
> -----Original Message-----
> From: [email protected] [mailto:[email protected]]
> Sent: Wednesday, July 27, 2011 12:23 AM
> To: [email protected]
> Subject: Re: solrindex command not working
>
> Check for errors in the Solr log.
>
>
> -----Original Message-----
> From: Way Cool <[email protected]>
> To: user <[email protected]>
> Sent: Tue, Jul 26, 2011 3:14 pm
> Subject: Re: solrindex command not working
>
>
> The latest Solr version is 3.3. Maybe you can try that.
>
>
> On Tue, Jul 26, 2011 at 2:10 AM, Marseld Dedgjonaj <
> [email protected]> wrote:
>
> > Hello list,
> >
> > I am having trouble with the indexing step of Nutch + Solr.
> >
> > I use Nutch 1.2 and Solr 1.3.
> >
> > When I execute command:
> >
> > $NUTCH_HOME/bin/nutch solrindex http://127.0.0.1:8983/solr/main/
> > $crawldir/crawldb $crawldir/linkdb $crawldir/segments/*
> >
> > I get "2011-07-25 20:11:59,702 ERROR solr.SolrIndexer -
> > java.io.IOException: Job failed!" and no documents reach Solr.
> >
> >
> >
> > Any idea what I am doing wrong?
> >
> >
> >
> > Thanks in advance,
> >
> > Marseld
> >
> >
> >
> >
> >
> >
>
>
>
>
>
>
>
