Hello,

I currently have a working Nutch 1.0 installation that crawls our website and 
then dumps the data into a Solr instance that we have.  I decided to set up a 
1.1rc3 installation and copied the relevant configuration over from the working 
1.0 installation.  The crawl indexes fine, but when I try to push it to Solr it 
silently fails, and I find the following in my hadoop.log:

[...]
2010-05-14 16:34:17,580 INFO  solr.SolrMappingReader - source: url dest: url
2010-05-14 16:34:19,555 WARN  mapred.LocalJobRunner - job_local_0001
org.apache.solr.common.SolrException: ERROR: multiple values encountered for 
non multiValued copy field id: http://www.mydomain.com/

ERROR: multiple values encountered for non multiValued copy field id: 
http://www.mydomain.com/
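
If it helps, my best guess from the "source: url dest: url" mapping line is that 
the Solr schema ends up copying the url into the single-valued id field more 
than once.  A minimal schema.xml fragment that would trigger this error would 
look something like the following (field and copyField names are my assumption 
based on the error message, not my actual config):

<!-- hypothetical schema.xml fragment; "id" is single-valued, so a second
     copied value for the same document raises the SolrException above -->
<field name="id" type="string" indexed="true" stored="true" multiValued="false"/>
<field name="url" type="string" indexed="true" stored="true"/>
<copyField source="url" dest="id"/>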

I invoke it with a script like:

bin/nutch crawl urls -dir crawl -depth 10 -topN 10000
bin/nutch solrindex $URL crawl/crawldb crawl/linkdb crawl/segments/*

Any clues as to what might have changed to break this?

Eric
