Hi,

I am trying to adapt Mark Miller's solr-map-reduce-example scripts in order
to use MapReduceIndexerTool with Solr 5.0.0 and Hadoop 2.6.0.
I am using the same twitter sample data with the same Avro configuration, ...

I had to change the set-map-reduce-classpath.sh file provided with Solr 5
under server/scripts/map-reduce/, because it still uses the old path
"$solr_distrib/example/" instead of "$solr_distrib/server/"
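
The substitution I applied to the script, demonstrated here on a sample line (the exact variable names inside set-map-reduce-classpath.sh may differ; `dir=...` is just an illustrative line):

```shell
# Demonstrate the path fix: rewrite the old Solr 4 "example/" layout
# to the Solr 5 "server/" layout, as done in set-map-reduce-classpath.sh.
old='dir="$solr_distrib/example/lib/ext"'
new=$(printf '%s\n' "$old" | sed 's|/example/|/server/|')
echo "$new"
# dir="$solr_distrib/server/lib/ext"
```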

The variables HADOOP_CLASSPATH and HADOOP_LIBJAR are set correctly, but
when I launch the following command:

$HADOOP_HOME/bin/hadoop \
  --config $HADOOP_CONF_DIR \
  jar $SOLR_HOME/dist/solr-map-reduce-*.jar \
  --libjars "$HADOOP_LIBJAR" \
  -D 'mapred.child.java.opts=-Xmx500m' \
  --morphline-file $ROOT_DIR/solr-map-reduce-example/readAvroContainer.conf \
  --zk-host 127.0.0.1:2181 \
  --output-dir hdfs://127.0.0.1:9000/outdir \
  --collection $collection \
  --log4j $ROOT_DIR/solr-map-reduce-example/log4j.properties \
  --go-live \
  --verbose "hdfs://127.0.0.1:9000/indir"

I get this error:

1227 [main] INFO  org.apache.hadoop.mapreduce.JobSubmitter  - Cleaning up
the staging area
file:/tmp/hadoop-bejean/mapred/staging/bejean267256503/.staging/job_local267256503_0001
Exception in thread "main" java.io.FileNotFoundException: File does not
exist:
hdfs://localhost:9000/opt/solr-hadoop/solr/dist/solr-analysis-extras-5.0.0.jar


The MapReduce job tries to find the jar file in HDFS
("hdfs://localhost:9000/opt/solr-hadoop/solr/dist/...") and not on the
local file system ("/opt/solr-hadoop/solr/dist/").
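
For what it's worth, the path in the exception looks like the scheme-less local jar path simply qualified against the default filesystem. This is only my guess from the error message; the `fs_default` value below is assumed to match fs.defaultFS in my core-site.xml:

```shell
# Illustration: prepending fs.defaultFS to the scheme-less local jar path
# reproduces exactly the HDFS path reported in the exception.
fs_default="hdfs://localhost:9000"   # assumption: fs.defaultFS from core-site.xml
jar="/opt/solr-hadoop/solr/dist/solr-analysis-extras-5.0.0.jar"
echo "${fs_default}${jar}"
# hdfs://localhost:9000/opt/solr-hadoop/solr/dist/solr-analysis-extras-5.0.0.jar
```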

Maybe I forgot a step, and I need to push the jar files into HDFS first?

Thank you for your help

Dominique
