Hi Lewis,

Thanks for recommending the patch!

The issue is now resolved, though it needed a small tweak.

There was a typo in the 'SolrDeleteDuplicates.java' class: I had to change
"import org.apache.hadoop*hadoop*.mapreduce.Mapper" to
"import org.apache.hadoop.mapreduce.Mapper".
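
For anyone else hitting the same compile error, the corrected line at the
top of SolrDeleteDuplicates.java is just the standard Hadoop Mapper import;
the broken form is kept below only as a comment for comparison, and nothing
else in the file needed changing on my side:

    // Broken form as quoted above (the duplicated "hadoop" segment does not compile):
    // import org.apache.hadoop*hadoop*.mapreduce.Mapper;

    // Corrected import:
    import org.apache.hadoop.mapreduce.Mapper;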


Thanks & Regards,
Jamshaid


On Tue, Jun 25, 2013 at 12:13 AM, Lewis John Mcgibbney <[email protected]> wrote:

> Hi Jamshaid,
> Please see the Jira issue and patch for this.
> https://issues.apache.org/jira/browse/NUTCH-1571
> I would like to commit a patch for 2.x regarding how we were writing bytes.
> If you can test this patch then we can maybe add it as well and push 2.2.1.
> Thank you
> Lewis
>
> On Monday, June 24, 2013, Jamshaid Ashraf <[email protected]> wrote:
> > Hi,
> >
> > I'm using Nutch 2.x with HBase.
> >
> > I have been facing the following error in the console whenever I run the
> > solr dedup command:
> >
> > SOLR dedup -> http://localhost:8080/solr/
> > Exception in thread "main" java.lang.NullPointerException
> >   at org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
> >   at org.apache.hadoop.mapreduce.split.JobSplitWriter.writeNewSplits(JobSplitWriter.java:123)
> > *Below is the hadoop log:*
> >
> > 2013-06-24 18:26:05,414 INFO  solr.SolrDeleteDuplicates - SolrDeleteDuplicates: starting...
> > 2013-06-24 18:26:05,414 INFO  solr.SolrDeleteDuplicates - SolrDeleteDuplicates: Solr url: http://localhost:8080/solr/
> > 2013-06-24 18:26:05,970 WARN  util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> > 2013-06-24 18:26:05,984 WARN  mapred.JobClient - No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
> >
> > Could anybody help me get rid of this problem?
> >
> > Thanks in advance!
> >
> > Regards,
> > Jamshaid
> >
>
> --
> *Lewis*
>
