Hi Costin,

Thanks for the tip. I replaced the old version of Jackson and it works now :).
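
For anyone else hitting the same NoSuchMethodError, this is roughly the shape of the fix Costin suggested: find which codehaus Jackson version is on the Hadoop classpath, and if it predates 1.7 (per Costin, the cutoff for `JsonGenerator.writeUTF8String`), swap in newer jars. A minimal sketch only - the sample classpath string and the lib paths below are placeholders/assumptions, not my actual MapR layout:

```shell
# Sketch: inspect which codehaus Jackson jars Hadoop would load.
# On the cluster you would feed `hadoop classpath` into this pipeline;
# here a sample classpath string stands in for it (paths are assumptions).
CP="/opt/lib/guava-11.0.jar:/opt/lib/jackson-core-asl-1.5.2.jar:/opt/lib/jackson-mapper-asl-1.5.2.jar"

# Extract the Jackson version(s) from the jar names.
printf '%s\n' "$CP" | tr ':' '\n' | grep 'jackson' \
    | sed 's/.*-asl-\(.*\)\.jar/\1/' | sort -u

# Helper: is version $1 >= version $2? (relies on GNU `sort -V`)
ver_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | tail -n 1)" = "$1" ]
}

# Per Costin's advice, anything >= 1.7 has JsonGenerator.writeUTF8String.
if ver_ge 1.5.2 1.7; then
    echo "Jackson is new enough"
else
    echo "Jackson predates 1.7 - replace the jars"
fi

# The actual swap, run on every node (back up first!); $MAPR_LIB is an
# assumption - locate your real lib directory via `hadoop classpath`:
#   mv "$MAPR_LIB"/jackson-core-asl-*.jar "$MAPR_LIB"/jackson-mapper-asl-*.jar /tmp/backup/
#   cp jackson-core-asl-1.8.8.jar jackson-mapper-asl-1.8.8.jar "$MAPR_LIB"/
```

After the swap, restart the TaskTracker/node services so running JVMs pick up the new jars.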

Cheers
Shankar

On Sunday, June 15, 2014 3:09:27 AM UTC-6, Costin Leau wrote:
>
> What version of MapR are you using? MapR uses an old version of Jackson, 
> which es-hadoop should detect so that it can use an appropriate code path. 
> There are a couple of fixes: 
>
> 1. I've pushed a fix on the 2.x branch which improves detection - you can 
> try the 2.0.1.BUILD-SNAPSHOT version here [a]. 
> 2. You can upgrade the Jackson version in MapR to 1.7 or higher (vanilla 
> Hadoop uses 1.8.8). This approach works with the current es-hadoop and 
> also gives you a performance boost when serializing data. 
>
> Cheers, 
>
> [a] https://github.com/elasticsearch/elasticsearch-hadoop#development-snapshot 
>
> On 6/13/14 11:30 PM, [email protected] wrote: 
> > Hi, 
> > 
> > I am trying to integrate Elasticsearch with a MapR Hadoop cluster, 
> > following the Hive-Elasticsearch integration document. I am able to read 
> > data from the Elasticsearch node, but I am not able to write data to it, 
> > which is my primary requirement. Could you please advise? 
> > 
> > I always get the following errors: 
> > 
> > 2014-06-13 14:15:45,814 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: New Final Path: FS maprfs:/user/hive/warehouse/dev.db/_tmp.shankar/000002_0 
> > 2014-06-13 14:15:45,947 FATAL org.apache.hadoop.hive.ql.exec.mr.ExecMapper: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonGenerator.writeUTF8String([BII)V 
> >         at org.elasticsearch.hadoop.serialization.json.JacksonJsonGenerator.writeUTF8String(JacksonJsonGenerator.java:123) 
> >         at org.elasticsearch.hadoop.mr.WritableValueWriter.write(WritableValueWriter.java:47) 
> >         at org.elasticsearch.hadoop.hive.HiveWritableValueWriter.write(HiveWritableValueWriter.java:83) 
> >         at org.elasticsearch.hadoop.hive.HiveWritableValueWriter.write(HiveWritableValueWriter.java:38) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:69) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:111) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:55) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:41) 
> >         at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:258) 
> >         at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:92) 
> >         at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:79) 
> >         at org.elasticsearch.hadoop.hive.EsSerDe.serialize(EsSerDe.java:128) 
> >         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:582) 
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793) 
> >         at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:87) 
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793) 
> >         at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:92) 
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793) 
> >         at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:540) 
> >         at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:177) 
> >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50) 
> >         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417) 
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:348) 
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:282) 
> >         at java.security.AccessController.doPrivileged(Native Method) 
> >         at javax.security.auth.Subject.doAs(Subject.java:415) 
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1117) 
> >         at org.apache.hadoop.mapred.Child.main(Child.java:271) 
> > 
> > 2014-06-13 14:15:45,947 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3 finished. closing... 
> > 2014-06-13 14:15:45,947 INFO org.apache.hadoop.hive.ql.exec.MapOperator: DESERIALIZE_ERRORS:0 
> > 2014-06-13 14:15:45,947 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 finished. closing... 
> > 2014-06-13 14:15:45,947 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 1 finished. closing... 
> > 2014-06-13 14:15:45,947 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: 2 finished. closing... 
> > 2014-06-13 14:15:45,948 INFO org.apache.hadoop.hive.ql.exec.FileSinkOperator: 2 Close done 
> > 2014-06-13 14:15:45,948 INFO org.apache.hadoop.hive.ql.exec.SelectOperator: 1 Close done 
> > 2014-06-13 14:15:45,948 INFO org.apache.hadoop.hive.ql.exec.TableScanOperator: 0 Close done 
> > 2014-06-13 14:15:45,948 INFO org.apache.hadoop.hive.ql.exec.MapOperator: 3 Close done 
> > 2014-06-13 14:15:45,948 INFO org.apache.hadoop.hive.ql.exec.mr.ExecMapper: ExecMapper: processed 0 rows: used memory = 9514320 
> > 2014-06-13 14:15:45,992 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1 
> > 2014-06-13 14:15:46,024 WARN org.apache.hadoop.mapred.Child: Error running child 
> > java.lang.RuntimeException: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonGenerator.writeUTF8String([BII)V 
> >         at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:195) 
> >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50) 
> >         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417) 
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:348) 
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:282) 
> >         at java.security.AccessController.doPrivileged(Native Method) 
> >         at javax.security.auth.Subject.doAs(Subject.java:415) 
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1117) 
> >         at org.apache.hadoop.mapred.Child.main(Child.java:271) 
> > Caused by: java.lang.NoSuchMethodError: org.codehaus.jackson.JsonGenerator.writeUTF8String([BII)V 
> >         at org.elasticsearch.hadoop.serialization.json.JacksonJsonGenerator.writeUTF8String(JacksonJsonGenerator.java:123) 
> >         at org.elasticsearch.hadoop.mr.WritableValueWriter.write(WritableValueWriter.java:47) 
> >         at org.elasticsearch.hadoop.hive.HiveWritableValueWriter.write(HiveWritableValueWriter.java:83) 
> >         at org.elasticsearch.hadoop.hive.HiveWritableValueWriter.write(HiveWritableValueWriter.java:38) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:69) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:111) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:55) 
> >         at org.elasticsearch.hadoop.hive.HiveValueWriter.write(HiveValueWriter.java:41) 
> >         at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:258) 
> >         at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:92) 
> >         at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:79) 
> >         at org.elasticsearch.hadoop.hive.EsSerDe.serialize(EsSerDe.java:128) 
> >         at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:582) 
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793) 
> >         at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:87) 
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793) 
> >         at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:92) 
> >         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:793) 
> >         at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:540) 
> >         at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:177) 
> >         ... 8 more 
> > 
> > -- 
> > You received this message because you are subscribed to the Google 
> Groups "elasticsearch" group. 
> > To unsubscribe from this group and stop receiving emails from it, send 
> an email to 
> > [email protected] <javascript:> <mailto:
> [email protected] <javascript:>>. 
> > To view this discussion on the web visit 
> > 
> https://groups.google.com/d/msgid/elasticsearch/0a194097-183f-48e5-bb63-a0805214f5e8%40googlegroups.com
>  
> > <
> https://groups.google.com/d/msgid/elasticsearch/0a194097-183f-48e5-bb63-a0805214f5e8%40googlegroups.com?utm_medium=email&utm_source=footer>.
>  
>
> > For more options, visit https://groups.google.com/d/optout. 
>
> -- 
> Costin 
>
