Having multiple types shouldn't be an issue - ES is a document store, so it's 
pretty common to have different types.
In other words, this is not the intended behaviour - could you please create a 
small sample/snippet that reproduces the error
and raise an issue for it [1]?

Thanks!

[1] http://www.elasticsearch.org/guide/en/elasticsearch/hadoop/master/troubleshooting.html
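
A minimal reproduction along those lines might look like the sketch below. It is an assumption-laden illustration, not code from the thread: the class name, node address, index/type name and input path handling are all hypothetical; only EsOutputFormat, MapWritable and the es.nodes/es.resource settings come from the messages below.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.elasticsearch.hadoop.mr.EsOutputFormat;

// Hypothetical minimal repro: a map-only job emitting a MapWritable
// whose values mix Text and LongWritable, written through EsOutputFormat.
public class MixedTypeRepro {

    public static class ReproMapper
            extends Mapper<LongWritable, Text, NullWritable, MapWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws java.io.IOException, InterruptedException {
            MapWritable doc = new MapWritable();
            doc.put(new Text("name"),  new Text("sample"));    // Text value
            doc.put(new Text("count"), new LongWritable(1L));  // non-Text value
            context.write(NullWritable.get(), doc);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("es.nodes", "localhost:9200");  // assumed local ES node
        conf.set("es.resource", "test/mixed");   // hypothetical index/type
        Job job = Job.getInstance(conf, "es-mixed-type-repro");
        job.setJarByClass(MixedTypeRepro.class);
        job.setMapperClass(ReproMapper.class);
        job.setOutputFormatClass(EsOutputFormat.class);
        job.setMapOutputValueClass(MapWritable.class);
        job.setNumReduceTasks(0);                // map-only, as in the thread
        FileInputFormat.addInputPath(job, new Path(args[0])); // any small HDFS file
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```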

On 12/15/14 6:03 PM, Kamil Dziublinski wrote:
Hi,

I had only one jar on the classpath and none on the Hadoop cluster.
I did have different types of values in my MapWritable, though. It turns out 
this was the problem:
the key was always Text, but the value was Text, LongWritable, 
BooleanWritable or DoubleWritable, depending on the field.
When I changed everything to be Text, it started working.
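
A sketch of the two map layouts described above (field names and values are hypothetical; only the Writable types come from the message):

```java
import org.apache.hadoop.io.*;

// Hypothetical illustration of the two value layouts discussed above.
// With es-hadoop 2.0.x the mixed-type map triggered the serialization
// error reported below; the all-Text map did not.
public class MapLayouts {
    public static void main(String[] args) {
        MapWritable mixed = new MapWritable();               // failed
        mixed.put(new Text("user"),   new Text("kamil"));
        mixed.put(new Text("visits"), new LongWritable(42L));
        mixed.put(new Text("active"), new BooleanWritable(true));
        mixed.put(new Text("score"),  new DoubleWritable(0.75));

        MapWritable allText = new MapWritable();             // worked
        allText.put(new Text("user"),   new Text("kamil"));
        allText.put(new Text("visits"), new Text("42"));
        allText.put(new Text("active"), new Text("true"));
        allText.put(new Text("score"),  new Text("0.75"));

        // every value in the working map is a Text instance
        System.out.println(mixed.get(new Text("visits")).getClass().getSimpleName());
        System.out.println(allText.get(new Text("visits")).getClass().getSimpleName());
    }
}
```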

Is this intended behaviour?

Cheers,
Kamil.

On Friday, December 12, 2014 8:37:03 PM UTC+1, Costin Leau wrote:

    Hi,

    This error is typically tied to a classpath issue - make sure you have only 
one elasticsearch-hadoop jar version in
    your classpath and on the Hadoop cluster.

    On 12/12/14 5:56 PM, Kamil Dziublinski wrote:
    > Hi guys,
    >
    > I am trying to run a MR job that reads from HDFS and stores into an 
Elasticsearch cluster.
    >
    > I am getting the following error:
    > Error: org.elasticsearch.hadoop.serialization.EsHadoopSerializationException: Cannot handle type [class
    > org.apache.hadoop.io.MapWritable], instance [org.apache.hadoop.io.MapWritable@3879429f] using writer
    > [org.elasticsearch.hadoop.mr.WritableValueWriter@3fc8f1a2]
    >          at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:259)
    >          at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:68)
    >          at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:55)
    >          at org.elasticsearch.hadoop.rest.RestRepository.writeToIndex(RestRepository.java:130)
    >          at org.elasticsearch.hadoop.mr.EsOutputFormat$EsRecordWriter.write(EsOutputFormat.java:159)
    >          at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:635)
    >          at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    >          at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    >          at com.teradata.cybershot.mr.es.userprofile.EsOnlineProfileMapper.map(EsOnlineProfileMapper.java:35)
    >          at com.teradata.cybershot.mr.es.userprofile.EsOnlineProfileMapper.map(EsOnlineProfileMapper.java:20)
    >          at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    >          at org.apache.hadoop.mapreduce.lib.input.DelegatingMapper.run(DelegatingMapper.java:55)
    >          at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    >          at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
    >          at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
    >          at java.security.AccessController.doPrivileged(Native Method)
    >          at javax.security.auth.Subject.doAs(Subject.java:415)
    >          at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
    >          at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
    >
    > We are using CDH 5.1.0 and the es-hadoop 2.0.2 dependency.
    >
    > I have this set in my job configuration:
    > job.setOutputFormatClass(EsOutputFormat.class);
    > job.setMapOutputValueClass(MapWritable.class);
    >
    > together with the nodes and resource properties, as described on the ES page.
    >
    > In my mapper I simply write: context.write(NullWritable.get(), esMap); 
where esMap is an org.apache.hadoop.io.MapWritable.
    >
    > I do not know why it's failing, as everything looks OK to me. Maybe you 
will have some ideas.
    >
    > Thanks in advance,
    > Kamil.
    >
    > --
    > You received this message because you are subscribed to the Google Groups "elasticsearch" group.
    > To unsubscribe from this group and stop receiving emails from it, send an email to
    > [email protected].
    > To view this discussion on the web visit
    > https://groups.google.com/d/msgid/elasticsearch/71c57e2a-2210-47c0-aa9e-cbbf164ef05b%40googlegroups.com.
    > For more options, visit https://groups.google.com/d/optout.

    --
    Costin


--
Costin

