As possible workarounds, you could
a) implement your own serialization by implementing Flink's "Value" interface
(see the first sketch below), or
b) use Hadoop's MapWritable class
(http://hadoop.apache.org/docs/r2.3.0/api/org/apache/hadoop/io/MapWritable.html).
You have to use Hadoop's LongWritable and IntWritable for the map's key and
value types, but Flink should be able to handle Writables in POJOs (see the
second sketch below).
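
For option a), a minimal, untested sketch of such a self-serializing wrapper
(the class name LongIntMapValue is invented for illustration):

    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    import org.apache.flink.core.memory.DataInputView;
    import org.apache.flink.core.memory.DataOutputView;
    import org.apache.flink.types.Value;

    // Wraps a Map<Long, Integer> and serializes it by hand,
    // bypassing Avro entirely.
    public class LongIntMapValue implements Value {

        private final Map<Long, Integer> map = new HashMap<Long, Integer>();

        public Map<Long, Integer> getMap() {
            return map;
        }

        @Override
        public void write(DataOutputView out) throws IOException {
            out.writeInt(map.size());
            for (Map.Entry<Long, Integer> e : map.entrySet()) {
                out.writeLong(e.getKey());
                out.writeInt(e.getValue());
            }
        }

        @Override
        public void read(DataInputView in) throws IOException {
            map.clear();
            int size = in.readInt();
            for (int i = 0; i < size; i++) {
                map.put(in.readLong(), in.readInt());
            }
        }
    }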

I would recommend option b).
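
For option b), a rough sketch of such a POJO (the class and method names are
made up for illustration):

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.MapWritable;

    // POJO with a MapWritable field; keys and values are the Hadoop
    // wrapper types, so no String map keys are needed.
    public class CountsPojo {

        public MapWritable counts = new MapWritable();

        public CountsPojo() {}

        public void add(long key, int value) {
            counts.put(new LongWritable(key), new IntWritable(value));
        }

        public int get(long key) {
            IntWritable v = (IntWritable) counts.get(new LongWritable(key));
            return v == null ? 0 : v.get();
        }
    }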

On Fri, Dec 5, 2014 at 10:22 AM, Stephan Ewen <se...@apache.org> wrote:

> We are in the midst of replacing Avro in the serialization stack, so this
> should soon be fixed properly.
>
> Until then, you could try to re-package the collection as something like an
> array of map entries (see the sketch below). Would that be feasible?
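>
> A rough, self-contained sketch of that re-packaging, assuming the field is
> a Map<Long, Integer> (the class name FlatLongIntMap is invented for
> illustration):
>
>     import java.util.HashMap;
>     import java.util.Map;
>
>     // The map flattened into parallel arrays, which serialize as a
>     // plain POJO without any map keys involved.
>     public class FlatLongIntMap {
>
>         public long[] keys;
>         public int[] vals;
>
>         public FlatLongIntMap() {}
>
>         public static FlatLongIntMap fromMap(Map<Long, Integer> map) {
>             FlatLongIntMap f = new FlatLongIntMap();
>             f.keys = new long[map.size()];
>             f.vals = new int[map.size()];
>             int i = 0;
>             for (Map.Entry<Long, Integer> e : map.entrySet()) {
>                 f.keys[i] = e.getKey();
>                 f.vals[i] = e.getValue();
>                 i++;
>             }
>             return f;
>         }
>
>         // rebuild the original map on the receiving side
>         public Map<Long, Integer> toMap() {
>             Map<Long, Integer> m = new HashMap<Long, Integer>();
>             for (int i = 0; i < keys.length; i++) {
>                 m.put(keys[i], vals[i]);
>             }
>             return m;
>         }
>     }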
>
> Stephan
> On 04.12.2014 at 21:42, "Aljoscha Krettek" <aljos...@apache.org> wrote:
>
> > I don't know of any workaround, but maybe Avro should be avoided
> > altogether for your requirements.
> >
> > What is the data that you want to move between operations?
> >
> > On Thu, Dec 4, 2014 at 7:13 PM, Paris Carbone <par...@kth.se> wrote:
> > > Hello,
> > >
> > > It seems that Avro fails to serialise POJOs whose map keys are neither
> > > Strings nor "stringable"
> > > (https://apache.googlesource.com/avro/+/40650540dcb8ca8a6b6235de5cdd36c0f6e2eb31/lang/java/avro/src/main/java/org/apache/avro/reflect/ReflectData.java#361).
> > > E.g., in the example here
> > > (https://github.com/senorcarbone/incubator-flink/blob/72b6798b50396c962fc6cea20a2bcdd51eec06f4/flink-examples/flink-java-examples/src/main/java/org/apache/flink/examples/java/testing/LongMapKeyIssueExample.java)
> > > I get a compiler exception caused by:
> > >
> > > org.apache.avro.AvroTypeException: Map key class not String: class
> > > java.lang.Long
> > >
> > > Is there any known workaround or recommendation for this, other than
> > > using String keys?
> > > I need this for a low-latency, data-intensive streaming use case, so
> > > String conversions should be avoided.
> > >
> > > Paris
> > >
> > >
> >
>
