On Javadocs: both new clients (producer and consumer) are thoroughly
documented in the javadocs. 0.9.0.0 will be the first release to include
the new consumer.

On deserialization, the new consumer lets you specify deserializers just
like you do for the new producer. But the old consumer supports this as
well -- you can use the createMessageStreams(topicCountMap, keyDecoder,
valueDecoder) form to specify how keys and values should be decoded from
byte[] to a Java object.
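For the new consumer this ends up as plain config properties, mirroring the
producer side. A minimal sketch (broker address and group id are placeholders;
the deserializer class names are the built-in ones from
org.apache.kafka.common.serialization):

```java
import java.util.Properties;

public class ConsumerConfigExample {
    // Build a config for the new (0.9) consumer. Deserializers are named as
    // properties, just like the producer's serializers. The old high-level
    // consumer instead takes Decoder instances directly via
    // createMessageStreams(topicCountMap, keyDecoder, valueDecoder).
    static Properties newConsumerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder
        props.put("group.id", "example-group");           // placeholder
        // Built-in deserializers cover a few primitive types; plug in your
        // own Deserializer implementation for richer types.
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(newConsumerConfig().getProperty("key.deserializer"));
    }
}
```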

On JSON, generally Kafka has only provided very simple
serializers/deserializers that don't introduce any additional dependencies
-- byte[], string, int, etc. Essentially the built-in support covers only a
few primitive types. There actually is a JsonSerializer and
JsonDeserializer now in trunk (and they will be in 0.9.0.0) because Copycat,
Kafka's new import/export tool, needs to ship with *some* serializer that
can handle complex data. However, I'm not sure it works quite like what you
probably want -- it uses Jackson JsonNodes, whereas I'm guessing you'd want
to be able to pass in any POJO.
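To make the POJO-style round trip concrete, here's a rough, dependency-free
sketch. The Serializer/Deserializer shapes are reproduced locally (without
configure()/close()) so it compiles without the Kafka jars, the Greeting POJO
is hypothetical, and the JSON handling is deliberately naive -- a real
implementation would delegate to a data-binding library like Jackson's
ObjectMapper to handle arbitrary POJOs, escaping, nesting, and so on:

```java
import java.nio.charset.StandardCharsets;

public class PojoJsonSketch {
    // Rough shapes of the new clients' (de)serializer interfaces,
    // reproduced locally so this sketch is self-contained.
    interface Serializer<T> { byte[] serialize(String topic, T data); }
    interface Deserializer<T> { T deserialize(String topic, byte[] data); }

    // A hypothetical POJO with a single field.
    static class Greeting {
        final String message;
        Greeting(String message) { this.message = message; }
    }

    // Naive JSON encoding of the one field; real code would use a
    // data-binding library instead of string concatenation.
    static final Serializer<Greeting> SER = (topic, g) ->
        ("{\"message\":\"" + g.message + "\"}")
            .getBytes(StandardCharsets.UTF_8);

    // Naive decoding: pull the value out from between the quotes after
    // the colon. Works only for this toy single-field format.
    static final Deserializer<Greeting> DESER = (topic, bytes) -> {
        String json = new String(bytes, StandardCharsets.UTF_8);
        int start = json.indexOf(':') + 2;
        int end = json.lastIndexOf('"');
        return new Greeting(json.substring(start, end));
    };

    public static void main(String[] args) {
        Greeting out =
            DESER.deserialize("t", SER.serialize("t", new Greeting("hi")));
        System.out.println(out.message); // hi
    }
}
```

The point is that the deserializer hands back your own type, not a generic
JsonNode tree that you then have to walk yourself.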

The next version of Confluent Platform will ship with a JsonSerializer that
has the behavior I think you're looking for -- see
https://github.com/confluentinc/schema-registry/tree/master/json-serializer/src/main/java/io/confluent/kafka/serializers.
It's also been integrated with Confluent's REST proxy.

-Ewen

On Sun, Oct 11, 2015 at 9:04 AM, Andrew Pennebaker <
andrew.penneba...@gmail.com> wrote:

> Will Kafka v0.9 publish official javadocs for the entire API? In 0.8,
> javadocs appear rather sparse. It's hard to find a javadoc that documents
> both Consumers and Producers.
>
> Also, will future versions of Kafka have more intuitive
> serializer/deserializer interfaces? E.g., if a Producer can configure an
> automatic POJO -> byte[] serializer, why does the Consumer API not have the
> option to configure an automatic byte[] -> POJO deserializer?
>
> Could a basic JSON serializer/deserializer be included? JSON's a common
> enough wire format that I think it's reasonable to include a decent one by
> default, so Kafka users aren't reinventing that wheel.
>
> --
> Cheers,
> Andrew
>



-- 
Thanks,
Ewen
