Hi Gordon, thanks for your reply, I’ve already implemented it ;)
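
In case someone lands on this thread later, here is a rough sketch of what such a value-only Avro writer can look like. The class name AvroValueSinkWriter, the SpecificRecordBase bound and the snappy codec are just choices for the sketch (not something from the Flink API), and the Writer method set shown is the one from the 1.3.x filesystem connector, so it may need adjusting for other versions:

import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.file.CodecFactory;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.flink.streaming.connectors.fs.Writer;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Sketch of a Writer that stores single Avro SpecificRecord values
 * (no key/value pairing) in an Avro container file.
 */
public class AvroValueSinkWriter<T extends SpecificRecordBase> implements Writer<T> {

    // Avro Schema is not reliably Serializable, so keep the JSON form in the writer.
    private final String schemaString;

    private transient Schema schema;
    private transient FSDataOutputStream outputStream;
    private transient DataFileWriter<T> dataFileWriter;

    public AvroValueSinkWriter(Schema schema) {
        this.schemaString = schema.toString();
    }

    @Override
    public void open(FileSystem fs, Path path) throws IOException {
        if (outputStream != null) {
            throw new IllegalStateException("Writer is already open");
        }
        schema = new Schema.Parser().parse(schemaString);
        outputStream = fs.create(path, false);

        // Write Avro container-file blocks directly on top of the part-file stream.
        dataFileWriter = new DataFileWriter<>(new SpecificDatumWriter<T>(schema));
        dataFileWriter.setCodec(CodecFactory.snappyCodec());
        dataFileWriter.create(schema, outputStream);
    }

    @Override
    public void write(T element) throws IOException {
        dataFileWriter.append(element);
    }

    @Override
    public long flush() throws IOException {
        // Push buffered Avro blocks down to the HDFS stream; production code may
        // also want to hflush/hsync the stream for stronger durability guarantees.
        dataFileWriter.flush();
        return outputStream.getPos();
    }

    @Override
    public long getPos() throws IOException {
        return outputStream.getPos();
    }

    @Override
    public void close() throws IOException {
        if (dataFileWriter != null) {
            dataFileWriter.close(); // also closes the underlying stream
            dataFileWriter = null;
            outputStream = null;
        }
    }

    @Override
    public Writer<T> duplicate() {
        return new AvroValueSinkWriter<>(new Schema.Parser().parse(schemaString));
    }
}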

> On 27 Jul 2017, at 12:57, Tzu-Li (Gordon) Tai <tzuli...@apache.org> wrote:
> 
> Hi!
> 
> Yes, you can provide a custom writer for the BucketingSink via 
> BucketingSink#setWriter(…).
> The AvroKeyValueSinkWriter is a simple example of a writer that uses Avro for 
> serialization, and takes as input KV 2-tuples.
> If you want to have a writer that takes as input your own event types, AFAIK 
> you’ll need to implement your own Writer.
> 
> Cheers,
> Gordon
> 
> On 21 July 2017 at 7:31:21 PM, Rinat (r.shari...@cleverdata.ru 
> <mailto:r.shari...@cleverdata.ru>) wrote:
> 
>> Hi, folks !
>> 
>> I’ve got a little question, I’m trying to save stream of events from Kafka 
>> into HDSF using 
>> org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink with AVRO 
>> serialization. 
>> If I properly understood, I should use some implementation of 
>> org.apache.flink.streaming.connectors.fs.Writer<T> for this purposes.
>> 
>> I found an existing implementation of avro writer 
>> org.apache.flink.streaming.connectors.fs.AvroKeyValueSinkWriter<K, V>, but 
>> my stream contains only value. 
>> What I need to do, if I want to write values from stream using a 
>> BucketingSing in avro format ?
>> 
>> Thx.

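P.S. For completeness, wiring a writer like the one sketched above into the sink could look roughly like this. The base path, the hourly DateTimeBucketer and the 128 MB batch size are only illustrative choices, and Event stands for a generated Avro SpecificRecord class:

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink;
import org.apache.flink.streaming.connectors.fs.bucketing.DateTimeBucketer;

public class AvroBucketingJob {

    // "stream" is assumed to be the DataStream<Event> coming from the Kafka source.
    public static void attachSink(DataStream<Event> stream) {
        BucketingSink<Event> sink = new BucketingSink<>("hdfs:///data/events"); // base path is illustrative
        sink.setBucketer(new DateTimeBucketer<Event>("yyyy-MM-dd--HH"));        // one bucket per hour
        sink.setWriter(new AvroValueSinkWriter<>(Event.getClassSchema()));      // the writer sketched above
        sink.setBatchSize(1024L * 1024L * 128L);                                // roll part files at ~128 MB

        stream.addSink(sink);
    }
}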