Hi Kaniska,

We have an ongoing discussion on the Beam developer mailing list on
how exactly we'll implement sinks in Beam. I couldn't find the KafkaIO
source you mentioned. However, for a proof of concept we can quickly
implement something similar.

Please see my branch on GitHub [1] for an example with a Flink Kafka
Producer. I tested it using a local Kafka installation and the
existing KafkaWindowedWordCountExample.

You can easily check out the code from your existing Beam Git repository:

git fetch https://github.com/mxm/incubator-beam kafkaSink
git checkout FETCH_HEAD
mvn clean install -DskipTests

Then please see the modified KafkaWindowedWordCountExample where we
first read from Kafka and then write back to Kafka. In your Maven
project, please use version 0.1.0-incubating-SNAPSHOT.
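For reference, the wiring inside such a translator could look roughly like the sketch below. This is only a rough illustration, not the exact code on the branch: the translator interface shape, the context methods (getInputDataStream, getInput), the topic name, and the broker address are all assumptions modeled on the existing Flink runner translators.

```java
import java.nio.charset.StandardCharsets;
import java.util.Properties;

// Hypothetical translator that hands a KafkaIO.Write transform off to
// Flink's FlinkKafkaProducer08. Interface and method names are
// assumptions based on the Flink runner's translator pattern.
class KafkaIOWriteBoundStreamingTranslator<T>
    implements FlinkStreamingTransformTranslators
        .StreamTransformTranslator<KafkaIO.Write.Bound<T>> {

  @Override
  public void translateNode(KafkaIO.Write.Bound<T> transform,
                            FlinkStreamingTranslationContext context) {
    // Look up the Flink DataStream that was produced for this
    // transform's input collection.
    DataStream<T> inputDataStream =
        context.getInputDataStream(context.getInput(transform));

    // A SerializationSchema turns each element into the bytes that
    // Kafka stores; here we just serialize the toString() form.
    SerializationSchema<T> serializationSchema =
        element -> element.toString().getBytes(StandardCharsets.UTF_8);

    // Producer configuration; the broker address is a placeholder.
    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "localhost:9092");

    // Attach the Kafka producer as a sink on the input stream.
    inputDataStream.addSink(
        new FlinkKafkaProducer08<>("output-topic", serializationSchema, props));
  }
}
```

The essential step is the last one: once you have the input DataStream from the translation context, sending messages is just a matter of calling addSink with a configured FlinkKafkaProducer08.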

Best,
Max

[1] https://github.com/mxm/incubator-beam/tree/kafkaSink

On Fri, Apr 22, 2016 at 5:27 AM, kaniska Mandal
<[email protected]> wrote:
>
> I am referring to this post:
> From: Emanuele Cesena <[email protected]>
> Subject: Re: Output from Beam (on Flink) to Kafka
> Date: Fri, 18 Mar 2016 16:45:15 GMT
> Source:
> https://github.com/ecesena/oscars2016/blob/master/beam-twitter/src/main/java/com/shopkick/data/dataflow/TwitterDataflow.java
>
>>> I added the KafkaIO inside
>>> org.apache.beam.runners.flink.translation.wrappers.streaming.io
>
>>> I have also registered KafkaIOWriteBoundStreamingTranslator with
>>> KafkaIO.Write.Bound.class  inside - FlinkStreamingTransformTranslators
>
> Now I need help invoking FlinkKafkaProducer08 to send messages from the
> following method:
>
> public void translateNode(KafkaIO.Write.Bound<T> transform,
> FlinkStreamingTranslationContext context) {...}
>
> It would be great if someone can provide some hints.
>
> Thanks
>
> Kaniska
>
