Hi!

It should be quite straightforward to write an "OutputFormat" that wraps
the "FlinkKafkaProducer".

That way you can write to Kafka from a DataSet program.
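A minimal sketch of what that could look like. For simplicity this uses the plain Kafka producer client directly inside a RichOutputFormat rather than wrapping the FlinkKafkaProducer class itself; the broker address, topic name, and the String record type are placeholders you'd adapt to your job:

```java
import java.util.Properties;

import org.apache.flink.api.common.io.RichOutputFormat;
import org.apache.flink.configuration.Configuration;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

/**
 * Sketch of an OutputFormat that writes each record of a DataSet
 * to a Kafka topic. One producer is created per parallel task.
 */
public class KafkaOutputFormat extends RichOutputFormat<String> {

    private final String brokers;
    private final String topic;
    private transient KafkaProducer<String, String> producer;

    public KafkaOutputFormat(String brokers, String topic) {
        this.brokers = brokers;
        this.topic = topic;
    }

    @Override
    public void configure(Configuration parameters) {
        // nothing to configure up front
    }

    @Override
    public void open(int taskNumber, int numTasks) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", brokers);
        props.setProperty("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.setProperty("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        producer = new KafkaProducer<>(props);
    }

    @Override
    public void writeRecord(String record) {
        producer.send(new ProducerRecord<>(topic, record));
    }

    @Override
    public void close() {
        if (producer != null) {
            producer.close(); // flushes any pending records
        }
    }
}
```

Then in the batch job: dataSet.output(new KafkaOutputFormat("localhost:9092", "my-topic"));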

Stephan



On Fri, Mar 11, 2016 at 1:46 PM, Prez Cannady <revp...@correlatesystems.com>
wrote:

> This is roughly the solution I have now.  On the other hand, I was hoping
> for a solution that doesn’t involve checking whether a file has updated.
>
> Prez Cannady
> p: 617 500 3378
> e: revp...@opencorrelate.org
> GH: https://github.com/opencorrelate
> LI: https://www.linkedin.com/in/revprez
>
>
> On Mar 11, 2016, at 12:20 AM, Balaji Rajagopalan <
> balaji.rajagopa...@olacabs.com> wrote:
>
> You could, I suppose, write the DataSet to a file sink and then read the
> file into a DataStream.
>
> On Fri, Mar 11, 2016 at 4:18 AM, Prez Cannady <revp...@opencorrelate.org>
> wrote:
>
>>
>> I’d like to push the contents of a DataSet I’ve populated via JDBC into a
>> Kafka topic, but I think I need to transform my DataSet into a DataStream
>> first.  If anyone has a clue how to proceed, I’d appreciate it; or let me
>> know if I’m completely off track.
>>
>>
>> Prez Cannady
>> p: 617 500 3378
>> e: revp...@opencorrelate.org
>> GH: https://github.com/opencorrelate
>> LI: https://www.linkedin.com/in/revprez
>>
>>
>
>
