I have some Kafka experience, but I haven't used Kafka Streams. Still, I
think the most straightforward approach would be to have a Kafka Producer
on the PLC4X side simply writing to one or more Kafka topics, which is
relatively simple.

You need to configure the producer when creating the instance:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

Properties properties = new Properties();
// cli.kafka holds the bootstrap server list, e.g. "host1:9092,host2:9092"
properties.put( ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, cli.kafka );
properties.put( ProducerConfig.LINGER_MS_CONFIG, 1 );
properties.put( ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
StringSerializer.class.getName() );
properties.put( ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
StringSerializer.class.getName() );
KafkaProducer<String, String> producer = new KafkaProducer<>( properties );

And once you have the producer, you can push a message to the topic, with
or without a key, like this:

producer.send( new ProducerRecord<>( "your-topic", value ) );      // no key
producer.send( new ProducerRecord<>( "your-topic", key, value ) ); // with key



I am pretty sure a Kafka Streams application can then take over by
consuming from that topic.
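For completeness, a minimal sketch of what the Streams side could look like
(class name and topic are my own placeholders, matching the "your-topic"
above; assumes kafka-streams is on the classpath). It only builds a topology
that reads the topic and prints each record; describing the topology works
without a running broker:

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;

public class Plc4xStreamsSketch {

    // Build a topology that consumes the topic the producer writes to
    // and prints every key/value pair it sees.
    static Topology buildTopology() {
        StreamsBuilder builder = new StreamsBuilder();
        builder.stream( "your-topic",
                        Consumed.with( Serdes.String(), Serdes.String() ) )
               .foreach( (key, value) -> System.out.println( key + " -> " + value ) );
        return builder.build();
    }

    public static void main( String[] args ) {
        // Printing the description lets you inspect the topology offline;
        // actually running it needs a KafkaStreams instance configured with
        // bootstrap servers and an application id.
        System.out.println( buildTopology().describe() );
    }
}
```

To actually run it, you would wrap the topology in a
new KafkaStreams( topology, streamsConfig ) and call start().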


Cheers
Niclas

On Mon, Mar 12, 2018 at 10:38 PM, Christofer Dutz <christofer.d...@c-ware.de
> wrote:

> Hi,
>
> last week I attended the Kafka meetup of a colleague of mine and in his
> talk he introduced Kafka Streams and to me it sounded like a „Clustered
> Edgent“.
>
> My second thought was: would it not be cool to implement such a Kafka
> Stream Connector?
>
> Anyone here got the knowhow to do that? Shouldn’t be that difficult.
>
> Chris
>



-- 
Niclas Hedhman, Software Developer
http://polygene.apache.org - New Energy for Java
