Kafka Connect is the ability to run a "source" and/or "sink" of streaming
data on the Kafka cluster itself. It defines the APIs so that such a
component (and its configuration) can be included from the command line.
Key and Value serialization is part of the Kafka Connect component, and
within Kafka Connect itself one can apply additional transformations to
those messages.
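
As a concrete illustration of "included from the command line": a connector is
declared in a small properties file and handed to the Connect worker at
startup. This is the stock file-source example that ships with Kafka, not
anything PLC4X-specific; a PLC4X connector would presumably look similar, just
with its own keys.

```properties
# Stock Kafka example: tails a file and streams each line into a topic.
name=local-file-source
connector.class=FileStreamSource
tasks.max=1
file=/tmp/test.txt
topic=connect-test
```

Started with something like `bin/connect-standalone.sh
config/connect-standalone.properties connect-file-source.properties` - no
programming involved, which is presumably the attraction.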

For PLC4X,

I would imagine that the Source Connector would be configurable with a long
list of data points to be scanned, possibly with conditions, priorities and
other rules deciding what is scanned, when and at what interval, and which
topic is the destination. Of course, PLC4X configuration of the protocols
would also be needed.
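
To make the scanning idea a bit more concrete, here is a minimal sketch of a
per-data-point schedule that a poll loop could consult. The names
(ScanSchedule, isDue) and the interval-only policy are my own invention for
illustration; no real PLC4X or Connect API is involved:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: each data point has its own scan interval, and the
// connector's poll loop only reads the points that are currently due.
public class ScanSchedule {
    private final Map<String, Long> intervalMs = new HashMap<>();
    private final Map<String, Long> lastScan = new HashMap<>();

    public void register(String dataPoint, long everyMs) {
        intervalMs.put(dataPoint, everyMs);
        lastScan.put(dataPoint, 0L);
    }

    // Returns true (and records the scan) if the data point is due at 'now'.
    public boolean isDue(String dataPoint, long now) {
        long last = lastScan.get(dataPoint);
        if (now - last >= intervalMs.get(dataPoint)) {
            lastScan.put(dataPoint, now);
            return true;
        }
        return false;
    }
}
```

Priorities and conditions would layer on top of this, but the basic shape of
"configuration in, timed reads out" stays the same.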

The Sink Connector would read from N topics, and each message would contain
which values are to be written where. It should be enough to do this
on-demand.
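
A hedged sketch of what such a message could look like, assuming a made-up
"address=value" pair format (the real wire format would be whatever we
define; the class and method names here are hypothetical):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: a sink message whose payload lists PLC writes as
// semicolon-separated "address=value" pairs, e.g. "M0.0=true;DB1.DW20=42".
public class PlcWritePayload {
    public static Map<String, String> parse(String payload) {
        Map<String, String> writes = new LinkedHashMap<>();
        for (String pair : payload.split(";")) {
            int eq = pair.indexOf('=');
            if (eq > 0) {
                writes.put(pair.substring(0, eq).trim(),
                           pair.substring(eq + 1).trim());
            }
        }
        return writes;
    }
}
```

The sink task would then hand each address/value pair to the appropriate
PLC4X write call.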

I have not figured out how the Source Connector will distribute the
workload in the cluster. One would assume that there is some mechanism in
place, otherwise the ETL (Extract, Transform, Load) use case would not
work, and ETL seems to be a major use case for Kafka Connect.
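
For what it is worth, Kafka Connect's mechanism seems to be that the
connector itself splits its work: the worker asks the connector for up to
maxTasks task configurations (Connector.taskConfigs(maxTasks)) and spreads
those tasks across the cluster. A minimal sketch of that kind of split, with
made-up names and assuming data points are identified by address strings:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of how a source connector might split its list of data points
// across tasks, mirroring the shape of what taskConfigs() has to return.
public class DataPointPartitioner {
    public static List<List<String>> partition(List<String> dataPoints, int maxTasks) {
        int groups = Math.min(maxTasks, dataPoints.size());
        List<List<String>> result = new ArrayList<>();
        for (int i = 0; i < groups; i++) {
            result.add(new ArrayList<>());
        }
        // Round-robin the data points over the available task slots.
        for (int i = 0; i < dataPoints.size(); i++) {
            result.get(i % groups).add(dataPoints.get(i));
        }
        return result;
    }
}
```

Each sub-list would become one task's configuration, so scanning scales out
with the number of workers.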


My conclusion: I think this is a great idea, and that we should aim to
implement it. However, I think it is more important to have more protocols
in place first, so that the PLC4X abstraction is tuned for disparate types
of systems, before attempting to apply another abstraction on top of the
PLC4X API. And for that, I think we need to increase the number of hands
(my own included) to help out, which requires that the people with the
vision and understanding of the details work more on documentation and less
on exotic integrations. I think that is the most urgent issue in this
project.


Cheers
Niclas


On Thu, Mar 22, 2018 at 5:58 PM, Niclas Hedhman <nic...@hedhman.org> wrote:

>
> Chris,
> I came across a presentation by coincidence that mentioned "Kafka
> Connect", so I will research it over the weekend and see what it is and
> how it fits with PLC4X. The presenter only mentioned that a "Kafka
> Connect" component is enabled by configuration (in Kafka, was implied,
> but I'm not sure) rather than by programming.
>
> More next week.
> Niclas
>
> On Wed, Mar 21, 2018 at 5:19 PM, Niclas Hedhman <nic...@hedhman.org>
> wrote:
>
>>
>> Alright... So to repeat (and I think you get this much)... Kafka is in
>> principle very simple: put any byte array into a topic at one end, and
>> readers can get that byte array from that topic somewhere else. Just a
>> simple message queue, at an abstract level. And the Kafka client libraries
>> for doing this are dead easy to use, producer.send() and consumer.poll()
>> respectively.
>>
>> So, I suspect that you are talking about some higher level than base
>> Kafka, possibly masquerading the Kafka transport as some other transport
>> abstraction, or a component implementing some protocol, such as MQTT [1],
>> that feeds Kafka behind the endpoint. But I could be completely off
>> track, and some really clever ideas are at play here that I don't
>> realize yet. So, yes, please invite people to discuss further.
>>
>> Niclas
>>
>> [1] MQTT feeding Kafka is something I will implement myself in the not
>> too distant future
>>
>> On Wed, Mar 21, 2018 at 4:50 PM, Christofer Dutz
>> <christofer.d...@c-ware.de> wrote:
>>
>>> Hi Niclas,
>>>
>>> have to admit I don't know it in detail either ... it was part of a
>>> talk a colleague of mine held at a Kafka meetup in our office. Also,
>>> I'm not half as deep into Kafka as you ... but from a user's point of
>>> view it looked as if such a Kafka Connect adapter would be a good idea.
>>> I'm trying to convince that colleague to contribute ... perhaps he
>>> could register here and shed some light on the topic ... I'll continue
>>> the convincing (even if another codecentric participant wouldn't
>>> increase company diversity)
>>>
>>> Chris
>>>
>>>
>>> On 13.03.18, 16:41, Niclas Hedhman <nic...@hedhman.org> wrote:
>>>
>>>     Sorry, I don't know what that is or what it really is, compared to
>>>     what I wrote...
>>>
>>>     On Tue, Mar 13, 2018 at 8:50 PM, Christofer Dutz
>>>     <christofer.d...@c-ware.de> wrote:
>>>
>>>     > Argh ... I wanted to propose a Kafka Connect and not a Kafka
>>>     > Streams adapter ( ...
>>>     >
>>>     > By the way, my currently most used setup is actually using PLC4X
>>>     > for PLC communication and Edgent's Kafka Connector to publish to
>>>     > Kafka.
>>>     >
>>>     > A Kafka Connect adapter would allow connecting it directly to
>>>     > Kafka.
>>>     >
>>>     > Chris
>>>     >
>>>     >
>>>     > On 13.03.18, 00:02, Niclas Hedhman <nic...@hedhman.org> wrote:
>>>     >
>>>     >     I have some Kafka experience, but have not used Kafka
>>>     >     Streams. But I think the most straightforward approach would
>>>     >     still be to have a Kafka Producer on the PLC4X side simply
>>>     >     writing to one or more Kafka topics, which is relatively
>>>     >     simple.
>>>     >
>>>     >     You need to 'configure' during the instance creation:
>>>     >
>>>     >     Properties properties = new Properties();
>>>     >     properties.put( ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, cli.kafka );
>>>     >     properties.put( ProducerConfig.LINGER_MS_CONFIG, 1 );
>>>     >     properties.put( ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName() );
>>>     >     properties.put( ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName() );
>>>     >     KafkaProducer<String, String> producer = new KafkaProducer<>( properties );
>>>     >
>>>     >     And once you have the producer, you can push a message to the
>>>     >     topic, with or without a key, like this:
>>>     >
>>>     >     producer.send( new ProducerRecord<>( "your-topic", value ) );
>>>     >
>>>     >
>>>     >
>>>     >     I am pretty sure Kafka Streams can take over from the topic
>>>     >     queue.
>>>     >
>>>     >
>>>     >     Cheers
>>>     >     Niclas
>>>     >
>>>     >     On Mon, Mar 12, 2018 at 10:38 PM, Christofer Dutz
>>>     >     <christofer.d...@c-ware.de> wrote:
>>>     >
>>>     >     > Hi,
>>>     >     >
>>>     >     > Last week I attended the Kafka meetup of a colleague of
>>>     >     > mine, and in his talk he introduced Kafka Streams; to me it
>>>     >     > sounded like a „Clustered Edgent“.
>>>     >     >
>>>     >     > My second thought was: would it not be cool to implement
>>>     >     > such a Kafka Stream Connector?
>>>     >     >
>>>     >     > Anyone here got the know-how to do that? Shouldn't be that
>>>     >     > difficult.
>>>     >     >
>>>     >     > Chris
>>>     >     >
>>>     >
>>>     >
>>>     >
>>>     >     --
>>>     >     Niclas Hedhman, Software Developer
>>>     >     http://polygene.apache.org - New Energy for Java
>>>     >
>>>     >
>>>     >
>>>
>>>
>>>
>>>
>>>
>>
>>
>>
>
>
>
>



-- 
Niclas Hedhman, Software Developer
http://polygene.apache.org - New Energy for Java
