oscerd opened a new issue #16: support camel type converters
URL: https://github.com/apache/camel-kafka-connector/issues/16
 
 
   Provide a way to support Camel type converters and/or integrate them into Kafka Connect `key.converter` and `value.converter`.
   
   comment 1 @omarsmak: Hello @valdar, what is the plan for this issue? If no one is working on it currently, I can pick it up.
   
   comment 2 @valdar: Sure @omarsmak, I will be busy with #4 in the near future.
   I would appreciate it if you could outline your idea before starting to code, so we can be sure we are on the same page; I admit the issue's description is not the most detailed one... it was initially written as a self note/brain dump...
   
   comment 3 @valdar: Actually, the first thing to do here is to understand whether it is better to implement a custom Kafka Connect transformation (https://docs.confluent.io/current/connect/concepts.html#transforms) for this, as opposed to using connector converters (https://docs.confluent.io/current/connect/concepts.html#converters). @omarsmak
   
   comment 4 @omarsmak: Sure, one option off the top of my head is that we could try to utilize the Camel type converter registry somehow; we could use it to serialize the data, provided we know the from -> to types. We will need to experiment with this, though.
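   To make that a bit more concrete, here is a minimal sketch (an assumption of mine, not existing code in this repo) of how the registry could be used directly: start a plain `DefaultCamelContext`, grab its `TypeConverter`, and let it convert between two types it already knows about.

   ```java
   import org.apache.camel.CamelContext;
   import org.apache.camel.TypeConverter;
   import org.apache.camel.impl.DefaultCamelContext;

   public class CamelTypeConverterDemo {
       public static void main(String[] args) throws Exception {
           // A plain CamelContext gives us access to the type converter registry,
           // which discovers converters (core and component-provided) on the classpath.
           CamelContext camelContext = new DefaultCamelContext();
           camelContext.start();

           TypeConverter converter = camelContext.getTypeConverter();

           // Convert a value between two types Camel already knows how to handle,
           // e.g. String -> byte[] and back again.
           byte[] serialized = converter.convertTo(byte[].class, "some payload");
           String roundTripped = converter.convertTo(String.class, serialized);
           System.out.println(roundTripped);

           camelContext.stop();
       }
   }
   ```

   The open question would then be how to feed the from -> to type information into it from the connector configuration.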
   
   comment 5 @omarsmak: Actually it depends; the question is what we want to achieve here: do we want to convert the data from/to bytes before sending it to Kafka, or do we just want to transform the data? The way I think about it is like this:
   * A Kafka Connect Transformation can work with Camel type converters
   * A Kafka Connect Converter can work with Camel data formats
   However, let's take an example: say we have the S3 component, which includes a type converter that converts from `S3ObjectInputStream` to `byte[]` and vice versa. If that is the case, then it would be better suited as a Kafka Connect converter, wouldn't it?
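   For the sake of discussion, here is a rough sketch of how that S3-style case could look as a Kafka Connect converter. The class name `CamelDelegatingConverter` and its wiring are purely hypothetical; it simply delegates the value -> `byte[]` step to Camel's type converter registry:

   ```java
   import java.util.Map;

   import org.apache.camel.CamelContext;
   import org.apache.camel.TypeConverter;
   import org.apache.camel.impl.DefaultCamelContext;
   import org.apache.kafka.connect.data.Schema;
   import org.apache.kafka.connect.data.SchemaAndValue;
   import org.apache.kafka.connect.storage.Converter;

   /**
    * Hypothetical Kafka Connect Converter that uses Camel's TypeConverter to turn
    * whatever the connector produced (e.g. an S3ObjectInputStream) into byte[].
    */
   public class CamelDelegatingConverter implements Converter {

       private final CamelContext camelContext = new DefaultCamelContext();
       private TypeConverter typeConverter;

       @Override
       public void configure(Map<String, ?> configs, boolean isKey) {
           try {
               camelContext.start();
           } catch (Exception e) {
               throw new RuntimeException("Unable to start CamelContext", e);
           }
           typeConverter = camelContext.getTypeConverter();
       }

       @Override
       public byte[] fromConnectData(String topic, Schema schema, Object value) {
           // Let Camel's registry figure out how to get from the source type to byte[]
           return value == null ? null : typeConverter.convertTo(byte[].class, value);
       }

       @Override
       public SchemaAndValue toConnectData(String topic, byte[] value) {
           // For the inbound direction we only have opaque bytes, so hand them back as such
           return new SchemaAndValue(Schema.OPTIONAL_BYTES_SCHEMA, value);
       }
   }
   ```

   Whatever the connector hands over (an `S3ObjectInputStream`, for instance) would end up as `byte[]`, as long as Camel has a converter registered for that type.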
   
   comment 6 @omarsmak: @valdar I need to dump my thoughts before I forget :D:
   Probably you are right, an SMT wrapper would be suitable for this. We can use the Camel TypeConverter registry, which can be obtained via `DefaultCamelContext`. What I can propose is the following:
   * We add a `TypeConverterTransform` SMT class, which is a wrapper around the Camel TypeConverter registry. Since we don't have enough information about the types we need to convert from -> to, I suggest the user adds them to the config; then, in this wrapper, we add the schema info to the `Struct` object and convert the value according to the user's data type configs (see the sketch after this list).
   * Once we have the above SMT, we can also reuse it as a 'default' inside a Kafka Connect converter wrapper, to be used as a Kafka Connect converter before sending the data to Kafka as `byte[]`.
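   To make the first bullet concrete, here is a rough sketch of what the proposed `TypeConverterTransform` SMT could look like. The `target.type` config key is a hypothetical name, and the schema handling is left out, so treat this only as an illustration of the idea:

   ```java
   import java.util.Map;

   import org.apache.camel.CamelContext;
   import org.apache.camel.TypeConverter;
   import org.apache.camel.impl.DefaultCamelContext;
   import org.apache.kafka.common.config.ConfigDef;
   import org.apache.kafka.connect.connector.ConnectRecord;
   import org.apache.kafka.connect.transforms.Transformation;

   /**
    * Hypothetical SMT that converts the record value to a user-configured target
    * type by delegating to Camel's type converter registry.
    */
   public class TypeConverterTransform<R extends ConnectRecord<R>> implements Transformation<R> {

       // Hypothetical config key: fully qualified class name to convert the value to.
       public static final String TARGET_TYPE_CONFIG = "target.type";

       private static final ConfigDef CONFIG_DEF = new ConfigDef()
               .define(TARGET_TYPE_CONFIG, ConfigDef.Type.CLASS, ConfigDef.Importance.HIGH,
                       "Class the record value should be converted to via Camel's type converters");

       private final CamelContext camelContext = new DefaultCamelContext();
       private TypeConverter typeConverter;
       private Class<?> targetType;

       @Override
       public void configure(Map<String, ?> configs) {
           targetType = (Class<?>) CONFIG_DEF.parse(configs).get(TARGET_TYPE_CONFIG);
           try {
               camelContext.start();
           } catch (Exception e) {
               throw new RuntimeException("Unable to start CamelContext", e);
           }
           typeConverter = camelContext.getTypeConverter();
       }

       @Override
       public R apply(R record) {
           Object convertedValue = typeConverter.convertTo(targetType, record.value());
           // Keep topic, partition, key and timestamp untouched; a real implementation
           // would also derive a matching value schema for the converted value.
           return record.newRecord(record.topic(), record.kafkaPartition(),
                   record.keySchema(), record.key(),
                   record.valueSchema(), convertedValue,
                   record.timestamp());
       }

       @Override
       public ConfigDef config() {
           return CONFIG_DEF;
       }

       @Override
       public void close() {
           try {
               camelContext.stop();
           } catch (Exception e) {
               // best-effort shutdown
           }
       }
   }
   ```

   The same `convertTo` call could then be reused from the converter wrapper described in the second bullet.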
   
   Please feel free to discuss :)  
