My application will emit log files in Avro JSON encoding so that humans
can easily read and grep the records.
I need to transfer these logs into Kafka in Avro binary encoding.
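For context, the same record looks very different on the wire in the two encodings. Here is a hand-rolled sketch for a hypothetical `{name: string, age: long}` record schema (a real pipeline would of course use the Avro library's decoders and encoders, not hand-written bytes):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

public class AvroEncodings {
    // Avro's zig-zag varint, used for longs and for string length prefixes.
    static byte[] zigZagVarint(long n) {
        long z = (n << 1) ^ (n >> 63);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while ((z & ~0x7FL) != 0) {
            out.write((int) ((z & 0x7F) | 0x80));
            z >>>= 7;
        }
        out.write((int) z);
        return out.toByteArray();
    }

    // Binary encoding: field values concatenated in schema order, no field names.
    static byte[] binaryRecord(String name, long age) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] utf8 = name.getBytes(StandardCharsets.UTF_8);
        byte[] len = zigZagVarint(utf8.length);
        out.write(len, 0, len.length);
        out.write(utf8, 0, utf8.length);
        byte[] encodedAge = zigZagVarint(age);
        out.write(encodedAge, 0, encodedAge.length);
        return out.toByteArray();
    }

    // JSON encoding: an ordinary JSON object, easy to read and grep.
    static String jsonRecord(String name, long age) {
        return "{\"name\": \"" + name + "\", \"age\": " + age + "}";
    }
}
```

So the log files stay grep-able, while the Kafka topic carries the compact binary form.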
And I want to use the Confluent Schema Registry to manage the schemas.
After some research, I think Kafka Connect in standalone mode might be the right fit.
The AvroConverter that ships with Kafka Connect talks to the Schema Registry
under the hood.
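For reference, this is roughly how the AvroConverter is wired into a standalone worker's properties file (the localhost URLs and the offsets path are placeholder assumptions):

```properties
# connect-standalone worker config (sketch)
bootstrap.servers=localhost:9092
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://localhost:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://localhost:8081
offset.storage.file.filename=/tmp/connect.offsets
```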
But I am not sure whether the AvroConverter accepts Avro JSON encoding as input.
Do I need to write a JSON-encoding version of the AvroConverter myself?
Also, I can't find any documentation for the Converter interface.
I think the conversion job is just byte-to-byte, but the interface
seems to require an internal data object.
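From what I can tell reading the Kafka source, the shape of `org.apache.kafka.connect.storage.Converter` is roughly the following (paraphrased; `Schema` and `SchemaAndValue` are stubbed here so the sketch compiles standalone). Note that `toConnectData` returns Connect's internal schema-plus-value object, not raw bytes, which is why a pure byte-to-byte converter doesn't fit the interface directly:

```java
import java.util.Map;

// Minimal stand-ins for Connect's data classes (not the real API).
class Schema { }
class SchemaAndValue {
    final Schema schema;
    final Object value;
    SchemaAndValue(Schema schema, Object value) { this.schema = schema; this.value = value; }
}

// Paraphrased shape of org.apache.kafka.connect.storage.Converter.
interface Converter {
    void configure(Map<String, ?> configs, boolean isKey);
    // Connect data -> serialized bytes written to Kafka (source connector path).
    byte[] fromConnectData(String topic, Schema schema, Object value);
    // Serialized bytes read from Kafka -> Connect data (sink connector path).
    SchemaAndValue toConnectData(String topic, byte[] value);
}

// A trivial pass-through converter, similar in spirit to Kafka's ByteArrayConverter.
class PassThroughConverter implements Converter {
    public void configure(Map<String, ?> configs, boolean isKey) { }
    public byte[] fromConnectData(String topic, Schema schema, Object value) {
        return (byte[]) value;
    }
    public SchemaAndValue toConnectData(String topic, byte[] value) {
        return new SchemaAndValue(null, value);
    }
}
```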
Any advice or guidance would be greatly appreciated!