Hello Dev Community,

I have a use case where I need to stream data into Google BigQuery. To do
this, I produce data to a Kafka topic and use the Confluent BigQuery Sink
Connector to sink the data from the Kafka topic into a BigQuery table.
This setup currently works well for our use cases. The challenge we are
facing now is that our Avro schema changes frequently and we do not know
the schema beforehand, yet we still want the data to stream seamlessly
from Kafka to BigQuery via the connector.
Unfortunately, we cannot use nested records, since they would not serve
our purpose. The BigQuery side is not a concern, though: the connector
auto-updates the BigQuery table schema as the record schema evolves.
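
For context, our connector configuration looks roughly like the sketch
below (project, dataset, topic and keyfile are placeholders, and the
exact schema-update property names depend on the connector version:
pre-2.0 releases use autoUpdateSchemas, while 2.x uses the two allow*
flags shown here):

name=bigquery-sink
connector.class=com.wepay.kafka.connect.bigquery.BigQuerySinkConnector
topics=my-topic
project=my-gcp-project
defaultDataset=my_dataset
keyfile=/path/to/service-account.json
# Let the connector add newly seen fields to the BigQuery table schema
allowNewBigQueryFields=true
allowBigQueryRequiredFieldRelaxation=true
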
The real challenge is on the producer side: how do we produce records to
the Kafka topic without knowing the schema beforehand? We have tried
producing GenericRecords to the topic, but that was not successful, and
we don't want to manually update the schema every time it changes.
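
What we are attempting looks roughly like the minimal sketch below
(broker and Schema Registry URLs, topic and field names are
placeholders): the Avro schema is built at runtime from the incoming
payload, and the serializer is left to register new schema versions
automatically, which of course only works if the registry's
compatibility setting accepts the evolved schema.

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class DynamicAvroProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry
        // Register evolved schemas automatically instead of updating them by hand.
        props.put("auto.register.schemas", "true");

        // Build the Avro schema at runtime from whatever fields the payload has.
        Schema schema = SchemaBuilder.record("Event").fields()
                .requiredString("id")
                .optionalString("new_field")   // a field older versions did not have
                .endRecord();

        GenericRecord value = new GenericData.Record(schema);
        value.put("id", "123");
        value.put("new_field", "some value");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "123", value));
            producer.flush();
        }
    }
}
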
Please let me know if there is any other approach that is well-suited for
our use case.

Thanks,
Shivam

