Hi Joe,

You may find some of the info in this blog post of interest. It's based on
streaming pipelines, but the ideas are useful here too.

https://cloud.google.com/blog/products/gcp/how-to-handle-mutating-json-schemas-in-a-streaming-pipeline-with-square-enix
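One pattern along those lines, sketched below (hypothetical helper names, not from the post): load the JSON schema at pipeline-construction time and widen it with any new CSV columns as NULLABLE STRINGs, then pass the resulting dict to the write. The {"fields": [...]} dict form is what the Python SDK's beam.io.WriteToBigQuery schema parameter expects, but do check against the SDK version you're on.

```python
import csv
import io
import json

def load_schema(schema_json: str) -> dict:
    """Parse a BigQuery-style JSON schema ({"fields": [...]}) into a dict."""
    return json.loads(schema_json)

def widen_schema(schema: dict, csv_header: list) -> dict:
    """Append any CSV columns missing from the schema as NULLABLE STRINGs,
    so a newly added column doesn't break the load. Existing fields are
    left untouched; only additive changes are handled here."""
    known = {f["name"] for f in schema["fields"]}
    for col in csv_header:
        if col not in known:
            schema["fields"].append(
                {"name": col, "type": "STRING", "mode": "NULLABLE"})
    return schema

# Example: a two-field schema file, and a CSV that grew a third column.
schema = load_schema(
    '{"fields": ['
    '{"name": "id", "type": "INTEGER", "mode": "REQUIRED"},'
    '{"name": "name", "type": "STRING", "mode": "NULLABLE"}]}')
header = next(csv.reader(io.StringIO("id,name,signup_date\n1,alice,2018-11-29\n")))
schema = widen_schema(schema, header)
# schema can now be handed to beam.io.WriteToBigQuery(..., schema=schema)
```

New columns land as STRING because a CSV header alone can't tell you the type; you can always relax that later with a manual schema update, whereas a wrong guess (say INTEGER) would make loads fail.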

Cheers

Reza

On Thu, 29 Nov 2018 at 06:53, Joe Cullen <[email protected]> wrote:

> Hi all,
>
> I have a pipeline reading CSV files, performing some transforms, and
> writing to BigQuery. At the moment I'm reading the BigQuery schema from a
> separate JSON file. If the CSV files had a new column added (and I wanted
> to include this column in the resultant BigQuery table), I'd have to change
> the JSON schema or the pipeline itself. Is there any way to autodetect the
> schema using BigQueryIO? How do people normally deal with potential changes
> to input CSVs?
>
> Thanks,
> Joe
>
