Rittycheriah opened a new issue, #37684:
URL: https://github.com/apache/airflow/issues/37684

   ### Description
   
   Hi,
   Recently there was a community dev discussion about how to retain table snapshots for longer than BigQuery's 7-day time-travel window. One solution that worked well for me in a previous context was to reuse the BigQueryToGCSOperator logic, but instead of invoking it per table, the top-level DAG took a dataset name, looped over all of the dataset's tables, and exported each one to GCS as a backup. Would it be helpful to others if I built an open-source version of this logic? Just a thought! Thank y'all.
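
   To make the idea concrete, here is a minimal, hypothetical sketch of the per-table mapping the DAG would perform. The function name `make_export_tasks` and the GCS path layout are my own illustrative choices, not an existing API; in a real DAG the table list would come from something like `BigQueryHook.get_dataset_tables`, and each returned dict would become the kwargs for one `BigQueryToGCSOperator` task.

   ```python
   def make_export_tasks(project, dataset, tables, bucket, prefix="backups"):
       """Map each table in a dataset to BigQueryToGCSOperator-style kwargs.

       Pure Python so the dataset->tables->GCS fan-out is easy to see;
       the actual export would be done by one operator task per entry.
       """
       tasks = []
       for table in tables:
           source = f"{project}.{dataset}.{table}"
           # A wildcard URI lets BigQuery shard large table exports
           # across multiple files in the destination folder.
           dest = f"gs://{bucket}/{prefix}/{dataset}/{table}/{table}-*.avro"
           tasks.append(
               {
                   "source_project_dataset_table": source,
                   "destination_cloud_storage_uris": [dest],
               }
           )
       return tasks
   ```

   The operator itself would then be instantiated in a loop (or via dynamic task mapping) over the returned list, one export task per table.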
   
   ### Use case/motivation
   
   I'd like to contribute to the Airflow project by adding an operator that exports an entire BigQuery dataset to a GCS folder. For anyone snapshotting tables for backups, this could be an easier way to do it than working directly in BigQuery itself.
   
   ### Related issues
   
   None that I'm aware of!
   
   ### Are you willing to submit a PR?
   
   - [X] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

