okayhooni commented on code in PR #37538:
URL: https://github.com/apache/airflow/pull/37538#discussion_r1494380848
##########
airflow/providers/google/cloud/operators/bigquery_dts.py:
##########
@@ -359,7 +369,18 @@ def _wait_for_transfer_to_be_done(self, run_id: str, transfer_config_id: str, in
         if interval <= 0:
             raise ValueError("Interval must be > 0")
+        idx = 0
         while True:
+            current_tick_div, current_tick_mod = divmod(idx * interval, self.token_refresh_interval_seconds)
+            next_tick_div, next_tick_mod = divmod((idx + 1) * interval, self.token_refresh_interval_seconds)
+            if (current_tick_div < next_tick_div and 0 < next_tick_mod) or (
+                current_tick_mod == 0 and 1 <= current_tick_div
+            ):
+                _ = self.hook.refresh_credentials()
+                self.log.info(
+                    f"Credentials were refreshed on tick: idx={idx}, idx*interval={idx * interval} sec"
+                )
Review Comment:
Thank you for the quick and detailed review!
Actually, I can't find a token validation method in the `base_google.py`
module like the one in the `databricks_base.py` module you mentioned. If there
were a similar method based on the Google SDK, that would be a simpler and more
elegant approach, as you said (though it would add a GCP API call to check
token validity on every loop iteration).
We experienced error cases with the same authorization error even on tasks
that took just over 30 minutes, not 1 hour (I don't know why), so I decided to
refresh the token after half of the default token lifespan.
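To make the tick condition above easier to reason about, here is a minimal standalone sketch of the same divmod check. `should_refresh` is a hypothetical helper written for illustration, not part of the PR; the 600 s poll interval and 1800 s refresh interval (half of a 1-hour token lifespan) are assumed example values:

```python
def should_refresh(idx: int, interval: int, refresh_every: int) -> bool:
    """Return True when the idx-th polling tick lands on, or is the first
    tick after, a multiple of refresh_every seconds (mirrors the PR's check)."""
    current_div, current_mod = divmod(idx * interval, refresh_every)
    next_div, next_mod = divmod((idx + 1) * interval, refresh_every)
    return (current_div < next_div and 0 < next_mod) or (
        current_mod == 0 and 1 <= current_div
    )

# With a 600 s poll interval and a 1800 s refresh interval, the credentials
# are refreshed on ticks 3, 6, 9, ... i.e. roughly every 30 minutes.
ticks = [idx for idx in range(10) if should_refresh(idx, 600, 1800)]
print(ticks)  # → [3, 6, 9]
```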
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]