gopidesupavan commented on code in PR #62867:
URL: https://github.com/apache/airflow/pull/62867#discussion_r2922963732


##########
providers/common/sql/src/airflow/providers/common/sql/datafusion/engine.py:
##########
@@ -158,6 +158,58 @@ def _fetch_extra_configs(keys: list[str]) -> dict[str, Any]:
                 credentials = self._remove_none_values(credentials)
                 extra_config = _fetch_extra_configs(["region", "endpoint"])
 
+            case "google_cloud_platform":
+                try:
+                    from airflow.providers.google.common.hooks.base_google import get_field as gcp_get_field
+                except ImportError:
+                    from airflow.providers.common.compat.sdk import AirflowOptionalProviderFeatureException
+
+                    raise AirflowOptionalProviderFeatureException(
+                        "Failed to import get_field. To use the GCS storage functionality, please install the "
+                        "apache-airflow-providers-google package."
+                    )
+                key_path = gcp_get_field(conn.extra_dejson, "key_path")

Review Comment:
   This is not always true. Try using `GoogleBaseHook`: it generates credentials, writes them to a temp file, and sets the `GOOGLE_APPLICATION_CREDENTIALS` env var. See `provide_gcp_credential_file` in `GoogleBaseHook`.
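   For context, a minimal self-contained sketch of the pattern the reviewer is describing (write credentials to a temp file, then point `GOOGLE_APPLICATION_CREDENTIALS` at it for the duration of a block). This is an illustration of the general mechanism only, not the actual `GoogleBaseHook.provide_gcp_credential_file` implementation; the helper name `provide_credential_file` is hypothetical:

   ```python
   import json
   import os
   import tempfile
   from contextlib import contextmanager


   @contextmanager
   def provide_credential_file(keyfile_dict: dict):
       """Illustrative stand-in for the GoogleBaseHook behavior: dump the
       credentials dict to a temp JSON file and export its path via the
       GOOGLE_APPLICATION_CREDENTIALS env var, restoring state on exit."""
       previous = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
       with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
           json.dump(keyfile_dict, f)
           path = f.name
       try:
           os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
           yield path
       finally:
           # Restore the prior environment and clean up the temp file.
           if previous is None:
               os.environ.pop("GOOGLE_APPLICATION_CREDENTIALS", None)
           else:
               os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = previous
           os.unlink(path)


   # Usage: inside the block, libraries that honor Application Default
   # Credentials pick up the temp key file automatically.
   with provide_credential_file({"type": "service_account"}) as p:
       assert os.environ["GOOGLE_APPLICATION_CREDENTIALS"] == p
   ```

   The point for the PR: code that only reads `key_path` from the connection extras misses this env-var path, which is why relying on `GoogleBaseHook` is the safer route.
   
   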



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]