abhishekshenoy opened a new issue #13454:
URL: https://github.com/apache/airflow/issues/13454


   
   
   **Apache Airflow version**: 2.0
   
   
   **Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.18.2
   
   **Environment**:
   
   - **Cloud provider or hardware configuration**: GCP
   - **OS** (e.g. from /etc/os-release): Container Optimized OS
   - **Kernel** (e.g. `uname -a`):
   - **Install tools**:
   - **Others**:
   
   **What happened**:
   We are using the Dataproc operators in GCP to create, scale, and delete a cluster, as well as to submit a job.
   In our setup, the GCP project name for each environment is suffixed with the environment name, e.g.:

   - Prod environment: **tech-mod-prod**
   - Stage environment: **tech-mod-stg**
   - Dev environment: **tech-mod-dev**
   
   The Dataproc operators need the project ID and region in order to submit to the correct project.

   In each of our Airflow environments we have created an **env** Variable whose value is the environment name, and in our DAG's Python code we assign the project ID as below.
   
   **PROJECT_ID="tech-mod-{{ var.value.env }}"**
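For illustration, the rendering Airflow performs on templated fields can be sketched with plain Jinja2 (here `env` is a hypothetical stand-in for the Airflow Variable referenced by `var.value.env`):

```python
from jinja2 import Template

# Stand-in for the Airflow Variable `var.value.env`; Airflow uses
# Jinja2 to render templated operator fields at runtime.
PROJECT_ID = "tech-mod-{{ env }}"

rendered = Template(PROJECT_ID).render(env="dev")
print(rendered)  # tech-mod-dev
```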
   
   This is then used in our operators as below:
   
   ```python
   create_cluster = DataprocCreateClusterOperator(
       dag=dag_dataproc,
       task_id="create_cluster",
       project_id=PROJECT_ID,
       :
       :
       impersonation_chain=IMPERSONATE_SERVICE_ACCOUNT
   )
   
   run_spark_job = DataprocSubmitJobOperator(
       dag=dag_dataproc,
       task_id="raw_file_extractor_job",
       project_id=PROJECT_ID,
       :
       :
       impersonation_chain=IMPERSONATE_SERVICE_ACCOUNT
   )
   
   stop_cluster = DataprocDeleteClusterOperator(
       dag=dag_dataproc,
       task_id='stop_cluster',
       project_id=PROJECT_ID,
       :
       :
       impersonation_chain=IMPERSONATE_SERVICE_ACCOUNT
   )
   ``` 
   
   We are able to create the cluster and submit the job successfully, but the delete-cluster task fails with:
   ```
   [2020-12-30 10:32:27,731] {secret_manager_client.py:89} ERROR - Google Cloud API Call Error (PermissionDenied): No access for Secret ID airflow-variables-env.
                   Did you add 'secretmanager.versions.access' permission?
   [2020-12-30 10:32:27,992] {taskinstance.py:1230} INFO - Exporting the following env vars:
   [email protected]
   AIRFLOW_CTX_DAG_OWNER=airflow
   AIRFLOW_CTX_DAG_ID=df_raw_file_extraction_workflow
   AIRFLOW_CTX_TASK_ID=stop_cluster
   AIRFLOW_CTX_EXECUTION_DATE=2020-12-29T07:00:00+00:00
   AIRFLOW_CTX_DAG_RUN_ID=scheduled__2020-12-29T07:00:00+00:00
   [2020-12-30 10:32:28,011] {secret_manager_client.py:86} ERROR - Google Cloud API Call Error (NotFound): Secret ID airflow-connections-google_cloud_default not found.
   [2020-12-30 10:32:28,087] {dataproc.py:840} INFO - Deleting cluster: cluster-raw-file-extractor
   [2020-12-30 10:32:28,087] {credentials_provider.py:300} INFO - Getting connection using `google.auth.default()` since no key file is defined for hook.
   [2020-12-30 10:32:28,483] {taskinstance.py:1396} ERROR - 403 Request is prohibited by organization's policy. vpcServiceControlsUniqueIdentifier: fbebd1237123a5c3
   Traceback (most recent call last):
     File "/home/airflow/.local/lib/python3.8/site-packages/google/api_core/    .py", line 57, in error_remapped_callable
       return callable_(*args, **kwargs)
     File "/home/airflow/.local/lib/python3.8/site-packages/grpc/_channel.py", line 923, in __call__
       return _end_unary_response_blocking(state, call, False, None)
     File "/home/airflow/.local/lib/python3.8/site-packages/grpc/_channel.py", line 826, in _end_unary_response_blocking
       raise _InactiveRpcError(state)
   grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
   	status = StatusCode.PERMISSION_DENIED
   	details = "Request is prohibited by organization's policy. vpcServiceControlsUniqueIdentifier: fbebd1237123a5c3"
   	debug_error_string = "{"created":"@1609324348.482151965","description":"Error received from peer ipv4:172.217.9.170:443","file":"src/core/lib/surface/call.cc","file_line":1061,"grpc_message":"Request is prohibited by organization's policy. vpcServiceControlsUniqueIdentifier: fbebd1237123a5c3","grpc_status":7}"
   ```
   
   
   **What you expected to happen**:
   DataprocDeleteClusterOperator should work seamlessly, since the service-account setup for accessing GCP services within Airflow is correct; otherwise CreateCluster would not have worked. (The pod which executes the task has a service account with the Dataproc Editor role.)
   
   **What do you think went wrong?**:
   We then removed the templates, passed the dev environment parameters directly, and were able to create the cluster, submit the job, and delete the cluster.

   That led us to the root cause: DataprocDeleteClusterOperator does not render the templated fields for `project_id` and `region`, so it cannot determine which project to delete the cluster from, resulting in the failure above.
   
   Checking the operator definitions in dataproc.py confirms this: https://github.com/apache/airflow/blob/v2-0-stable/airflow/providers/google/cloud/operators/dataproc.py
   
   **Templated fields for DataprocCreateClusterOperator:**
   ```python
   template_fields = ('project_id', 'region', 'cluster_config', 'cluster_name', 'labels', 'impersonation_chain')
   template_fields_renderers = {'cluster_config': 'json'}
   ```

   **Templated fields for DataprocSubmitJobOperator:**
   ```python
   template_fields = ('project_id', 'location', 'job', 'impersonation_chain')
   template_fields_renderers = {"job": "json"}
   ```

   **Templated fields for DataprocDeleteClusterOperator:**
   ```python
   template_fields = ('impersonation_chain',)
   ```
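To illustrate why this matters, here is a minimal sketch (not Airflow's actual code) of how a base operator renders only the attributes listed in `template_fields`. `FakeDeleteClusterOperator` mirrors the Airflow 2.0 declaration above; `var_env` is a made-up context key standing in for the Airflow Variable:

```python
from jinja2 import Template

# Minimal sketch (not Airflow's actual code): the base class walks
# template_fields and replaces each attribute with its rendered value.
class FakeBaseOperator:
    template_fields = ()

    def render_template_fields(self, context):
        for field in self.template_fields:
            rendered = Template(getattr(self, field)).render(**context)
            setattr(self, field, rendered)

class FakeDeleteClusterOperator(FakeBaseOperator):
    # Mirrors the Airflow 2.0 declaration: project_id is NOT listed.
    template_fields = ('impersonation_chain',)

    def __init__(self, project_id, impersonation_chain):
        self.project_id = project_id
        self.impersonation_chain = impersonation_chain

op = FakeDeleteClusterOperator(
    project_id="tech-mod-{{ var_env }}",     # never rendered
    impersonation_chain="sa-{{ var_env }}",  # rendered, because listed
)
op.render_template_fields({"var_env": "dev"})
print(op.project_id)           # still "tech-mod-{{ var_env }}"
print(op.impersonation_chain)  # "sa-dev"
```

The unrendered `project_id` is then sent to the API literally, which explains the permission/VPC errors instead of a clean "project not found".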
   
   **How to reproduce it**:
   Pass a templated value for `project_id` or `region` to DataprocDeleteClusterOperator.
   
   
   **Anything else we need to know**:
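Until this is fixed upstream, a possible workaround (an untested sketch, assuming the Airflow 2.0 Google provider API) is to subclass the operator and declare the missing fields as templated:

```python
# Workaround sketch (untested; assumes the Airflow 2.0 Google provider API).
# The try/except stub only lets this snippet run without Airflow installed.
try:
    from airflow.providers.google.cloud.operators.dataproc import (
        DataprocDeleteClusterOperator as _Base,
    )
except ImportError:
    class _Base:  # stand-in so the subclass is importable anywhere
        template_fields = ('impersonation_chain',)

class TemplatedDataprocDeleteClusterOperator(_Base):
    # Declare the missing fields so Jinja renders them before execute().
    template_fields = ('project_id', 'region', 'impersonation_chain')
```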
   
   

