vtonne opened a new issue, #27018:
URL: https://github.com/apache/airflow/issues/27018

   ### Apache Airflow version
   
   Other Airflow 2 version (please specify below)
   
   ### What happened
   
   Running on GCP Composer, version composer-2.0.22-airflow-2.2.5.
   While connecting to an instance in a different project, an error was produced along with a message asking to report it to the Airflow team.
   Please note that I have been able to connect to a different instance in the 
other project using the same DAG file (changing only the instance_name and 
zone fields). Furthermore, prior to the error appearing, the service account's 
permissions had been modified: the Compute Admin and Editor roles were 
removed. (Restoring those permissions didn't solve the issue.)
   
   Error log (sensitive data edited with XXXX, YYYY, ZZZZ):
   *** Reading remote log from 
gs://XXXXXXXXXX-bucket/logs/test_SSH_IP_DATA_v01/task_ssh_ip_data/2022-10-12T19:09:15.095054+00:00/1.log.
   [2022-10-12, 19:09:17 UTC] {taskinstance.py:1044} INFO - Dependencies all 
met for <TaskInstance: test_SSH_IP_DATA_v01.task_ssh_ip_data 
manual__2022-10-12T19:09:15.095054+00:00 [queued]>
   [2022-10-12, 19:09:17 UTC] {taskinstance.py:1044} INFO - Dependencies all 
met for <TaskInstance: test_SSH_IP_DATA_v01.task_ssh_ip_data 
manual__2022-10-12T19:09:15.095054+00:00 [queued]>
   [2022-10-12, 19:09:17 UTC] {taskinstance.py:1250} INFO - 
   
--------------------------------------------------------------------------------
   [2022-10-12, 19:09:17 UTC] {taskinstance.py:1251} INFO - Starting attempt 1 
of 2
   [2022-10-12, 19:09:17 UTC] {taskinstance.py:1252} INFO - 
   
--------------------------------------------------------------------------------
   [2022-10-12, 19:09:17 UTC] {taskinstance.py:1271} INFO - Executing 
<Task(SSHOperator): task_ssh_ip_data> on 2022-10-12 19:09:15.095054+00:00
   [2022-10-12, 19:09:17 UTC] {standard_task_runner.py:52} INFO - Started 
process 1958490 to run task
   [2022-10-12, 19:09:17 UTC] {standard_task_runner.py:79} INFO - Running: 
['airflow', 'tasks', 'run', 'test_SSH_IP_DATA_v01', 'task_ssh_ip_data', 
'manual__2022-10-12T19:09:15.095054+00:00', '--job-id', '44165', '--raw', 
'--subdir', 'DAGS_FOLDER/test_SSH_COMMAND_IP_DATA.py', '--cfg-path', 
'/tmp/tmpc37c99ef', '--error-file', '/tmp/tmpo1dhs336']
   [2022-10-12, 19:09:17 UTC] {standard_task_runner.py:80} INFO - Job 44165: 
Subtask task_ssh_ip_data
   [2022-10-12, 19:09:18 UTC] {task_command.py:298} INFO - Running 
<TaskInstance: test_SSH_IP_DATA_v01.task_ssh_ip_data 
manual__2022-10-12T19:09:15.095054+00:00 [running]> on host 
airflow-worker-XXXXXXXX
   [2022-10-12, 19:09:18 UTC] {taskinstance.py:1745} WARNING - We expected to 
get frame set in local storage but it was not. Please report this as an issue 
with full logs at https://github.com/apache/airflow/issues/new
   Traceback (most recent call last):
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 1341, in _run_raw_task
       self._execute_task_with_callbacks(context)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 1442, in _execute_task_with_callbacks
       self.render_templates(context=context)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 2070, in render_templates
       self.task.render_template_fields(context)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", 
line 1061, in render_template_fields
       self._do_render_template_fields(self, self.template_fields, context, 
jinja_env, set())
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", 
line 1074, in _do_render_template_fields
       rendered_content = self.render_template(content, context, jinja_env, 
seen_oids)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", 
line 1108, in render_template
       template = jinja_env.get_template(content)
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/environment.py", 
line 997, in get_template
       return self._load_template(name, globals)
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/environment.py", 
line 958, in _load_template
       template = self.loader.load(self, name, self.make_globals(globals))
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/loaders.py", line 
125, in load
       source, filename, uptodate = self.get_source(environment, name)
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/loaders.py", line 
214, in get_source
       raise TemplateNotFound(template)
   jinja2.exceptions.TemplateNotFound: cd /mnt/XXXX/XXXXX/XXXXXX/test ; sudo -u 
ZZZZZZZZ ./YYYYYYYYY.sh
   
   During handling of the above exception, another exception occurred:
   
   Traceback (most recent call last):
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 1743, in get_truncated_error_traceback
       execution_frame = _TASK_EXECUTION_FRAME_LOCAL_STORAGE.frame
   AttributeError: '_thread._local' object has no attribute 'frame'
   [2022-10-12, 19:09:18 UTC] {taskinstance.py:1776} ERROR - Task failed with 
exception
   Traceback (most recent call last):
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 1341, in _run_raw_task
       self._execute_task_with_callbacks(context)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 1442, in _execute_task_with_callbacks
       self.render_templates(context=context)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/taskinstance.py", 
line 2070, in render_templates
       self.task.render_template_fields(context)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", 
line 1061, in render_template_fields
       self._do_render_template_fields(self, self.template_fields, context, 
jinja_env, set())
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", 
line 1074, in _do_render_template_fields
       rendered_content = self.render_template(content, context, jinja_env, 
seen_oids)
     File 
"/opt/python3.8/lib/python3.8/site-packages/airflow/models/baseoperator.py", 
line 1108, in render_template
       template = jinja_env.get_template(content)
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/environment.py", 
line 997, in get_template
       return self._load_template(name, globals)
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/environment.py", 
line 958, in _load_template
       template = self.loader.load(self, name, self.make_globals(globals))
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/loaders.py", line 
125, in load
       source, filename, uptodate = self.get_source(environment, name)
     File "/opt/python3.8/lib/python3.8/site-packages/jinja2/loaders.py", line 
214, in get_source
       raise TemplateNotFound(template)
   jinja2.exceptions.TemplateNotFound: cd /mnt/XXXX/XXXXX/XXXXXX/test ; sudo -u 
ZZZZZZZZ ./YYYYYYYYY.sh
   [2022-10-12, 19:09:18 UTC] {taskinstance.py:1279} INFO - Marking task as 
UP_FOR_RETRY. dag_id=test_SSH_IP_DATA_v01, task_id=task_ssh_ip_data, 
execution_date=20221012T190915, start_date=20221012T190917, 
end_date=20221012T190918
   [2022-10-12, 19:09:19 UTC] {standard_task_runner.py:93} ERROR - Failed to 
execute job 44165 for task task_ssh_ip_data (cd /mnt/XXXX/XXXXX/XXXXXX/test ; 
sudo -u ZZZZZZZZ ./YYYYYYYYY.sh; 1958490)
   [2022-10-12, 19:09:19 UTC] {local_task_job.py:154} INFO - Task exited with 
return code 1
   [2022-10-12, 19:09:19 UTC] {local_task_job.py:264} INFO - 0 downstream tasks 
scheduled from follow-on schedule check
   
   
   
   DAG file:
   from airflow import DAG
   from datetime import datetime, timedelta
   from airflow.operators.python import PythonOperator
   # from airflow.contrib.hooks.ssh_hook import SSHHook
   from airflow.contrib.operators.ssh_operator import SSHOperator
   from airflow.providers.google.cloud.hooks.compute_ssh import ComputeEngineSSHHook
   from airflow.operators.python import get_current_context
   
   
   default_args = {
       'owner': 'Vlad Tonne',
       'retries': 1,
       'retry_delay': timedelta(minutes=5)
   }
   
   
   with DAG(
       dag_id='test_SSH_IP_DATA_v01',
       description='SSH IP DATA',
       default_args=default_args,
       start_date=datetime(2022, 7, 17),
       schedule_interval=None
   ) as dag:
       run_ssh_ip_data = SSHOperator(
           task_id="task_ssh_ip_data",
           ssh_hook=ComputeEngineSSHHook(
               instance_name="XXXXXXXX",
               zone="us-central1-a",
               project_id="ZZZZZZZ",
               # instance_name="YYYYYYY",
               # zone="us-east1-b",
               # project_id="ZZZZZZZ",            
               use_oslogin=True,
               use_iap_tunnel=False,
               # use_iap_tunel=True,
               use_internal_ip=True
           ),
        command="cd /mnt/XXXX/XXXXX/XXXXXX/test ; sudo -u ZZZZZZZZ ./YYYYYYYYY.sh",
           # command="ls -lah /mnt/XXXX/XXXXX/XXXXXX/test",
          
       )
   
       run_ssh_ip_data
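
   Note: the TemplateNotFound error above seems consistent with SSHOperator 
treating command as a templated field. As I understand it (an assumption, not 
confirmed), Airflow resolves a templated string that ends with one of the 
operator's template_ext extensions (".sh" for SSHOperator) as a template 
*file name* and asks Jinja to load it, which fails because the command is not 
a file. A minimal stdlib-only sketch of that extension check (the helper name 
and extension tuple are illustrative, not Airflow code):

```python
# Illustrative sketch (assumption): mimic the template_ext check that makes
# Airflow hand an *.sh-ending command string to jinja_env.get_template(),
# which then fails with TemplateNotFound because no such file exists.
TEMPLATE_EXT = (".sh", ".bash")  # assumed SSHOperator template_ext

def looks_like_template_file(content: str) -> bool:
    """Return True if Airflow would treat this string as a template file name."""
    return content.endswith(TEMPLATE_EXT)

cmd = "cd /mnt/some/dir/test ; sudo -u someuser ./somescript.sh"  # hypothetical
print(looks_like_template_file(cmd))        # True  -> resolved as a file
print(looks_like_template_file(cmd + " "))  # False -> rendered inline
```

   If that is indeed the cause, a commonly suggested workaround is appending a 
trailing space or newline to the command so it no longer ends in ".sh" and is 
rendered as an inline string.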
   
   ### What you think should happen instead
   
   SSHOperator should have connected to the GCE instance; instead it was 
unable to connect and produced an error.
   Furthermore, the log contains the warning "We expected to get frame set in 
local storage but it was not. Please report this as an issue with full logs at 
https://github.com/apache/airflow/issues/new".
   
   Connecting to a different machine worked.
   
   ### How to reproduce
   
   1. Have two different projects on GCP, with Airflow/Composer in one of them 
and GCE instances in the other.
   2. Configure the service account used by Airflow to have the "Compute OS 
Admin Login" and "Compute OS Login" roles on the project with the GCE 
instances.
   3. Configure GCE instance -> Metadata -> key1: enable-oslogin: true, 
key2: enable-oslogin-2fa: FALSE.
   4. Attempt to connect using SSHOperator, for example (full file listed in 
the "What happened" section):
       run_ssh_ip_data = SSHOperator(
           task_id="task_ssh_ip_data",
           ssh_hook=ComputeEngineSSHHook(
               instance_name="XXXXXXXX",
               zone="us-central1-a",
               project_id="ZZZZZZZ",      
               use_oslogin=True,
               use_iap_tunnel=False,
               use_internal_ip=True
        ),
        command="cd /mnt/XXXX/XXXXX/XXXXXX/test ; sudo -u ZZZZZZZZ ./YYYYYYYYY.sh",
    )
   
   
   ### Operating System
   
   Ubuntu 20.04.5 LTS
   
   ### Versions of Apache Airflow Providers
   
   composer-2.0.22-airflow-2.2.5
   
   ### Deployment
   
   Composer
   
   ### Deployment details
   
   Environment configuration overrides:
   secrets.backend = airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend
   secrets.backend_kwargs = {"project_id": "XXXXXXXXXX", "connections_prefix": "airflow-connections", "variables_prefix": "airflow-variables", "sep": "-"}
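
   For context, my understanding (an assumption, not verified against the 
Airflow source) is that CloudSecretManagerBackend joins the configured 
prefix, separator, and key to form the Secret Manager secret id it looks up. 
A small sketch, using a hypothetical connection id:

```python
# Hedged sketch: how the backend_kwargs above are assumed to map an Airflow
# connection id to a Secret Manager secret name (prefix + sep + key).
def build_secret_id(prefix: str, sep: str, key: str) -> str:
    """Join prefix, separator, and key into the looked-up secret id."""
    return f"{prefix}{sep}{key}"

# "my_gce_ssh" is a hypothetical connection id, not one from the DAG above.
print(build_secret_id("airflow-connections", "-", "my_gce_ssh"))
# -> airflow-connections-my_gce_ssh
```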
   
   ### Anything else
   
   The issue occurs on every run of the DAG.
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of 
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
   

